I would like to deploy the Java Pet Store on Kubernetes. To achieve this I have two simple deployments: the first is the Java web app and the second is a MySQL database.
When Istio is disabled, the connection between the app and the DB works well.
Unfortunately, when the Istio sidecar is injected, communication between the two stops working.
Here is the deployment file of the web app:
apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  name: jpetstoreweb
spec:
  replicas: 1
  template:
    metadata:
      labels:
        app: jpetstoreweb
      annotations:
        sidecar.istio.io/inject: "true"
    spec:
      containers:
      - name: jpetstoreweb
        image: wingardiumleviosa/petstore:v7
        env:
        - name: VERSION
          value: "1"
        - name: DB_URL
          value: "jpetstoredb-service"
        - name: DB_PORT
          value: "3306"
        - name: DB_NAME
          value: "jpetstore"
        - name: DB_USERNAME
          value: "jpetstore"
        - name: DB_PASSWORD
          value: "foobar"
        ports:
        - containerPort: 9080
        readinessProbe:
          httpGet:
            path: /
            port: 9080
          initialDelaySeconds: 10
          periodSeconds: 5
---
apiVersion: v1
kind: Service
metadata:
  name: jpetstoreweb-service
spec:
  selector:
    app: jpetstoreweb
  ports:
  - port: 80
    targetPort: 9080
---
And next, the deployment file of the MySQL database:
apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  name: jpetstoredb
spec:
  replicas: 1
  template:
    metadata:
      labels:
        app: jpetstoredb
      annotations:
        sidecar.istio.io/inject: "true"
    spec:
      containers:
      - name: jpetstoredb
        image: wingardiumleviosa/petstoredb:v1
        ports:
        - containerPort: 3306
        env:
        - name: MYSQL_ROOT_PASSWORD
          value: "foobar"
        - name: MYSQL_DATABASE
          value: "jpetstore"
        - name: MYSQL_USER
          value: "jpetstore"
        - name: MYSQL_PASSWORD
          value: "foobar"
---
apiVersion: v1
kind: Service
metadata:
  name: jpetstoredb-service
spec:
  selector:
    app: jpetstoredb
  ports:
  - port: 3306
    targetPort: 3306
Finally, the error logs from the web app trying to connect to the DB:
Exception thrown by application class 'org.springframework.web.servlet.FrameworkServlet.processRequest:488'
org.springframework.web.util.NestedServletException: Request processing failed; nested exception is org.springframework.transaction.CannotCreateTransactionException: Could not open JDBC Connection for transaction; nested exception is java.sql.SQLException: Communication link failure: java.io.EOFException, underlying cause: null ** BEGIN NESTED EXCEPTION ** java.io.EOFException STACKTRACE: java.io.EOFException at com.mysql.jdbc.MysqlIO.readFully(MysqlIO.java:1395) at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:1539) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:1930) at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1168) at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1279) at com.mysql.jdbc.MysqlIO.sqlQuery(MysqlIO.java:1225) at com.mysql.jdbc.Connection.execSQL(Connection.java:2278) at com.mysql.jdbc.Connection.execSQL(Connection.java:2237) at com.mysql.jdbc.Connection.execSQL(Connection.java:2218) at com.mysql.jdbc.Connection.setAutoCommit(Connection.java:548) at org.apache.commons.dbcp.DelegatingConnection.setAutoCommit(DelegatingConnection.java:331) at org.apache.commons.dbcp.PoolingDataSource$PoolGuardConnectionWrapper.setAutoCommit(PoolingDataSource.java:317) at org.springframework.jdbc.datasource.DataSourceTransactionManager.doBegin(DataSourceTransactionManager.java:221) at org.springframework.transaction.support.AbstractPlatformTransactionManager.getTransaction(AbstractPlatformTransactionManager.java:350) at org.springframework.transaction.interceptor.TransactionAspectSupport.createTransactionIfNecessary(TransactionAspectSupport.java:261) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:101) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171) at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:89) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204) at com.sun.proxy.$Proxy28.getCategory(Unknown Source) at org.springframework.samples.jpetstore.web.spring.ViewCategoryController.handleRequest(ViewCategoryController.java:31) at org.springframework.web.servlet.mvc.SimpleControllerHandlerAdapter.handle(SimpleControllerHandlerAdapter.java:48) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:874) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:808) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:476) at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:431) at javax.servlet.http.HttpServlet.service(HttpServlet.java:687) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1255) at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:743) at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:440) at com.ibm.ws.webcontainer.filter.WebAppFilterChain.invokeTarget(WebAppFilterChain.java:182) at com.ibm.ws.webcontainer.filter.WebAppFilterChain.doFilter(WebAppFilterChain.java:93) at com.ibm.ws.security.jaspi.JaspiServletFilter.doFilter(JaspiServletFilter.java:56) at com.ibm.ws.webcontainer.filter.FilterInstanceWrapper.doFilter(FilterInstanceWrapper.java:201) at 
com.ibm.ws.webcontainer.filter.WebAppFilterChain.doFilter(WebAppFilterChain.java:90) at com.ibm.ws.webcontainer.filter.WebAppFilterManager.doFilter(WebAppFilterManager.java:996) at com.ibm.ws.webcontainer.filter.WebAppFilterManager.invokeFilters(WebAppFilterManager.java:1134) at com.ibm.ws.webcontainer.filter.WebAppFilterManager.invokeFilters(WebAppFilterManager.java:1005) at com.ibm.ws.webcontainer.servlet.CacheServletWrapper.handleRequest(CacheServletWrapper.java:75) at com.ibm.ws.webcontainer.WebContainer.handleRequest(WebContainer.java:927) at com.ibm.ws.webcontainer.osgi.DynamicVirtualHost$2.run(DynamicVirtualHost.java:279) at com.ibm.ws.http.dispatcher.internal.channel.HttpDispatcherLink$TaskWrapper.run(HttpDispatcherLink.java:1023) at com.ibm.ws.http.dispatcher.internal.channel.HttpDispatcherLink.wrapHandlerAndExecute(HttpDispatcherLink.java:417) at com.ibm.ws.http.dispatcher.internal.channel.HttpDispatcherLink.ready(HttpDispatcherLink.java:376) at com.ibm.ws.http.channel.internal.inbound.HttpInboundLink.handleDiscrimination(HttpInboundLink.java:532) at com.ibm.ws.http.channel.internal.inbound.HttpInboundLink.handleNewRequest(HttpInboundLink.java:466) at com.ibm.ws.http.channel.internal.inbound.HttpInboundLink.processRequest(HttpInboundLink.java:331) at com.ibm.ws.http.channel.internal.inbound.HttpICLReadCallback.complete(HttpICLReadCallback.java:70) at com.ibm.ws.tcpchannel.internal.WorkQueueManager.requestComplete(WorkQueueManager.java:501) at com.ibm.ws.tcpchannel.internal.WorkQueueManager.attemptIO(WorkQueueManager.java:571) at com.ibm.ws.tcpchannel.internal.WorkQueueManager.workerRun(WorkQueueManager.java:926) at com.ibm.ws.tcpchannel.internal.WorkQueueManager$Worker.run(WorkQueueManager.java:1015) at com.ibm.ws.threading.internal.ExecutorServiceImpl$RunnableWrapper.run(ExecutorServiceImpl.java:232) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1160) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) at java.lang.Thread.run(Thread.java:812) ** END NESTED EXCEPTION **
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:488)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:431)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
Extract: Could not open JDBC Connection for transaction
Additional info:
1) I can curl the DB from the web app container and it answers correctly.
2) I use Cilium instead of Calico
3) I installed Istio using Helm
4) Kubernetes is installed on bare metal (no cloud provider)
5) kubectl get pods -n istio-system
all istio pods are running
6) kubectl get pods -n kube-system
all cilium pods are running
7) Istio is injected using kubectl apply -f <(~/istio-1.0.5/bin/istioctl kube-inject -f ~/jpetstore.yaml) -n foo. If I use any other method, Istio does not inject itself into the web pod (but it works for the DB pod, god knows why). A sketch of namespace-based automatic injection is shown after this list.
8) The DB pod is always happy and working well
9) Logs of the istio-proxy container inside the web app pod: kubectl logs jpetstoreweb-84c7d8964-s642k istio-proxy -n myns
2018-12-28T03:52:30.610101Z info Version root@6f6ea1061f2b-docker.io/istio-1.0.5-c1707e45e71c75d74bf3a5dec8c7086f32f32fad-Clean
2018-12-28T03:52:30.610167Z info Proxy role: model.Proxy{ClusterID:"", Type:"sidecar", IPAddress:"10.233.72.142", ID:"jpetstoreweb-84c7d8964-s642k.myns", Domain:"myns.svc.cluster.local", Metadata:map[string]string(nil)}
2018-12-28T03:52:30.611217Z info Effective config: binaryPath: /usr/local/bin/envoy
configPath: /etc/istio/proxy
connectTimeout: 10s
discoveryAddress: istio-pilot.istio-system:15007
discoveryRefreshDelay: 1s
drainDuration: 45s
parentShutdownDuration: 60s
proxyAdminPort: 15000
serviceCluster: jpetstoreweb
zipkinAddress: zipkin.istio-system:9411
2018-12-28T03:52:30.611249Z info Monitored certs: []envoy.CertSource{envoy.CertSource{Directory:"/etc/certs/", Files:[]string{"cert-chain.pem", "key.pem", "root-cert.pem"}}}
2018-12-28T03:52:30.611829Z info Starting proxy agent
2018-12-28T03:52:30.611902Z info Received new config, resetting budget
2018-12-28T03:52:30.611912Z info Reconciling configuration (budget 10)
2018-12-28T03:52:30.611926Z info Epoch 0 starting
2018-12-28T03:52:30.613236Z info Envoy command: [-c /etc/istio/proxy/envoy-rev0.json --restart-epoch 0 --drain-time-s 45 --parent-shutdown-time-s 60 --service-cluster jpetstoreweb --service-node sidecar~10.233.72.142~jpetstoreweb-84c7d8964-s642k.myns~myns.svc.cluster.local --max-obj-name-len 189 --allow-unknown-fields -l warn --v2-config-only]
[2018-12-28 03:52:30.630][20][info][main] external/envoy/source/server/server.cc:190] initializing epoch 0 (hot restart version=10.200.16384.256.options=capacity=16384, num_slots=8209 hash=228984379728933363 size=4882536)
[2018-12-28 03:52:30.631][20][info][main] external/envoy/source/server/server.cc:192] statically linked extensions:
[2018-12-28 03:52:30.631][20][info][main] external/envoy/source/server/server.cc:194] access_loggers: envoy.file_access_log,envoy.http_grpc_access_log
[2018-12-28 03:52:30.631][20][info][main] external/envoy/source/server/server.cc:197] filters.http: envoy.buffer,envoy.cors,envoy.ext_authz,envoy.fault,envoy.filters.http.header_to_metadata,envoy.filters.http.jwt_authn,envoy.filters.http.rbac,envoy.grpc_http1_bridge,envoy.grpc_json_transcoder,envoy.grpc_web,envoy.gzip,envoy.health_check,envoy.http_dynamo_filter,envoy.ip_tagging,envoy.lua,envoy.rate_limit,envoy.router,envoy.squash,istio_authn,jwt-auth,mixer
[2018-12-28 03:52:30.631][20][info][main] external/envoy/source/server/server.cc:200] filters.listener: envoy.listener.original_dst,envoy.listener.proxy_protocol,envoy.listener.tls_inspector
[2018-12-28 03:52:30.631][20][info][main] external/envoy/source/server/server.cc:203] filters.network: envoy.client_ssl_auth,envoy.echo,envoy.ext_authz,envoy.filters.network.rbac,envoy.filters.network.thrift_proxy,envoy.http_connection_manager,envoy.mongo_proxy,envoy.ratelimit,envoy.redis_proxy,envoy.tcp_proxy,mixer
[2018-12-28 03:52:30.631][20][info][main] external/envoy/source/server/server.cc:205] stat_sinks: envoy.dog_statsd,envoy.metrics_service,envoy.stat_sinks.hystrix,envoy.statsd
[2018-12-28 03:52:30.631][20][info][main] external/envoy/source/server/server.cc:207] tracers: envoy.dynamic.ot,envoy.lightstep,envoy.zipkin
[2018-12-28 03:52:30.631][20][info][main] external/envoy/source/server/server.cc:210] transport_sockets.downstream: alts,envoy.transport_sockets.capture,raw_buffer,tls
[2018-12-28 03:52:30.631][20][info][main] external/envoy/source/server/server.cc:213] transport_sockets.upstream: alts,envoy.transport_sockets.capture,raw_buffer,tls
[2018-12-28 03:52:30.634][20][info][config] external/envoy/source/server/configuration_impl.cc:50] loading 0 static secret(s)
[2018-12-28 03:52:30.638][20][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:240] gRPC config stream closed: 14, no healthy upstream
[2018-12-28 03:52:30.638][20][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:41] Unable to establish new stream
[2018-12-28 03:52:30.638][20][info][config] external/envoy/source/server/configuration_impl.cc:60] loading 1 listener(s)
[2018-12-28 03:52:30.640][20][info][config] external/envoy/source/server/configuration_impl.cc:94] loading tracing configuration
[2018-12-28 03:52:30.640][20][info][config] external/envoy/source/server/configuration_impl.cc:103] loading tracing driver: envoy.zipkin
[2018-12-28 03:52:30.640][20][info][config] external/envoy/source/server/configuration_impl.cc:116] loading stats sink configuration
[2018-12-28 03:52:30.640][20][info][main] external/envoy/source/server/server.cc:432] starting main dispatch loop
[2018-12-28 03:52:32.010][20][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:240] gRPC config stream closed: 14, no healthy upstream
[2018-12-28 03:52:32.011][20][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:41] Unable to establish new stream
[2018-12-28 03:52:34.691][20][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:240] gRPC config stream closed: 14, no healthy upstream
[2018-12-28 03:52:34.691][20][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:41] Unable to establish new stream
[2018-12-28 03:52:38.483][20][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:130] cm init: initializing cds
[2018-12-28 03:53:01.596][20][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:494] add/update cluster outbound|443||kubernetes.default.svc.cluster.local during init
...
[2018-12-28T04:09:09.561Z] - 115 1548 6 "127.0.0.1:9080" inbound|80||jpetstoreweb-service.myns.svc.cluster.local 127.0.0.1:40318 10.233.72.142:9080 10.233.72.1:43098
[2018-12-28T04:09:14.555Z] - 115 1548 8 "127.0.0.1:9080" inbound|80||jpetstoreweb-service.myns.svc.cluster.local 127.0.0.1:40350 10.233.72.142:9080 10.233.72.1:43130
[2018-12-28T04:09:19.556Z] - 115 1548 5 "127.0.0.1:9080" inbound|80||jpetstoreweb-service.myns.svc.cluster.local 127.0.0.1:40364 10.233.72.142:9080 10.233.72.1:43144
[2018-12-28T04:09:24.558Z] - 115 1548 6 "127.0.0.1:9080" inbound|80||jpetstoreweb-service.myns.svc.cluster.local 127.0.0.1:40378 10.233.72.142:9080 10.233.72.1:43158
10) Using Istio 1.0.5 and Kubernetes 1.13.0
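Regarding point 7: as an alternative to running istioctl kube-inject by hand, Istio can normally inject the sidecar automatically when the namespace carries the istio-injection=enabled label. A minimal sketch, assuming the application namespace is myns (the label can also be added to an existing namespace with kubectl label namespace myns istio-injection=enabled):

# Sketch: namespace manifest with automatic sidecar injection enabled.
# Assumes the application namespace is "myns"; adjust to the real one.
apiVersion: v1
kind: Namespace
metadata:
  name: myns
  labels:
    istio-injection: enabled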
All ideas are welcome ;-)
Thanks
So there really is an issue between Istio 1.0.5 and MySQL's JDBC traffic.
The temporary solution is to delete the mesh policy resource in the following way:
kubectl delete meshpolicies.authentication.istio.io default
As stated here and referencing this.
(FYI: I deleted the resource BEFORE deploying my petstore app.)
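If deleting the default MeshPolicy is too drastic, a gentler variant of the same idea (a sketch, not the exact fix referenced above) is to keep the resource but switch mutual TLS to PERMISSIVE mode, so sidecars accept both plaintext and mTLS connections:

# Sketch: relax the mesh-wide authentication policy to PERMISSIVE instead of
# deleting it, so plaintext traffic such as JDBC to MySQL is still accepted.
apiVersion: authentication.istio.io/v1alpha1
kind: MeshPolicy
metadata:
  name: default
spec:
  peers:
  - mtls:
      mode: PERMISSIVE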
As of Istio 1.1.1, there is more information about this problem in the FAQ.
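The FAQ entry boils down to mutual TLS being applied to the MySQL connection. A more targeted workaround is to disable TLS only for client traffic to the database service via a DestinationRule; a sketch, assuming the service runs in the myns namespace:

# Sketch: disable Istio mTLS only for traffic to the MySQL service, leaving
# the rest of the mesh policy untouched. The name and namespace are assumptions.
apiVersion: networking.istio.io/v1alpha3
kind: DestinationRule
metadata:
  name: jpetstoredb-service-plaintext
  namespace: myns
spec:
  host: jpetstoredb-service.myns.svc.cluster.local
  trafficPolicy:
    tls:
      mode: DISABLE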