Kubeflow: ml-pipeline cannot connect to the mysql pod


Environment: https://www.ncloud.com, NKS (a managed Kubernetes service, comparable to EKS)

After installing Istio, I installed Kubeflow using the Kubeflow manifests (https://www.kubeflow.org/docs/started/k8s/kfctl-k8s-istio/#alternatively-set-up-your-configuration-for-later-deployment).

Notebooks work fine, but the ml-pipeline liveness check keeps failing, and the ml-pipeline pod keeps restarting, as shown below:

$ kubectl get po -n kubeflow
NAME                                                     READY   STATUS    RESTARTS   AGE
admission-webhook-bootstrap-stateful-set-0               1/1     Running   0          5h48m
admission-webhook-deployment-5bc5f97cfd-l4fhn            1/1     Running   0          5h46m
application-controller-stateful-set-0                    1/1     Running   0          5h49m
argo-ui-669bcd8bfc-qhwmh                                 1/1     Running   0          5h47m
cache-deployer-deployment-b75f5c5f6-rmg4w                2/2     Running   1          5h47m
cache-server-85bccd99bd-l59pb                            2/2     Running   23         5h47m
centraldashboard-68965b5d89-gs2tp                        1/1     Running   0          5h47m
jupyter-web-app-deployment-5dfbb68956-jqb9n              1/1     Running   0          5h47m
katib-controller-76b78f5db-fdggt                         1/1     Running   1          5h47m
katib-db-manager-67c9554b6d-tnppk                        1/1     Running   2          5h47m
katib-mysql-7b9d7b44dc-v7flp                             1/1     Running   0          5h47m
katib-ui-844b4fc655-tr2g7                                1/1     Running   0          5h47m
kfserving-controller-manager-0                           2/2     Running   0          5h47m
kubeflow-pipelines-profile-controller-65b65d97bb-72n4j   1/1     Running   0          5h47m
metacontroller-0                                         1/1     Running   0          5h48m
metadata-db-695fb6f55-sz2q7                              1/1     Running   0          5h47m
metadata-deployment-7d77b884b6-tqshz                     1/1     Running   0          5h47m
metadata-envoy-deployment-c5985d64b-w465p                1/1     Running   0          5h47m
metadata-grpc-deployment-9fdb476-jcmgn                   1/1     Running   0          5h47m
metadata-ui-cf67fdb48-t5x9f                              1/1     Running   0          5h47m
metadata-writer-59d755696c-j9nkb                         2/2     Running   0          5h47m
minio-6647564c5c-g2ps8                                   1/1     Running   0          5h47m
ml-pipeline-6bc56cd86d-xkkp7                             1/2     Running   42         5h47m
ml-pipeline-persistenceagent-6f99b56974-pgcff            2/2     Running   0          5h47m
ml-pipeline-scheduledworkflow-d596b8bd-5nsqc             2/2     Running   0          5h47m
ml-pipeline-ui-8695cc6b46-r4lfq                          2/2     Running   0          5h47m
ml-pipeline-viewer-crd-5998ff7f56-tfb2s                  2/2     Running   1          5h47m
ml-pipeline-visualizationserver-cbbb5b5b-jp7mp           2/2     Running   0          5h47m
mpi-operator-c747f5bf6-h8ntv                             1/1     Running   0          5h47m
mxnet-operator-7cd59d475-vrb64                           1/1     Running   0          5h47m
mysql-5f74cfbcc7-xn5ps                                   2/2     Running   0          5h47m
notebook-controller-deployment-756587d86-q5ph9           1/1     Running   0          5h47m
profiles-deployment-65fcc9c97-dmtv6                      2/2     Running   0          5h47m
pytorch-operator-5db58565b-g9m9x                         1/1     Running   0          5h47m
seldon-controller-manager-6ddf664d54-h9kff               1/1     Running   0          5h47m
spark-operatorsparkoperator-85bbf89886-4wrbc             1/1     Running   0          5h48m
spartakus-volunteer-7566cfd658-czqb7                     1/1     Running   0          5h47m
tf-job-operator-5bf84768bf-ps9nl                         1/1     Running   0          5h47m
workflow-controller-54dccb7dc4-ckpkp                     1/1     Running   0          5h47m

I ran several experiments.

Experiment 1) Istio endpoint check. The mysql pod is Istio-injected, so I checked its istio-proxy endpoints; they look healthy:

$ istioctl proxy-config endpoints -n kubeflow mysql-5f74cfbcc7-xn5ps.kubeflow --cluster "inbound|3306||mysql.kubeflow.svc.cluster.local"
ENDPOINT           STATUS      OUTLIER CHECK     CLUSTER
127.0.0.1:3306     HEALTHY     OK                inbound|3306||mysql.kubeflow.svc.cluster.local

Experiment 2) Connect to the MySQL DB from inside the ml-pipeline pod

  • Using kubectl exec -it, I connected to the ml-pipeline pod and installed mysql-client, but I could not connect to the MySQL DB (although nslookup mysql.kubeflow.svc.cluster.local does return the cluster IP).
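Since nslookup succeeds while the MySQL client fails, it helps to check raw TCP reachability independently of the MySQL client. A minimal sketch (the host name is just the service DNS name from this cluster; run it from inside the ml-pipeline pod):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused, timed out, unreachable, DNS failure
        return False

# From inside the ml-pipeline pod, for example:
# can_connect("mysql.kubeflow.svc.cluster.local", 3306)
```

If DNS resolves but this check fails, the problem is in the network path (here, most likely the Istio sidecars) rather than in MySQL itself. One caveat: with a sidecar intercepting traffic, the TCP connect can even succeed while the MySQL handshake still fails.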

Experiment 3) Connect to the mysql pod and open the MySQL DB directly

  • This succeeds, but no ml-pipeline database has been created (as far as I understand, the api-server should create its own database, mlpipeline by default, on its first successful connection):
$ kubectl -n kubeflow exec -it mysql-5f74cfbcc7-xn5ps -c mysql bash
root@mysql-5f74cfbcc7-xn5ps:/# mysql
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 1659
Server version: 5.6.44 MySQL Community Server (GPL)

Copyright (c) 2000, 2019, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| mysql              |
| performance_schema |
+--------------------+
3 rows in set (0.00 sec)

Experiment 4) Check the ml-pipeline logs

$ kubectl -n kubeflow logs ml-pipeline-85f5d4b6ff-wgnz9 -c ml-pipeline-api-server
I1204 00:28:16.450558       6 client_manager.go:134] Initializing client manager
I1204 00:28:16.450678       6 config.go:51] Config DBConfig.ExtraParams not specified, skipping
[mysql] 2020/12/04 00:28:19 packets.go:36: unexpected EOF
[mysql] 2020/12/04 00:28:23 packets.go:36: unexpected EOF
[mysql] 2020/12/04 00:28:26 packets.go:36: unexpected EOF
[mysql] 2020/12/04 00:28:32 packets.go:36: unexpected EOF
[mysql] 2020/12/04 00:28:38 packets.go:36: unexpected EOF
[mysql] 2020/12/04 00:28:47 packets.go:36: unexpected EOF
[mysql] 2020/12/04 00:29:00 packets.go:36: unexpected EOF
[mysql] 2020/12/04 00:29:12 packets.go:36: unexpected EOF
[mysql] 2020/12/04 00:29:43 packets.go:36: unexpected EOF
[mysql] 2020/12/04 00:30:07 packets.go:36: unexpected EOF
[mysql] 2020/12/04 00:30:55 packets.go:36: unexpected EOF
[mysql] 2020/12/04 00:32:09 packets.go:36: unexpected EOF
[mysql] 2020/12/04 00:33:01 packets.go:36: unexpected EOF
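For reference, the `unexpected EOF` lines come from the Go MySQL driver (go-sql-driver/mysql, packets.go) and mean the TCP connection was closed mid-handshake. This symptom is often reported when the Istio mTLS settings on the client and server sidecars don't match. One commonly suggested thing to rule out is mTLS on the mysql service via a DestinationRule; this is a sketch of that idea only, not a confirmed fix for this cluster:

```
# Hypothetical DestinationRule: disable mTLS for traffic to the mysql
# service so the sidecars pass the MySQL protocol through untouched.
apiVersion: networking.istio.io/v1alpha3
kind: DestinationRule
metadata:
  name: mysql-disable-mtls
  namespace: kubeflow
spec:
  host: mysql.kubeflow.svc.cluster.local
  trafficPolicy:
    tls:
      mode: DISABLE
```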

Update:

$ istioctl proxy-config endpoints -n kubeflow ml-pipeline-6bc56cd86d-xkkp7
ENDPOINT                        STATUS      OUTLIER CHECK     CLUSTER
127.0.0.1:8887                  HEALTHY     OK                inbound|8887|grpc|ml-pipeline.kubeflow.svc.cluster.local
127.0.0.1:8888                  HEALTHY     OK                inbound|8888|http|ml-pipeline.kubeflow.svc.cluster.local
127.0.0.1:15000                 HEALTHY     OK                prometheus_stats
127.0.0.1:15020                 HEALTHY     OK                inbound|15020|mgmt-15020|mgmtCluster
142.250.199.74:443              HEALTHY     OK                outbound|443||www.googleapis.com
142.250.199.80:443              HEALTHY     OK                outbound|443||storage.googleapis.com
172.217.161.138:443             HEALTHY     OK                outbound|443||www.googleapis.com
172.217.163.234:443             HEALTHY     OK                outbound|443||www.googleapis.com
172.217.174.202:443             HEALTHY     OK                outbound|443||www.googleapis.com
172.217.25.16:443               HEALTHY     OK                outbound|443||storage.googleapis.com
172.217.26.138:443              HEALTHY     OK                outbound|443||www.googleapis.com
172.217.31.234:443              HEALTHY     OK                outbound|443||www.googleapis.com
172.217.31.240:443              HEALTHY     OK                outbound|443||storage.googleapis.com
192.168.10.0:4443               HEALTHY     OK                outbound|443||metrics-server.kube-system.svc.cluster.local
192.168.10.0:6443               HEALTHY     OK                outbound|443||kubernetes.default.svc.cluster.local
192.168.10.1:4443               HEALTHY     OK                outbound|443||metrics-server.kube-system.svc.cluster.local
192.168.10.1:6443               HEALTHY     OK                outbound|443||kubernetes.default.svc.cluster.local
192.168.10.2:4443               HEALTHY     OK                outbound|443||metrics-server.kube-system.svc.cluster.local
192.168.10.2:6443               HEALTHY     OK                outbound|443||kubernetes.default.svc.cluster.local
198.18.0.10:9090                HEALTHY     OK                outbound|9090||prometheus.istio-system.svc.cluster.local
198.18.0.101:8080               HEALTHY     OK                outbound|8080||istio-pilot.istio-system.svc.cluster.local
198.18.0.101:15010              HEALTHY     OK                outbound|15010||istio-pilot.istio-system.svc.cluster.local
198.18.0.101:15011              HEALTHY     OK                outbound|15011||istio-pilot.istio-system.svc.cluster.local
198.18.0.101:15012              HEALTHY     OK                outbound|15012||istio-pilot.istio-system.svc.cluster.local
198.18.0.101:15012              HEALTHY     OK                outbound|15012||istiod.istio-system.svc.cluster.local
198.18.0.101:15014              HEALTHY     OK                outbound|15014||istio-pilot.istio-system.svc.cluster.local
198.18.0.101:15017              HEALTHY     OK                outbound|443||istio-pilot.istio-system.svc.cluster.local
198.18.0.101:15017              HEALTHY     OK                outbound|443||istiod.istio-system.svc.cluster.local
198.18.0.102:8080               HEALTHY     OK                outbound|8080||metadata-grpc-service.kubeflow.svc.cluster.local
198.18.0.107:80                 HEALTHY     OK                outbound|80||kubeflow-pipelines-profile-controller.kubeflow.svc.cluster.local
198.18.0.127:3000               HEALTHY     OK                outbound|3000||grafana.istio-system.svc.cluster.local
198.18.0.13:9402                HEALTHY     OK                outbound|9402||cert-manager.cert-manager.svc.cluster.local
198.18.0.132:8080               HEALTHY     OK                outbound|8080||katib-controller.kubeflow.svc.cluster.local
198.18.0.132:8443               HEALTHY     OK                outbound|443||katib-controller.kubeflow.svc.cluster.local
198.18.0.200:9411               HEALTHY     OK                outbound|9411||zipkin.istio-system.svc.cluster.local
198.18.0.200:14250              HEALTHY     OK                outbound|14250||jaeger-collector.istio-system.svc.cluster.local
198.18.0.200:14267              HEALTHY     OK                outbound|14267||jaeger-collector.istio-system.svc.cluster.local
198.18.0.200:14268              HEALTHY     OK                outbound|14268||jaeger-collector.istio-system.svc.cluster.local
198.18.0.200:16686              HEALTHY     OK                outbound|16686||jaeger-query.istio-system.svc.cluster.local
198.18.0.200:16686              HEALTHY     OK                outbound|80||tracing.istio-system.svc.cluster.local
198.18.0.201:443                HEALTHY     OK                outbound|443||application-controller-service.kubeflow.svc.cluster.local
198.18.0.208:80                 HEALTHY     OK                outbound|80||istio-ingressgateway.istio-system.svc.cluster.local
198.18.0.208:443                HEALTHY     OK                outbound|443||istio-ingressgateway.istio-system.svc.cluster.local
198.18.0.208:15020              HEALTHY     OK                outbound|15020||istio-ingressgateway.istio-system.svc.cluster.local
198.18.0.208:15029              HEALTHY     OK                outbound|15029||istio-ingressgateway.istio-system.svc.cluster.local
198.18.0.208:15030              HEALTHY     OK                outbound|15030||istio-ingressgateway.istio-system.svc.cluster.local
198.18.0.208:15031              HEALTHY     OK                outbound|15031||istio-ingressgateway.istio-system.svc.cluster.local
198.18.0.208:15032              HEALTHY     OK                outbound|15032||istio-ingressgateway.istio-system.svc.cluster.local
198.18.0.208:15443              HEALTHY     OK                outbound|15443||istio-ingressgateway.istio-system.svc.cluster.local
198.18.0.208:31400              HEALTHY     OK                outbound|31400||istio-ingressgateway.istio-system.svc.cluster.local
198.18.0.212:8080               HEALTHY     OK                outbound|80||katib-ui.kubeflow.svc.cluster.local
198.18.0.227:80                 HEALTHY     OK                outbound|80||istio-egressgateway.istio-system.svc.cluster.local
198.18.0.227:443                HEALTHY     OK                outbound|443||istio-egressgateway.istio-system.svc.cluster.local
198.18.0.227:15443              HEALTHY     OK                outbound|15443||istio-egressgateway.istio-system.svc.cluster.local
198.18.0.232:8080               HEALTHY     OK                outbound|8080||metadata-service.kubeflow.svc.cluster.local
198.18.0.252:8082               HEALTHY     OK                outbound|80||centraldashboard.kubeflow.svc.cluster.local
198.18.0.59:3000                HEALTHY     OK                outbound|80||metadata-ui.kubeflow.svc.cluster.local
198.18.0.65:8001                HEALTHY     OK                outbound|80||argo-ui.kubeflow.svc.cluster.local
198.18.0.67:8012                HEALTHY     OK                outbound|80||activator-service.knative-serving.svc.cluster.local
198.18.0.67:8013                HEALTHY     OK                outbound|81||activator-service.knative-serving.svc.cluster.local
198.18.0.67:9090                HEALTHY     OK                outbound|9090||activator-service.knative-serving.svc.cluster.local
198.18.0.77:5000                HEALTHY     OK                outbound|80||jupyter-web-app-service.kubeflow.svc.cluster.local
198.18.0.8:20001                HEALTHY     OK                outbound|20001||kiali.istio-system.svc.cluster.local
198.18.0.83:6443                HEALTHY     OK                outbound|443||cert-manager-webhook.cert-manager.svc.cluster.local
198.18.0.91:8443                HEALTHY     OK                outbound|443||webhook.knative-serving.svc.cluster.local
198.18.0.99:3000                HEALTHY     OK                outbound|80||ml-pipeline-ui.kubeflow.svc.cluster.local
198.18.1.106:3306               HEALTHY     OK                outbound|3306||mysql.kubeflow.svc.cluster.local
198.18.1.111:8443               HEALTHY     OK                outbound|8443||pytorch-operator.kubeflow.svc.cluster.local
198.18.1.122:8888               HEALTHY     OK                outbound|8888||ml-pipeline-visualizationserver.kubeflow.svc.cluster.local
198.18.1.131:8443               HEALTHY     OK                outbound|8443||tf-job-operator.kubeflow.svc.cluster.local
198.18.1.135:8081               HEALTHY     OK                outbound|8081||profiles-kfam.kubeflow.svc.cluster.local
198.18.1.146:6789               HEALTHY     OK                outbound|6789||katib-db-manager.kubeflow.svc.cluster.local
198.18.1.147:443                HEALTHY     OK                outbound|443||notebook-controller-service.kubeflow.svc.cluster.local
198.18.1.161:53                 HEALTHY     OK                outbound|53||coredns.kube-system.svc.cluster.local
198.18.1.161:9153               HEALTHY     OK                outbound|9153||coredns.kube-system.svc.cluster.local
198.18.1.168:80                 HEALTHY     OK                outbound|80||cluster-local-gateway.istio-system.svc.cluster.local
198.18.1.168:443                HEALTHY     OK                outbound|443||cluster-local-gateway.istio-system.svc.cluster.local
198.18.1.168:8060               HEALTHY     OK                outbound|8060||cluster-local-gateway.istio-system.svc.cluster.local
198.18.1.168:15011              HEALTHY     OK                outbound|15011||cluster-local-gateway.istio-system.svc.cluster.local
198.18.1.168:15029              HEALTHY     OK                outbound|15029||cluster-local-gateway.istio-system.svc.cluster.local
198.18.1.168:15030              HEALTHY     OK                outbound|15030||cluster-local-gateway.istio-system.svc.cluster.local
198.18.1.168:15031              HEALTHY     OK                outbound|15031||cluster-local-gateway.istio-system.svc.cluster.local
198.18.1.168:15032              HEALTHY     OK                outbound|15032||cluster-local-gateway.istio-system.svc.cluster.local
198.18.1.168:31400              HEALTHY     OK                outbound|31400||cluster-local-gateway.istio-system.svc.cluster.local
198.18.1.177:9000               HEALTHY     OK                outbound|9000||minio-service.kubeflow.svc.cluster.local
198.18.1.244:443                HEALTHY     OK                outbound|443||seldon-webhook-service.kubeflow.svc.cluster.local
198.18.1.94:9090                HEALTHY     OK                outbound|9090||metadata-envoy-service.kubeflow.svc.cluster.local
198.18.2.133:443                HEALTHY     OK                outbound|443||kfserving-controller-manager-service.kubeflow.svc.cluster.local
198.18.2.133:443                HEALTHY     OK                outbound|443||kfserving-webhook-server-service.kubeflow.svc.cluster.local
198.18.2.158:8443               HEALTHY     OK                outbound|443||cache-server.kubeflow.svc.cluster.local
198.18.2.184:9090               HEALTHY     OK                outbound|9090||controller.knative-serving.svc.cluster.local
198.18.2.195:8080               HEALTHY     OK                outbound|8080||autoscaler.knative-serving.svc.cluster.local
198.18.2.195:8443               HEALTHY     OK                outbound|443||autoscaler.knative-serving.svc.cluster.local
198.18.2.195:9090               HEALTHY     OK                outbound|9090||autoscaler.knative-serving.svc.cluster.local
198.18.2.233:3306               HEALTHY     OK                outbound|3306||metadata-db.kubeflow.svc.cluster.local
198.18.2.6:53                   HEALTHY     OK                outbound|53||coredns.kube-system.svc.cluster.local
198.18.2.6:9153                 HEALTHY     OK                outbound|9153||coredns.kube-system.svc.cluster.local
198.18.2.62:3306                HEALTHY     OK                outbound|3306||katib-mysql.kubeflow.svc.cluster.local
198.18.2.76:443                 HEALTHY     OK                outbound|443||admission-webhook-service.kubeflow.svc.cluster.local
198.19.224.127:15012            HEALTHY     OK                xds-grpc
198.19.232.127:9411             HEALTHY     OK                zipkin
216.58.199.16:443               HEALTHY     OK                outbound|443||storage.googleapis.com
216.58.200.10:443               HEALTHY     OK                outbound|443||www.googleapis.com
216.58.200.74:443               HEALTHY     OK                outbound|443||www.googleapis.com
216.58.220.202:443              HEALTHY     OK                outbound|443||www.googleapis.com
216.58.220.208:443              HEALTHY     OK                outbound|443||storage.googleapis.com
216.58.221.234:443              HEALTHY     OK                outbound|443||www.googleapis.com
unix:///etc/istio/proxy/SDS     HEALTHY     OK                sds-grpc

Update 2 (answers to questions from the comments):

  1. Istio version? 1.5.10
  2. What status code does the failing readiness probe (ml-pipeline pod) return? 503:
Events:
  Type     Reason     Age                    From                         Message
  ----     ------     ----                   ----                         -------
  Normal   Scheduled  <unknown>              default-scheduler            Successfully assigned kubeflow/ml-pipeline-b8689545c-wh5dd to nks-pool-156-w-3qa
  Normal   Pulled     3m41s                  kubelet, nks-pool-156-w-3qa  Container image "docker.io/istio/proxyv2:1.5.10" already present on machine
  Normal   Created    3m41s                  kubelet, nks-pool-156-w-3qa  Created container istio-init
  Normal   Started    3m41s                  kubelet, nks-pool-156-w-3qa  Started container istio-init
  Normal   Pulled     3m41s                  kubelet, nks-pool-156-w-3qa  Container image "gcr.io/ml-pipeline/api-server:1.0.0" already present on machine
  Normal   Created    3m41s                  kubelet, nks-pool-156-w-3qa  Created container ml-pipeline-api-server
  Normal   Started    3m41s                  kubelet, nks-pool-156-w-3qa  Started container ml-pipeline-api-server
  Normal   Pulled     3m41s                  kubelet, nks-pool-156-w-3qa  Container image "docker.io/istio/proxyv2:1.5.10" already present on machine
  Normal   Created    3m41s                  kubelet, nks-pool-156-w-3qa  Created container istio-proxy
  Normal   Started    3m40s                  kubelet, nks-pool-156-w-3qa  Started container istio-proxy
  Warning  Unhealthy  3m34s (x3 over 3m38s)  kubelet, nks-pool-156-w-3qa  Readiness probe failed: HTTP probe failed with statuscode: 503

  3. Anything in the istio-init logs?

$ kubectl -n kubeflow logs ml-pipeline-b8689545c-wh5dd -c istio-init
Environment:
------------
ENVOY_PORT=
INBOUND_CAPTURE_PORT=
ISTIO_INBOUND_INTERCEPTION_MODE=
ISTIO_INBOUND_TPROXY_MARK=
ISTIO_INBOUND_TPROXY_ROUTE_TABLE=
ISTIO_INBOUND_PORTS=
ISTIO_LOCAL_EXCLUDE_PORTS=
ISTIO_SERVICE_CIDR=
ISTIO_SERVICE_EXCLUDE_CIDR=

Variables:
----------
PROXY_PORT=15001
PROXY_INBOUND_CAPTURE_PORT=15006
PROXY_UID=1337
PROXY_GID=1337
INBOUND_INTERCEPTION_MODE=REDIRECT
INBOUND_TPROXY_MARK=1337
INBOUND_TPROXY_ROUTE_TABLE=133
INBOUND_PORTS_INCLUDE=*
INBOUND_PORTS_EXCLUDE=15090,15020
OUTBOUND_IP_RANGES_INCLUDE=*
OUTBOUND_IP_RANGES_EXCLUDE=
OUTBOUND_PORTS_EXCLUDE=
KUBEVIRT_INTERFACES=
ENABLE_INBOUND_IPV6=false

Writing following contents to rules file:  /tmp/iptables-rules-1607042716359663422.txt131092074
* nat
-N ISTIO_REDIRECT
-N ISTIO_IN_REDIRECT
-N ISTIO_INBOUND
-N ISTIO_OUTPUT
-A ISTIO_REDIRECT -p tcp -j REDIRECT --to-ports 15001
-A ISTIO_IN_REDIRECT -p tcp -j REDIRECT --to-ports 15006
-A PREROUTING -p tcp -j ISTIO_INBOUND
-A ISTIO_INBOUND -p tcp --dport 22 -j RETURN
-A ISTIO_INBOUND -p tcp --dport 15090 -j RETURN
-A ISTIO_INBOUND -p tcp --dport 15020 -j RETURN
-A ISTIO_INBOUND -p tcp -j ISTIO_IN_REDIRECT
-A OUTPUT -p tcp -j ISTIO_OUTPUT
-A ISTIO_OUTPUT -o lo -s 127.0.0.6/32 -j RETURN
-A ISTIO_OUTPUT -o lo ! -d 127.0.0.1/32 -m owner --uid-owner 1337 -j ISTIO_IN_REDIRECT
-A ISTIO_OUTPUT -o lo -m owner ! --uid-owner 1337 -j RETURN
-A ISTIO_OUTPUT -m owner --uid-owner 1337 -j RETURN
-A ISTIO_OUTPUT -o lo ! -d 127.0.0.1/32 -m owner --gid-owner 1337 -j ISTIO_IN_REDIRECT
-A ISTIO_OUTPUT -o lo -m owner ! --gid-owner 1337 -j RETURN
-A ISTIO_OUTPUT -m owner --gid-owner 1337 -j RETURN
-A ISTIO_OUTPUT -d 127.0.0.1/32 -j RETURN
-A ISTIO_OUTPUT -j ISTIO_REDIRECT
COMMIT

iptables-restore --noflush /tmp/iptables-rules-1607042716359663422.txt131092074
Writing following contents to rules file:  /tmp/ip6tables-rules-1607042716366344859.txt443440321

ip6tables-restore --noflush /tmp/ip6tables-rules-1607042716366344859.txt443440321
iptables-save
# Generated by iptables-save v1.6.1 on Fri Dec  4 00:45:16 2020
*nat
:PREROUTING ACCEPT [0:0]
:INPUT ACCEPT [0:0]
:OUTPUT ACCEPT [0:0]
:POSTROUTING ACCEPT [0:0]
:ISTIO_INBOUND - [0:0]
:ISTIO_IN_REDIRECT - [0:0]
:ISTIO_OUTPUT - [0:0]
:ISTIO_REDIRECT - [0:0]
-A PREROUTING -p tcp -j ISTIO_INBOUND
-A OUTPUT -p tcp -j ISTIO_OUTPUT
-A ISTIO_INBOUND -p tcp -m tcp --dport 22 -j RETURN
-A ISTIO_INBOUND -p tcp -m tcp --dport 15090 -j RETURN
-A ISTIO_INBOUND -p tcp -m tcp --dport 15020 -j RETURN
-A ISTIO_INBOUND -p tcp -j ISTIO_IN_REDIRECT
-A ISTIO_IN_REDIRECT -p tcp -j REDIRECT --to-ports 15006
-A ISTIO_OUTPUT -s 127.0.0.6/32 -o lo -j RETURN
-A ISTIO_OUTPUT ! -d 127.0.0.1/32 -o lo -m owner --uid-owner 1337 -j ISTIO_IN_REDIRECT
-A ISTIO_OUTPUT -o lo -m owner ! --uid-owner 1337 -j RETURN
-A ISTIO_OUTPUT -m owner --uid-owner 1337 -j RETURN
-A ISTIO_OUTPUT ! -d 127.0.0.1/32 -o lo -m owner --gid-owner 1337 -j ISTIO_IN_REDIRECT
-A ISTIO_OUTPUT -o lo -m owner ! --gid-owner 1337 -j RETURN
-A ISTIO_OUTPUT -m owner --gid-owner 1337 -j RETURN
-A ISTIO_OUTPUT -d 127.0.0.1/32 -j RETURN
-A ISTIO_OUTPUT -j ISTIO_REDIRECT
-A ISTIO_REDIRECT -p tcp -j REDIRECT --to-ports 15001
COMMIT
# Completed on Fri Dec  4 00:45:16 2020
  4. I also tried the solution in the GitHub issue linked here, but it did not work.

How can I solve this problem? Why can't ml-pipeline connect to the MySQL database?
