GKE and Private Workers Pool on Google Cloud


I have been following the doc https://cloud.google.com/architecture/accessing-private-gke-clusters-with-cloud-build-private-pools, but I can't get it to work. I created all the components (VPCs, routers, and the worker pool), but the worker pool cannot reach the private GKE cluster. The worker pool creates an instance in the 192.168.0.0/16 range, and the private cluster's control plane is at 10.68.0.2 in the other VPC. To verify connectivity between the VPCs, I created VMs in both VPCs and they could communicate. However, when the private Cloud Build instance is up and running with the global IP 192.168.0.2 (as the doc describes) and tries to connect to the cluster, it fails. What hidden step am I missing — a firewall rule or something else? I would appreciate your help.
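For reference, these are the connectivity steps from that doc that I understand are required (a sketch — `GKE_PEERING_NAME` is a placeholder for the control-plane peering in my project, and the cluster/zone names match my build config):

```shell
# Find the control-plane peering name on the GKE cluster's VPC
# (it is auto-created when the private cluster is provisioned).
gcloud compute networks peerings list --network=GKE-VPC-NAME

# 1. Export custom routes over the control-plane peering so the
#    control plane learns a return route to the pool's 192.168.0.0/16 range.
gcloud compute networks peerings update GKE_PEERING_NAME \
    --network=GKE-VPC-NAME \
    --export-custom-routes

# 2. Authorize the private pool's CIDR to reach the control plane.
gcloud container clusters update GKE-CLUSTER-NAME \
    --zone=us-central1-a \
    --enable-master-authorized-networks \
    --master-authorized-networks=192.168.0.0/16

# 3. Allow the control-plane endpoint to be reached from outside the
#    cluster's own region (needed because the pool sits in a peered VPC).
gcloud container clusters update GKE-CLUSTER-NAME \
    --zone=us-central1-a \
    --enable-master-global-access
```

In my case the VM-to-VM test succeeding while the pool's traffic times out suggests the return route from the control plane (step 1) or the authorized-networks list (step 2) is the part that isn't in place.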

YAML (`cloudbuild.yaml`):

```yaml
steps:
- name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
  entrypoint: 'bash'
  args:
  - '-eEuo'
  - 'pipefail'
  - '-c'
  - |-
    gcloud container clusters get-credentials GKE-CLUSTER-NAME --zone us-central1-a --project PROJECT
    kubectl get nodes
options:
  workerPool: 'projects/PROJECT/locations/us-central1/workerPools/private-cloudbuild-pool'
```

OUTPUT:

```
Step #1: Fetching cluster endpoint and auth data.
Step #1: kubeconfig entry generated for GKE-CLUSTER-NAME.
Step #1: E1018 20:46:10.807598 28 memcache.go:265] couldn't get current server API group list: Get "https://10.68.0.2/api?timeout=32s": dial tcp 10.68.0.2:443: i/o timeout
```
