What's the proper way to proxy a Cloud SQL database into Bitbucket Pipelines?
I have a Google Cloud SQL Postgres instance (and have also tried a MySQL database).
Opening the instance to connections from all IPs allows Bitbucket Pipelines to deploy my Django-based Google App Engine project correctly, based on this example pipeline: https://github.com/GoogleCloudPlatform/continuous-deployment-bitbucket/blob/master/bitbucket-pipelines.yml
However, when I restrict access to the Cloud SQL instance and use cloud_sql_proxy instead, I can deploy locally without problems, but the Bitbucket build always fails to find the SQL server.
My bitbucket-pipelines.yml looks something like this:
- export CLOUDSDK_CORE_DISABLE_PROMPTS=1
# Google Cloud SDK is pinned for build reliability. Bump if the SDK complains about deprecation.
- SDK_VERSION=127.0.0
- SDK_FILENAME=google-cloud-sdk-${SDK_VERSION}-linux-x86_64.tar.gz
- curl -O -J https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/${SDK_FILENAME}
- tar -zxvf ${SDK_FILENAME} --directory ${HOME}
- export PATH=${PATH}:${HOME}/google-cloud-sdk/bin
# Install Google App Engine SDK
- GAE_PYTHONPATH=${HOME}/google_appengine
- export PYTHONPATH=${PYTHONPATH}:${GAE_PYTHONPATH}
- python scripts/fetch_gae_sdk.py $(dirname "${GAE_PYTHONPATH}")
- echo "${PYTHONPATH}" && ls ${GAE_PYTHONPATH}
# Install app & dev dependencies, test, deploy, test deployment
- echo "key = '${GOOGLE_API_KEY}'" > api_key.py
- echo ${GOOGLE_CLIENT_SECRET} > client-secret.json
- gcloud auth activate-service-account --key-file client-secret.json
- wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
- chmod +x cloud_sql_proxy
- ./cloud_sql_proxy -instances=google-cloud-project-name:us-west1:google-cloud-sql-database-name=tcp:5432 &
- gcloud app deploy --no-promote --project google-cloud-project-name --quiet
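Since the proxy is launched in the background immediately before the deploy step, one thing I have wondered about is whether it is even accepting connections by then. A minimal readiness check could be dropped in between those two steps; this is a sketch, not part of my actual pipeline, and the helper name, host, port, and timeout are all assumptions (`python3` is assumed to be on the build image; older images may only have `python`):

```shell
# Hypothetical helper: poll cloud_sql_proxy's local TCP port until it
# accepts connections, or give up after a number of one-second attempts.
wait_for_port() {
  host="$1"; port="$2"; attempts="$3"
  i=0
  while [ "$i" -lt "$attempts" ]; do
    # Short socket probe; connect_ex returns 0 once the port is open.
    if python3 -c "import socket,sys; s=socket.socket(); s.settimeout(1); sys.exit(s.connect_ex(('$host', int('$port'))))"; then
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "port ${host}:${port} never became ready" >&2
  return 1
}

# Intended usage after starting the proxy in the background:
#   ./cloud_sql_proxy -instances=...=tcp:5432 &
#   wait_for_port 127.0.0.1 5432 15
```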
At this point I would expect the SQL database to be reachable, but it doesn't seem to be available, and my deployment fails to find the locally proxied database.
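For reference, with the proxy listening on `tcp:5432`, the Django side would point its database `HOST` at the proxy's local endpoint rather than the instance's IP. A sketch of what I mean (the database name, user, and password here are placeholders, not my real settings):

```python
# Hypothetical Django DATABASES fragment for connecting through
# cloud_sql_proxy on 127.0.0.1:5432. Credentials are placeholders.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "appdb",        # placeholder database name
        "USER": "appuser",      # placeholder user
        "PASSWORD": "secret",   # placeholder password
        "HOST": "127.0.0.1",    # the proxy's local TCP endpoint
        "PORT": "5432",         # matches -instances=...=tcp:5432
    }
}
```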