I'm working on a project with three Google Cloud projects (A, B, and C), each having cloud functions deployed. I've set GOOGLE_APPLICATION_CREDENTIALS in my Docker container, but I'm unsure how to configure it to access functions (and databases) from all three projects without using separate keys.
Is there a way to achieve this setup and handle authentication for multiple projects within the same Docker environment without relying on individual keys for each project? Any insights or best practices would be appreciated!
As @DazWilkin mentioned, service accounts and Workload Identity Federation are the best solutions for production environments.
Using only Service Accounts:
1. Create a single service account and grant it the permissions it needs in all three projects (A, B, and C). In each project, grant the appropriate roles (such as roles/cloudfunctions.invoker and roles/cloudsql.client) or set the IAM policies required for each function and database. You can do this with the gcloud CLI or the Google Cloud Console, as sketched after this list. Refer to this cloudquery blog by Mike Elsmore for more details.
2. Next, store the service account key file in Google Cloud Secret Manager rather than baking it into the Docker image; this keeps the secret out of your source tree and image layers (see the second sketch below).
3. Make the credentials available inside your Docker container, either by mounting the key file as a volume or by passing its path through an environment variable, and point GOOGLE_APPLICATION_CREDENTIALS at it so your code can authenticate against all three projects (see the third sketch below).
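A minimal sketch of step 1 with the gcloud CLI; the project IDs (project-a, project-b, project-c) and the service account name (shared-invoker) are placeholders you would replace with your own:

```bash
# Create one service account in the "home" project
gcloud iam service-accounts create shared-invoker --project=project-a

# Grant it the roles it needs in each of the three projects
for PROJECT in project-a project-b project-c; do
  gcloud projects add-iam-policy-binding "$PROJECT" \
    --member="serviceAccount:shared-invoker@project-a.iam.gserviceaccount.com" \
    --role="roles/cloudfunctions.invoker"
  gcloud projects add-iam-policy-binding "$PROJECT" \
    --member="serviceAccount:shared-invoker@project-a.iam.gserviceaccount.com" \
    --role="roles/cloudsql.client"
done
```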
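For step 2, a sketch of storing the key in Secret Manager (the secret name sa-key is hypothetical):

```bash
# Create a key for the service account, upload it as a secret, then delete the local copy
gcloud iam service-accounts keys create sa-key.json \
  --iam-account=shared-invoker@project-a.iam.gserviceaccount.com
gcloud secrets create sa-key --replication-policy=automatic --project=project-a
gcloud secrets versions add sa-key --data-file=sa-key.json --project=project-a
rm sa-key.json  # avoid leaving the key on local disk
```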
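And for step 3, one way to surface the key to the container at run time (the paths and image name my-app are placeholders):

```bash
# Fetch the key from Secret Manager on the host, then mount it read-only
gcloud secrets versions access latest --secret=sa-key --project=project-a \
  > /tmp/sa-key.json
docker run \
  -v /tmp/sa-key.json:/secrets/sa-key.json:ro \
  -e GOOGLE_APPLICATION_CREDENTIALS=/secrets/sa-key.json \
  my-app:latest
```

Because all three projects trust the same service account, this single GOOGLE_APPLICATION_CREDENTIALS key works for the functions and databases in A, B, and C; you only specify the target project per request in your client code.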
Refer to this official doc for more information about Roles for service account authentication, and see this official doc about Granting service accounts access to your projects.
Using both Service Accounts & Workload Identity Federation:
If your container runs on GCP with Workload Identity available (for example, on GKE), the service account is attached to the workload and credentials are issued automatically, so no key file is needed at all. See this similar GCP community issue on Configure workload identity federation with kubernetes for 2 google projects for more information.
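As a rough sketch, on GKE this looks like the following (the cluster, namespace, and account names are hypothetical):

```bash
# Enable Workload Identity on the cluster
gcloud container clusters update my-cluster \
  --workload-pool=project-a.svc.id.goog

# Create a Kubernetes service account and let it impersonate the Google service account
kubectl create serviceaccount my-ksa --namespace default
gcloud iam service-accounts add-iam-policy-binding \
  shared-invoker@project-a.iam.gserviceaccount.com \
  --role=roles/iam.workloadIdentityUser \
  --member="serviceAccount:project-a.svc.id.goog[default/my-ksa]"
kubectl annotate serviceaccount my-ksa --namespace default \
  iam.gke.io/gcp-service-account=shared-invoker@project-a.iam.gserviceaccount.com
```

Pods running as my-ksa then obtain tokens for the shared service account automatically, with no key file inside the container.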