I have two Google projects: `dev` and `prod`. I import data from different storage buckets located in these projects: `dev-bucket` and `prod-bucket`.
After I have made and tested changes in the `dev` environment, how can I smoothly apply (deploy/copy) the changes to `prod` as well?
What I do now is export the flow from `dev` and then re-import it into `prod`. However, each time I have to do the following manually in the `prod` flows:

- Change the datasets that serve as inputs to the flow
- Replace the manual and scheduled destinations with the right BigQuery dataset (`dev-dataset-bigquery` and `prod-dataset-bigquery`)

How can this be done more smoothly?
Follow the procedure below to move a flow from one environment to another using the API, and to update the input dataset and the output for the new environment.
1) Export a plan — GET
2) Import the plan — POST
3) Update the input dataset — PUT
4) Update the output — PATCH
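The four steps above can be sketched as a small script. This is a minimal sketch only: the `v4` endpoint paths follow the Dataprep (Trifacta) API convention but are assumptions, and the object IDs (`123`, `456`, `789`) and token variable are hypothetical placeholders — check the API reference for your edition before use. By default (`DRY_RUN=1`) the script only prints each `curl` command instead of executing it.

```shell
# Hedged sketch of the export/import/update sequence.
# ASSUMPTIONS: endpoint paths, plan/dataset/output IDs, and token env var
# are illustrative, not confirmed against a specific Dataprep edition.
API="https://api.clouddataprep.com/v4"
TOKEN="${DATAPREP_ACCESS_TOKEN:-<token>}"   # hypothetical: OAuth access token
PLAN_ID=123        # hypothetical id of the plan in dev
DATASET_ID=456     # hypothetical id of the imported dataset in prod
OUTPUT_ID=789      # hypothetical id of the output object in prod

# In dry-run mode (default) print the command; otherwise execute it.
run() {
  if [ "${DRY_RUN:-1}" = "1" ]; then echo "$*"; else eval "$*"; fi
}

# 1) Export the plan package from dev (GET)
run "curl -H 'Authorization: Bearer $TOKEN' -o plan.zip $API/plans/$PLAN_ID/package"

# 2) Import the package into prod (POST) -- use a prod-scoped token here
run "curl -X POST -H 'Authorization: Bearer $TOKEN' -F file=@plan.zip $API/plans/package"

# 3) Re-point the input dataset at the prod bucket (PUT)
run "curl -X PUT -H 'Authorization: Bearer $TOKEN' -H 'Content-Type: application/json' -d '{\"bucket\": \"prod-bucket\"}' $API/importedDatasets/$DATASET_ID"

# 4) Update the output to the prod BigQuery dataset (PATCH)
run "curl -X PATCH -H 'Authorization: Bearer $TOKEN' -H 'Content-Type: application/json' -d '{\"dbName\": \"prod-dataset-bigquery\"}' $API/outputObjects/$OUTPUT_ID"
```

Once the IDs and endpoints are confirmed, setting `DRY_RUN=0` executes the calls; keeping them behind a dry-run flag makes it safe to review the exact requests before touching `prod`.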