I am new to Google Cloud. I have the files below in GCS and need to design a Dataflow pipeline that merges them, replaces the IDs with values from the product and location files, and loads the final output to BigQuery.
- gs://testprojectxxxx/staging/actual_file.csv
- gs://testprojectxxxx/staging_timestamp/product.csv
- gs://testprojectxxxx/staging_timestamp/location.csv
Here is the Python code that does this on my local machine:
```python
import pandas as pd

# actual_file.csv holds the fact rows; product.csv and location.csv are lookups
df1 = pd.read_csv("C:/Users/xxxx/actual_file.csv")
df2 = pd.read_csv("C:/Users/xxxx_folder/product.csv", header=None, names=['id', 'product_name'])

# Left join to pull in product_name, then drop the join keys.
# (No columns collide, so pandas adds no _x/_y suffixes here.)
df3 = pd.merge(df1, df2, how='left', left_on='product_id', right_on='id')
df3.drop(['product_id', 'id'], axis=1, inplace=True)

df4 = pd.read_csv("C:/Users/xxxx_folder/location.csv", header=None, names=['id', 'location_name'])
df5 = pd.merge(df3, df4, how='left', left_on='location_id', right_on='id')
df5.drop(['location_id', 'id'], axis=1, inplace=True)

df5.rename(columns={'location_name': 'location'}, inplace=True)
df5.to_csv('Final_file.csv', sep=',', encoding='utf-8', index=False)
```
Appreciate your help.
To join these rows you want to use GroupByKey or CoGroupByKey:
https://beam.apache.org/releases/pydoc/2.8.0/apache_beam.transforms.core.html#apache_beam.transforms.core.GroupByKey
Check out Section 4.2.3 in the docs https://beam.apache.org/documentation/programming-guide/#core-beam-transforms
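To see what CoGroupByKey gives you, here is a minimal, Beam-free model of its semantics: you key both collections by `product_id`, and for each key you get a dict of lists, one list per tagged input, from which you emit the joined rows. The record and column names are assumptions based on your pandas code; in a real pipeline the dict-building step is replaced by `beam.CoGroupByKey()`.

```python
from collections import defaultdict

def cogroup(rows, products):
    """Model of Beam's CoGroupByKey: both inputs are (key, value) pairs;
    the output maps each key to {'rows': [...], 'products': [...]}."""
    grouped = defaultdict(lambda: {'rows': [], 'products': []})
    for key, row in rows:
        grouped[key]['rows'].append(row)
    for key, name in products:
        grouped[key]['products'].append(name)
    return grouped

def join(grouped):
    """Emit each row with product_name filled in (None when unmatched,
    i.e. a left join, matching the pandas how='left' above)."""
    for _, g in grouped.items():
        name = g['products'][0] if g['products'] else None
        for row in g['rows']:
            yield {**row, 'product_name': name}

# Both collections keyed by product_id, as you would key the PCollections:
rows = [('p1', {'location_id': 'l1'}), ('p2', {'location_id': 'l2'})]
products = [('p1', 'Widget')]
result = list(join(cogroup(rows, products)))
# 'p1' rows get product_name='Widget'; 'p2' has no match, so None
```

You would then repeat the same pattern keyed by `location_id` against the location lookup, and finish with `beam.io.WriteToBigQuery` instead of writing a CSV.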