Why is my PipelineWise ETL not inserting rows into Snowflake?


I'm using the PipelineWise ETL utility, which implements the Singer.io spec. Data is making it from my Postgres instance -> S3 -> Snowflake stage, but it is never inserted into the Snowflake tables. Anyone have any ideas?

I've tried:

  • Running sync_tables
  • Deleting ~/.pipelinewise and removing the Docker volumes for the pipelinewise container
  • Dropping all the Snowflake tables

The tables get recreated, but they're empty...
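To rule out looking in the wrong place, a direct row count against one of the recreated tables also comes back as zero. A minimal check with the Python Snowflake connector; all connection values below are placeholders:

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder connection details -- substitute real values
conn = snowflake.connector.connect(
    account="xy12345",
    user="etl_user",
    password="********",
    database="STAGING_DB",
    schema="BRAND",
    warehouse="LOAD_WH",
)

with conn.cursor() as cur:
    # Table name taken from the fastsync log below (minus the _TEMP suffix)
    cur.execute('SELECT COUNT(*) FROM BRAND."TREND_BRAND_PROFILE_QUESTION_MAPPING"')
    print(cur.fetchone()[0])  # prints 0

conn.close()
```

Here's the fastsync log output from the run: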

time=2021-04-07 23:56:09 logger_name=pipelinewise.fastsync.commons.target_snowflake log_level=INFO message=Uploading to S3 bucket: sightly-staging-pipelinewise, local file: /root/.pipelinewise/tmp/pipelinewise_postgres-staging-tap_brand.trend_brand_profile_question_mapping_20210407-235606-434637_fastsync_7NZCNBPA.csv.gz, S3 key: snowflake-imports/pipelinewise_postgres-staging-tap_brand.trend_brand_profile_question_mapping_20210407-235606-434637_fastsync_7NZCNBPA.csv.gz
time=2021-04-07 23:56:16 logger_name=pipelinewise.fastsync.commons.target_snowflake log_level=INFO message=Loading snowflake-imports/pipelinewise_postgres-staging-tap_brand.trend_brand_profile_question_mapping_20210407-235606-434637_fastsync_7NZCNBPA.csv.gz into Snowflake...
time=2021-04-07 23:56:19 logger_name=pipelinewise.fastsync.commons.target_snowflake log_level=INFO message=Loading into BRAND."TREND_BRAND_PROFILE_QUESTION_MAPPING_TEMP": {"inserts": 0, "updates": 0, "size_bytes": 147}
time=2021-04-07 23:56:19 logger_name=pipelinewise.fastsync.commons.target_snowflake log_level=INFO message=Deleting snowflake-imports/pipelinewise_postgres-staging-tap_brand.trend_brand_profile_question_mapping_20210407-235606-434637_fastsync_7NZCNBPA.csv.gz from S3...

Note the output:

{"inserts": 0, "updates": 0, "size_bytes": 147}
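Snowflake's load history should also show whether the COPY saw any rows or hit parse errors. COPY_HISTORY is a standard INFORMATION_SCHEMA table function, so something like this (same placeholder connection details as in the first sketch) should tell the two cases apart:

```python
import snowflake.connector

# Same placeholder connection details as in the first sketch
conn = snowflake.connector.connect(
    account="xy12345",
    user="etl_user",
    password="********",
    database="STAGING_DB",
    schema="BRAND",
)

with conn.cursor() as cur:
    # Rows parsed/loaded and the first error reported for each staged file
    cur.execute("""
        SELECT file_name, status, row_parsed, row_count, first_error_message
        FROM TABLE(information_schema.copy_history(
            TABLE_NAME => 'TREND_BRAND_PROFILE_QUESTION_MAPPING_TEMP',
            START_TIME => DATEADD(hours, -24, CURRENT_TIMESTAMP())
        ))
    """)
    for row in cur:
        print(row)

conn.close()
```

A zero row_parsed would point at an empty export; a non-zero row_parsed alongside an error message would point at the load itself.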
