I'm very new to the Google Cloud CLI, so sorry for the dumb question, but this is really annoying.
I'm trying to execute this command:
bq mk --transfer_config --target_dataset=ds_scrummate --display_name='map_email' --data_source=amazon_s3 --params='{"data_path_template":"s3://sm-bigquery/map_email/{run_date}/*.csv", "destination_table_name_template":"ing_map_email", "file_format":"CSV", "max_bad_records":"0", "ignore_unknown_values":"true", "field_delimiter":",", "skip_leading_rows":"1", "allow_quoted_newlines":"true", "allow_jagged_rows":"false", "delete_source_files":"true"}'
I always get this error message back:
BigQuery error in mk operation: Parameters should be specified in JSON format
when creating the transfer configuration.
The JSON above seems to be valid (I've tested it with an online validator). I've also tried the multiline version of this command, with the same result.
Questions:
- What am I doing wrong? Why am I getting a JSON error here, despite the content passing validation?
- How could I get a more detailed error? Is there a verbose flag for this command that I've missed?
- Why isn't there any good documentation that describes this type of error in detail?
Sources I've used:
- https://cloud.google.com/bigquery/docs/reference/bq-cli-reference#bq_mk
- https://cloud.google.com/bigquery-transfer/docs/working-with-transfers
- https://cloud.google.com/bigquery-transfer/docs/s3-transfer#bq
- https://cloud.google.com/bigquery/docs/bq-command-line-tool
The online parsers do report a valid JSON structure, but it could be that the shell is not passing {run_date} through as a plain string — it may be trying to expand it — or that the comma in "field_delimiter":"," needs to be escaped.

For more detailed output, --apilog <filename> writes debug logs to a file; it is a global flag, so it goes before the subcommand (bq --apilog=debug.log mk ...).

If my suspicion is correct, the error message points in the right direction: we need to verify what's going wrong in the JSON as bq actually receives it, and the debug logs may name the exact section that wasn't parseable.
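One quick way to test this suspicion is to run the exact --params payload through a JSON parser locally. A minimal sketch (the payload below is copied from the question's command):

```python
import json

# The --params value exactly as written in the bq mk command.
params = (
    '{"data_path_template":"s3://sm-bigquery/map_email/{run_date}/*.csv", '
    '"destination_table_name_template":"ing_map_email", "file_format":"CSV", '
    '"max_bad_records":"0", "ignore_unknown_values":"true", '
    '"field_delimiter":",", "skip_leading_rows":"1", '
    '"allow_quoted_newlines":"true", "allow_jagged_rows":"false", '
    '"delete_source_files":"true"}'
)

parsed = json.loads(params)
print(parsed["data_path_template"])  # {run_date} is just literal text inside the string
print(parsed["field_delimiter"])     # the bare comma is a legal one-character value
```

If json.loads accepts the string (it does here: {run_date} and the comma are ordinary characters inside JSON strings), then the payload itself is fine, and the corruption would have to happen between the shell and bq — i.e. in how the shell handles the single quotes — rather than in the JSON content.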