I'm very new to the Google Cloud CLI, so sorry for the dumb question, but this is really annoying.
I'm trying to execute this command:
bq mk --transfer_config --target_dataset=ds_scrummate --display_name='map_email' --data_source=amazon_s3 --params='{"data_path_template":"s3://sm-bigquery/map_email/{run_date}/*.csv", "destination_table_name_template":"ing_map_email", "file_format":"CSV", "max_bad_records":"0", "ignore_unknown_values":"true", "field_delimiter":",", "skip_leading_rows":"1", "allow_quoted_newlines":"true", "allow_jagged_rows":"false", "delete_source_files":"true"}'
I always get this error message back:
BigQuery error in mk operation: Parameters should be specified in JSON format
when creating the transfer configuration.
The JSON above seems to be valid (I've tested it with an online validator). I've also tried the multiline version of this command, with the same result.
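One way to narrow this down is to check what the shell actually delivers as the argument, rather than what the editor shows. A sketch (using a trimmed-down params string as a stand-in for the real one): pipe the exact quoted text through a JSON parser; if the shell mangles the quotes, this fails the same way bq does.

```shell
# Parse the argument text exactly as quoted on the command line;
# python3 -m json.tool exits non-zero on invalid JSON.
printf '%s' '{"file_format": "CSV", "skip_leading_rows": "1"}' \
  | python3 -m json.tool
```

If this prints the pretty-printed object, the quoting survives in that shell; if it errors, the shell is altering the string before bq ever sees it.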
Questions:
- What am I doing wrong? Why am I getting a JSON error here, despite the content validating?
- How could I get a more detailed error? Is there a verbose flag for this command that I couldn't find?
- Why isn't there any good documentation that describes this type of error in detail?
Sources I've used:
- https://cloud.google.com/bigquery/docs/reference/bq-cli-reference#bq_mk
- https://cloud.google.com/bigquery-transfer/docs/working-with-transfers
- https://cloud.google.com/bigquery-transfer/docs/s3-transfer#bq
- https://cloud.google.com/bigquery/docs/bq-command-line-tool
Unfortunately, the error message from the Cloud SDK is not very helpful. I got the same message ("Parameters should be specified in JSON format") when trying to create a scheduled query in PowerShell.
To find the error, I modified "bq\bigquery_client.py" to print the exception raised by json.loads(params). The real error was "Expecting property name enclosed in double quotes". This led me to the solution: in PowerShell, the property names and values inside the params have to be wrapped in doubled double quotes, for example: --params='{ ""query"": ""test"" }'
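A sketch of why this happens, assuming PowerShell's native-argument quoting strips the inner double quotes before bq sees them (the doubled quotes `""` are PowerShell's escape for a literal `"` inside a quoted argument):

```python
import json

# What bq effectively receives when the inner quotes are stripped:
# the property name arrives bare, so json.loads() rejects it.
mangled = '{query: test}'
try:
    json.loads(mangled)
except json.JSONDecodeError as e:
    print(e.msg)  # -> Expecting property name enclosed in double quotes

# With the quotes doubled in PowerShell, bq receives a properly
# quoted JSON object and parsing succeeds.
fixed = '{ "query": "test" }'
print(json.loads(fixed))  # -> {'query': 'test'}
```

This reproduces the exact exception text that the modified bigquery_client.py surfaced, which is what points at quoting rather than the JSON content itself.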