BigQuery Dataset - Uploaded rows with Scheduled query

I have a scheduled query running every 24 hours that loads data into a BigQuery table (the data describes problems from the previous day). Additionally, I want an alert whenever the daily partition of that table is not empty. For this I created an alert on the BigQuery Dataset - Uploaded rows metric. Unfortunately, this metric behaves as if no data was added to my dataset. Does that mean data added by a scheduled query is not counted there? If so, how can I create an email alert (sent not to myself, but to some other address) when a BQ daily partition of a table contains any rows?
28 Views · Asked by Maciej Jankowski
There is 1 answer below.
Scheduled or not, the data will be counted towards all metrics.
In BigQuery, load jobs are "batch" jobs. You can insert as much data as you like, but it is only processed when resources are available in the BigQuery backend. A job may sometimes have to wait, for example while the worker pool is resizing or is full.
For the same reason, your data may not be immediately accessible, and some metrics may not have updated yet. The more data you are loading, the longer it may take to show up.
Since I usually work with very large amounts of data, as a standard practice I add a few minutes of sleep before I access that data again. In addition, you can check the endTime of the job to see whether it has completely finished.
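Checking the job's completion can be sketched with the google-cloud-bigquery client library; the `job_id`, `location`, and polling interval below are placeholder assumptions, not values from the question:

```python
# Sketch: poll a BigQuery job until it reports DONE, assuming the
# google-cloud-bigquery client library is installed and authenticated.
import time


def is_done(state: str) -> bool:
    """A BigQuery job reports one of PENDING, RUNNING, DONE."""
    return state == "DONE"


def wait_for_job(client, job_id: str, location: str = "US", poll_secs: int = 30):
    """Re-fetch the job until it is DONE, then return it.

    job.ended (the endTime mentioned above) is only populated once the
    job has actually finished; job.error_result is set if it failed.
    """
    while True:
        job = client.get_job(job_id, location=location)
        if is_done(job.state):
            if job.error_result:
                raise RuntimeError(f"Job failed: {job.error_result}")
            return job  # job.ended now holds the completion timestamp
        time.sleep(poll_secs)
```

Only after `wait_for_job` returns would it be safe to assume the partition's metrics and row counts reflect the load.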
As far as custom email alerts, or checking preconditions such as whether a partition is empty, are concerned, you have to manage those yourself in your own program (unless you use a BigQuery workflow platform such as Magnus, https://potens.io/products/#magnus).
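A do-it-yourself version of that check could look like the sketch below. It assumes an ingestion-time-partitioned table (hence the `_PARTITIONTIME` pseudo-column); the table name, recipient, sender, and SMTP host are all hypothetical placeholders:

```python
# Sketch: count rows in one daily partition and email an alert if it is
# non-empty. Assumes google-cloud-bigquery for the query and plain
# smtplib for mail; all names below are placeholders.
import smtplib
from email.message import EmailMessage


def partition_count_sql(table: str, day: str) -> str:
    """COUNT(*) over one daily partition of an ingestion-time-partitioned table."""
    return (
        f"SELECT COUNT(*) AS n FROM `{table}` "
        f"WHERE DATE(_PARTITIONTIME) = '{day}'"
    )


def alert_if_nonempty(client, table: str, day: str, recipient: str) -> None:
    rows = client.query(partition_count_sql(table, day)).result()
    n = next(iter(rows)).n
    if n > 0:
        msg = EmailMessage()
        msg["Subject"] = f"{table}: {n} problem rows for {day}"
        msg["From"] = "alerts@example.com"          # placeholder sender
        msg["To"] = recipient                       # any address, not just your own
        msg.set_content(f"Partition {day} of {table} contains {n} rows.")
        with smtplib.SMTP("smtp.example.com") as s:  # placeholder SMTP host
            s.send_message(msg)
```

Since it sends to whatever `recipient` you pass in, this sidesteps the limitation the question ran into, at the cost of scheduling the script yourself (e.g. with Cloud Scheduler or cron) right after the daily load.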