Here's my setup:
- I am using bunyan for logging in my nodejs server (https://cloud.google.com/logging/docs/samples/logging-bunyan-quickstart)
- I have created a log router to route my bunyan logs to a BigQuery dataset as the sink
- My log router has a simple inclusion filter that filters the log entries that go into the BQ dataset
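For reference, my logging setup looks roughly like the sketch below, following the quickstart linked above (names like `my-service` are placeholders for my actual values):

```javascript
// Minimal sketch of my bunyan + Cloud Logging setup, per the quickstart.
const bunyan = require('bunyan');
const {LoggingBunyan} = require('@google-cloud/logging-bunyan');

// Stream that ships bunyan records to Cloud Logging.
const loggingBunyan = new LoggingBunyan();

const logger = bunyan.createLogger({
  name: 'my-service', // placeholder; shows up in the logName in Cloud Logging
  streams: [
    {stream: process.stdout, level: 'info'}, // local console output
    loggingBunyan.stream('info'),            // sends entries to Cloud Logging
  ],
});

// Entries like this one match my inclusion filter and should be routed
// by the log router to the BigQuery sink.
logger.info('request handled');
```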
My problem: I am unable to figure out how to make my BQ dataset update automatically as new log entries are routed to the sink.
My log router sink is visible here: https://console.cloud.google.com/logs/router?project= with a "Last updated" timestamp. I am certain that new log entries matching the filter have been created, but the "Last updated" time is unchanged and the entries do not appear in my BQ dataset.
I went through the documentation and checked every setting I could think of, but I haven't found a way to make the logs show up in the BQ dataset automatically. The documentation says a delay of roughly one minute is expected, yet no new logs appear even after waiting much longer.
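Here is what I have checked so far from the CLI (the sink name and filter below are placeholders for my actual values):

```shell
# Confirm the sink's destination, inclusion filter, and writer identity.
# 'my-bq-sink' is a placeholder for my actual sink name.
gcloud logging sinks describe my-bq-sink

# Confirm that recent entries actually match the sink's inclusion filter;
# replace the filter string with the sink's real filter expression.
gcloud logging read 'severity>=INFO' --limit=5
```

Both commands return what I expect: the sink points at my BQ dataset, and the read shows recent matching entries, yet nothing lands in BigQuery.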
What am I missing?