Can I see log output in the terminal when using functions-framework and Google Cloud Logging Python v3.0.0


I am writing a new cloud function and am using the new Google Cloud Logging library as announced at https://cloud.google.com/blog/products/devops-sre/google-cloud-logging-python-client-library-v3-0-0-release.

I am also using functions-framework to debug my code locally before pushing it to GCP. Setup and Invoke Cloud Functions using Python has been particularly useful here.

The problem I have is that when using these two things together I cannot see logging output in my IDE; I can only see print statements. Here's some sample code that does NOT use google.cloud.logging. This DOES successfully cause logging output to appear in my terminal:

from flask import Request
from google.cloud import bigquery
from datetime import datetime
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def main(request) -> str:
    #
    # do stuff to setup a bigquery job
    #
    bq_client = bigquery.Client()

    job_config = bigquery.QueryJobConfig(labels={"key": "value"})
    nowstr = datetime.now().strftime("%Y%m%d%H%M%S%f")
    job_id = f"qwerty-{nowstr}"

    query_job = bq_client.query(
        # export_script is built in the setup above (not shown)
        query=export_script, job_config=job_config, job_id=job_id
    )
    print("Started job: {}".format(query_job.job_id))
    query_job.result()  # Waits for job to complete.
    logging.info(f"job_id={query_job.job_id}")
    logging.info(f"total_bytes_billed={query_job.total_bytes_billed}")

    return f"{query_job.job_id} {query_job.state} {query_job.error_result}"

Here is the output:

Started job: qwerty-20220306211233889260
INFO:root:job_id=qwerty-20220306211233889260
INFO:root:total_bytes_billed=31457280

As you can see, both the call to print(...) and the calls to logging.info(...) have produced output in my terminal.

If I change the code to also use google.cloud.logging:

from flask import Request
from google.cloud import bigquery
from datetime import datetime
import google.cloud.logging
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)
log_client = google.cloud.logging.Client()
log_client.setup_logging()

def main(request) -> str:
    #
    # do stuff to setup a bigquery job
    #
    bq_client = bigquery.Client()

    job_config = bigquery.QueryJobConfig(labels={"key": "value"})
    nowstr = datetime.now().strftime("%Y%m%d%H%M%S%f")
    job_id = f"qwerty-{nowstr}"

    query_job = bq_client.query(
        # export_script is built in the setup above (not shown)
        query=export_script, job_config=job_config, job_id=job_id
    )
    print("Started job: {}".format(query_job.job_id))
    query_job.result()  # Waits for job to complete.
    logging.info(f"job_id={query_job.job_id}")
    logging.info(f"total_bytes_billed={query_job.total_bytes_billed}")

    return f"{query_job.job_id} {query_job.state} {query_job.error_result}"

Then I don't see the logging output in the terminal:

Started job: provider-egress-hastings-20220306211718088936

Is there a way to redirect logging output to my terminal when running locally under functions-framework with google.cloud.logging, without affecting logging when the function runs as an actual cloud function in GCP?

1 Answer
I'm using an environment variable to configure a console handler when testing locally:

import logging
import os

import google.cloud.logging

log_client = google.cloud.logging.Client()
log_client.setup_logging(log_level=logging.DEBUG)
logger = logging.getLogger()
if os.getenv("LOCAL_LOGGING", "False") == "True":
    # Also output logs to the console - otherwise logs are only visible
    # when running in GCP, because setup_logging() leaves the root logger
    # with only the Cloud Logging handler.
    logger.addHandler(logging.StreamHandler())