Google Cloud Function Connection Error when Deployed but Works in Inline Editor


I have a Google Cloud Function written in Python that exports data from BigQuery to Google Cloud Storage. The function works fine when I test it from the inline editor in the Google Cloud console, but when I deploy it and trigger it via a Pub/Sub topic that a Cloud Scheduler job publishes to, it fails with a connection error, and no error messages appear in the Cloud Functions logs.
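
For reference, this is a first-generation background function, so (as far as I understand) the Pub/Sub trigger delivers the Cloud Scheduler message base64-encoded under event['data']; my function does not use the payload at all. A minimal sketch of how that payload could be read, just for illustration (log_pubsub_payload is not part of my actual code):

import base64

def log_pubsub_payload(event):
    # Background Pub/Sub triggers pass the message body base64-encoded
    # under event['data']; the Cloud Scheduler message content is arbitrary.
    payload = base64.b64decode(event['data']).decode('utf-8') if 'data' in event else ''
    print(f"Triggered with Pub/Sub payload: {payload!r}")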

I have thoroughly checked IAM permissions and billing status, and everything seems to be in order. Below is the code snippet of my function:

from google.cloud import bigquery
from google.cloud import storage
import csv
import io


def hello_pubsub(event, context):
    table_id = "example.eafineri.firfm"

    # Read the full table from BigQuery.
    bq_client = bigquery.Client()
    table = bq_client.get_table(table_id)
    rows = bq_client.list_rows(table)

    # Build a pipe-delimited CSV in memory: a header row from the schema,
    # then one line per table row.
    csv_data = io.StringIO()
    writer = csv.writer(csv_data, delimiter='|')
    header_row = [field.name for field in table.schema]
    writer.writerow(header_row)

    for row in rows:
        writer.writerow([str(value) for value in row])

    # Upload the CSV to Cloud Storage.
    bucket_name = 'test_test'
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob('output1.csv')
    blob.upload_from_string(csv_data.getvalue(), content_type='text/csv')

    print("File uploaded successfully.")

Again, the IAM permissions are set correctly and billing is enabled, yet the function still fails with a connection error whenever it is triggered via Pub/Sub. Any insights into why this might be happening would be greatly appreciated. Thanks in advance!
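
In case it helps, since nothing shows up in the logs I'm considering wrapping the function body in a try/except so that the exception and traceback are written to Cloud Logging rather than lost. A rough sketch of that idea (export_table_to_gcs is only a stand-in for the BigQuery/Cloud Storage code above, not a real helper in my project):

import logging
import traceback

def export_table_to_gcs():
    # Stand-in for the BigQuery -> Cloud Storage export shown above.
    raise NotImplementedError

def hello_pubsub(event, context):
    try:
        export_table_to_gcs()
    except Exception:
        # Write the full traceback to Cloud Logging instead of failing silently.
        logging.error("Export failed:\n%s", traceback.format_exc())
        raise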
