How can I determine the type of a calculated field in a query I'm running in a Kaggle notebook?


I'm doing a project on Kaggle with BigQuery, but I'm getting a warning.

My query runs perfectly fine in BigQuery itself but raises this warning in my Kaggle notebook:

/opt/conda/lib/python3.10/site-packages/google/cloud/bigquery/_pandas_helpers.py:244: UserWarning: Unable to determine type for field 'average_ride_length'.
  warnings.warn("Unable to determine type for field '{}'.".format(bq_field.name))

It gives me the result in days and milliseconds. Is there a way to get it in days and hours, or just in hours? This is part of my query:

# Query 1
query1 = """ 
SELECT
  COUNT(ride_id) AS total_of_rides,
  AVG(ended_at-started_at) AS average_ride_length,
  MAX(ended_at-started_at) AS max_ride_length,
  MIN(ended_at-started_at) AS min_ride_length
FROM
(
SELECT * FROM cyclistic_data.january
UNION ALL
SELECT * FROM cyclistic_data.february
UNION ALL
SELECT * FROM cyclistic_data.march
UNION ALL
SELECT * FROM cyclistic_data.april
UNION ALL
SELECT * FROM cyclistic_data.may
UNION ALL
SELECT * FROM cyclistic_data.june
UNION ALL
SELECT * FROM cyclistic_data.july
UNION ALL
SELECT * FROM cyclistic_data.august
UNION ALL
SELECT * FROM cyclistic_data.september
UNION ALL
SELECT * FROM cyclistic_data.october
UNION ALL
SELECT * FROM cyclistic_data.november
UNION ALL
SELECT * FROM cyclistic_data.december
)
"""
# Set up the query (`client` is the BigQuery client created earlier in the notebook)
query_job1 = client.query(query1)

# API request - run the query and return a pandas DataFrame
data1 = query_job1.to_dataframe()

# See the resulting table
print(data1)
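
What I was considering as a workaround (not verified): I suspect the warning shows up because ended_at - started_at produces an INTERVAL, which the BigQuery-to-pandas conversion doesn't seem to know how to map. If ended_at and started_at are TIMESTAMP columns (DATETIME_DIFF would be the equivalent if they are DATETIME), TIMESTAMP_DIFF should give me an INT64 number of seconds instead, which I could then turn into days/hours myself. Here is a minimal sketch of what I mean, querying just one month to keep it short; the pandas conversion at the end is my guess at how to format the result:

# Sketch (unverified): compute the ride length as INT64 seconds with TIMESTAMP_DIFF
# instead of subtracting the timestamps directly, then convert to days/hours in pandas
import pandas as pd

query_seconds = """
SELECT
  COUNT(ride_id) AS total_of_rides,
  AVG(TIMESTAMP_DIFF(ended_at, started_at, SECOND)) AS avg_ride_seconds,
  MAX(TIMESTAMP_DIFF(ended_at, started_at, SECOND)) AS max_ride_seconds,
  MIN(TIMESTAMP_DIFF(ended_at, started_at, SECOND)) AS min_ride_seconds
FROM cyclistic_data.january
"""

# Same client object as above
seconds_df = client.query(query_seconds).to_dataframe()

# A Timedelta prints as "days hh:mm:ss"; dividing by 3600 gives plain hours
seconds_df["average_ride_length"] = pd.to_timedelta(seconds_df["avg_ride_seconds"], unit="s")
seconds_df["average_ride_hours"] = seconds_df["avg_ride_seconds"] / 3600
print(seconds_df)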