The BigQuery Storage API has a 6-hour session timeout (https://cloud.google.com/bigquery/docs/reference/storage/rpc/google.cloud.bigquery.storage.v1beta2#bigqueryread). When reading hundreds of TBs of data with the maximum of 1,000 allowed streams, 6 hours is not enough time to read it all.
Given these limitations, is there a scalable way to read large datasets from BigQuery via the BQ Storage API? If not, are there any alternatives that do scale for reading large datasets from BigQuery?
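To show why the 6-hour limit binds, here is a rough capacity estimate. The per-stream throughput figure is an assumption for illustration only, not a documented limit; actual throughput depends on the table, data format, and the consumer:

```python
# Back-of-the-envelope: how much data can one read session move before
# the 6-hour session lifetime expires?
SESSION_LIFETIME_S = 6 * 3600   # documented 6-hour session timeout
MAX_STREAMS = 1000              # maximum allowed streams per session
PER_STREAM_MB_S = 10            # ASSUMED sustained throughput per stream

max_bytes = SESSION_LIFETIME_S * MAX_STREAMS * PER_STREAM_MB_S * 1024**2
max_tb = max_bytes / 1024**4

print(f"Upper bound per session: ~{max_tb:.0f} TB")
```

Under that assumption a single session tops out at roughly 200 TB, so a dataset of several hundred TBs cannot be drained before the session expires.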
Your use case is not yet covered.
Under the current limits, you can use at most 1,000 streams per read session, and a session expires 6 hours after creation.
If you have a project and use case where you cannot finish with 1,000 streams in 6 hours, I would suggest contacting your Google Cloud representative and asking whether some of the above quotas can be raised for your project.
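Short of a quota increase, one possible workaround is to split the read across several sessions (for example, one session per date partition) so that each session finishes within its 6-hour lifetime. A minimal sketch of the sizing arithmetic, where the per-stream throughput is again an assumed figure:

```python
import math

def sessions_needed(total_tb, per_stream_mb_s=10, streams=1000,
                    lifetime_s=6 * 3600):
    """Estimate how many back-to-back read sessions a dataset needs,
    given an ASSUMED sustained per-stream throughput in MB/s."""
    # Data one session can move before its 6-hour lifetime expires, in TB
    per_session_tb = lifetime_s * streams * per_stream_mb_s * 1024**2 / 1024**4
    return math.ceil(total_tb / per_session_tb)

# e.g. a hypothetical 500 TB table, sharded by partition into one
# read session per shard
print(sessions_needed(500))  # → 3
```

Whether this works in practice depends on whether your table can be cleanly partitioned and on your actual per-stream throughput, so treat the numbers as an estimate, not a guarantee.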