Running multiple DB connections in parallel from Jupyter Lab


I'm trying to connect to multiple data sources: Postgres, Snowflake, and MySQL. Each of these sources has tables I need to ingest in order to build my models correctly. I know I can connect to each of them individually with SQLAlchemy or JupySQL. Is there a way to generalize my code so that I instantiate each connection once, then pass the query together with the connection in a loop to pull the different tables and merge them locally afterwards?

Thanks!

A single connection:

# install JupySQL and load its SQL magic
%pip install jupysql --quiet
%load_ext sql
# open the connection, then store the query result in a DataFrame
%sql postgresql://username:password@host:port/database
%sql df << SELECT * FROM my_table;
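One way to generalize this is to drop down to plain SQLAlchemy and loop over (connection URL, query) pairs, reading each result into a pandas DataFrame. Below is a minimal sketch; the `fetch_tables` helper and all connection URLs are hypothetical placeholders, and the Snowflake and MySQL URLs assume the `snowflake-sqlalchemy` and `pymysql` driver packages are installed:

```python
import pandas as pd
from sqlalchemy import create_engine

def fetch_tables(sources):
    """Run one query per source and return the results keyed by name.

    sources: dict mapping a label to a (connection_url, query) pair.
    Each engine is created once and disposed after its query runs.
    """
    frames = {}
    for name, (url, query) in sources.items():
        engine = create_engine(url)
        try:
            frames[name] = pd.read_sql(query, engine)
        finally:
            engine.dispose()  # release pooled connections
    return frames

# Placeholder URLs -- substitute real credentials and drivers.
sources = {
    "postgres": ("postgresql://user:pw@host:5432/db", "SELECT * FROM my_table"),
    "snowflake": ("snowflake://user:pw@account/db/schema", "SELECT * FROM my_table"),
    "mysql": ("mysql+pymysql://user:pw@host:3306/db", "SELECT * FROM my_table"),
}
# frames = fetch_tables(sources)
# merged = frames["postgres"].merge(frames["mysql"], on="id")
```

After the loop, the individual DataFrames can be combined locally with `pandas.merge` or `pandas.concat`, whichever fits how the tables relate.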