I'm seeking guidance on how to create a continuous and timely process in my organization. Currently, I have a Python notebook that extracts data from a database and performs various processing tasks on the extracted tables. The processed data is then exported to a tabular file.
Now, my goal is to establish an interface where the processed tables can be easily accessed as a data source in Tableau. I want to ensure that the data displayed on the Tableau dashboard is up-to-date and reflects the latest processing results, on a predefined schedule.
Here are the key points:
Running a Python notebook: I have a Python notebook that extracts data from a database and performs necessary processing tasks on the extracted tables.
Exporting processed data: The processed tabular data is exported to a file, and I would like to utilize Tableau to visualize this data effectively.
Creating a Tableau interface: I need help exposing the processed tables as a data source in Tableau. The interface should provide up-to-date information based on the schedule of the process.
Timely updates: It is crucial that the code runs at specific intervals, completes the processing tasks, and updates the Tableau dashboard with the most recent processed information.
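For reference, the extract → process → export flow in the notebook looks roughly like this (a minimal sketch using pandas and SQLite; the database path, table name, column names, and output path are placeholders for my actual setup):

```python
import sqlite3

import pandas as pd


def run_pipeline(db_path="sales.db", out_path="processed_orders.csv"):
    """Extract a table, process it, and export a tabular file for Tableau."""
    # 1. Extract: pull the raw table from the database
    #    ("orders" is a placeholder table name)
    conn = sqlite3.connect(db_path)
    df = pd.read_sql("SELECT * FROM orders", conn)
    conn.close()

    # 2. Process: placeholder transformation (aggregate amounts per customer)
    processed = df.groupby("customer", as_index=False)["amount"].sum()

    # 3. Export: write a tabular file that Tableau can point at as a data source
    processed.to_csv(out_path, index=False)
    return processed
```

This is the step I would like to run at fixed intervals (e.g. via cron or a similar scheduler) so that the file Tableau reads always holds the latest results.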
I would greatly appreciate any suggestions or insights on how to achieve this continuous and timely process. If you have experience with integrating Python notebooks, databases, and Tableau, your input would be particularly valuable.
Thank you in advance for your help!
Any suggestions? Which package or approach would you recommend?
I couldn't find a way to publish a DataFrame directly to Tableau without creating intermediate files such as CSVs.