Refreshing Lakehouse data during a notebook session in Microsoft Fabric


I am running a PySpark script in a notebook in Microsoft Fabric (preview).

The script gets the last modification time of test.csv, which is located in a lakehouse in the same workspace.

The problem is that as soon as the notebook session starts, the lakehouse data no longer refreshes for the script. So even if you replace test.csv, or even delete it, the script only sees the test.csv that existed when the session started.

Is there a way to refresh the data during the session, or get access to the real file?

To get the last modification time, I am using the following code:

import os
last_modification_time = os.stat(file_path).st_mtime

I also tried:

last_modification_time = os.path.getmtime(file_path)
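
One approach I am considering (a minimal sketch, not a confirmed fix) is to query the file status through Spark's Hadoop FileSystem API against the lakehouse's ABFS path instead of the local /lakehouse/default mount, since the Spark layer talks to OneLake directly. The workspace and lakehouse names in the path below are placeholders:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical OneLake path -- replace <workspace> and <lakehouse> with your own names.
abfss_path = "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Files/test.csv"

hadoop_conf = spark._jsc.hadoopConfiguration()
path = spark._jvm.org.apache.hadoop.fs.Path(abfss_path)
fs = path.getFileSystem(hadoop_conf)

# getModificationTime() returns epoch milliseconds as reported by the storage layer.
last_modification_time = fs.getFileStatus(path).getModificationTime() / 1000.0
print(last_modification_time)

I have not verified whether this reflects changes made after the session started, but it avoids the locally mounted file path entirely.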
