Optimize Delta table with small-file compaction error in Azure Synapse


I'm attempting to create a Synapse table in Delta Lake with Spark. The source files are small JSON files that often contain frequent updates for the same item, so I designed the table to keep only the latest update for each item.
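The "keep only the latest update per item" logic can be sketched in plain Python to make the intent concrete. The field names `item_id` and `updated_at` are assumptions for illustration, not taken from the question; in the actual table this would typically be done with a Delta `MERGE` or a window-function dedup instead.

```python
def keep_latest(records):
    """Return one record per item_id, keeping the row with the highest updated_at."""
    latest = {}
    for rec in records:
        key = rec["item_id"]
        # Replace the stored record only if this update is newer.
        if key not in latest or rec["updated_at"] > latest[key]["updated_at"]:
            latest[key] = rec
    return list(latest.values())

updates = [
    {"item_id": "a", "updated_at": 1, "qty": 10},
    {"item_id": "a", "updated_at": 3, "qty": 7},
    {"item_id": "b", "updated_at": 2, "qty": 5},
]
result = keep_latest(updates)  # two rows: the newest "a" and the only "b"
```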

To optimize the Delta table, I run the following command before appending the new records to the existing table.

spark.sql("SET spark.databricks.delta.autoOptimize.optimizeWrite = true")

But I still get a warning notification, which I take to mean that auto-optimize is not working. How do I set the auto-optimize flag correctly?
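One likely cause, worth verifying against the current Azure Synapse documentation: the `spark.databricks.delta.autoOptimize.*` properties are Databricks-specific, and Synapse Spark pools expose the equivalent optimize-write feature under a `spark.microsoft.*` property name instead. A hedged sketch of how the session-level setting might look in Synapse (property name is an assumption based on Microsoft's docs, not confirmed by the question):

```python
# Assumed Synapse-specific flag; the Databricks autoOptimize property
# is typically ignored outside Databricks runtimes.
spark.conf.set("spark.microsoft.delta.optimizeWrite.enabled", "true")

# Equivalent SQL form:
spark.sql("SET spark.microsoft.delta.optimizeWrite.enabled = true")
```

If the warning persists after this change, checking the exact warning text against the Synapse Spark pool version would narrow down whether the property is recognized at all.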

Thanks for the help.

