I'm attempting to create a Delta Lake table in Azure Synapse with Spark. The source files are small JSON files that often contain frequent updates for the same item, so I designed the table to keep only the latest update per item.
To optimize the Delta table, I run the following command before appending the new records to the existing table:
spark.sql("SET spark.databricks.delta.autoOptimize.optimizeWrite = true")
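For context, here is a minimal sketch of my write flow (the table name and storage path are placeholders, not my real ones):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Enable optimized writes before appending (this is the flag in question).
spark.sql("SET spark.databricks.delta.autoOptimize.optimizeWrite = true")

# Source: many small JSON files with frequent updates for the same item.
updates = spark.read.json("abfss://container@account.dfs.core.windows.net/incoming/")

# Append the new records to the existing Delta table.
updates.write.format("delta").mode("append").saveAsTable("my_delta_table")
```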
But I still get a notification, and I'm sure this means auto-optimize is not working. How do I correctly set the auto-optimize flag?
Thanks for the help.
