I am using Databricks Auto Loader. The schema of the incoming data is dynamic, so I need to store the table schema in a file and read it back in Auto Loader when calling readStream.
How can I store the schema in a file and in which format?
Can that file be read through the schema() option, or should it be handled by the "cloudFiles.schemaLocation" option instead?
(spark.readStream.format("cloudFiles")
    .schema("<schema>")
    .option("cloudFiles.schemaLocation", "<path_to_checkpoint>")
    .option("cloudFiles.format", "parquet")
    .load("<path_to_source_data>"))
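For context, this is the kind of approach I had in mind, just a sketch: save the schema once as JSON using StructType's json() method, then rebuild a StructType with StructType.fromJson() before passing it to schema(). The file path "/dbfs/schemas/my_table_schema.json" and the use of a sample batch read are placeholders, not my actual setup.

import json
from pyspark.sql.types import StructType

# One-time step: capture the current schema of the source data and save it as JSON.
# The path below is a placeholder.
sample_df = spark.read.format("parquet").load("<path_to_source_data>")
with open("/dbfs/schemas/my_table_schema.json", "w") as f:
    f.write(sample_df.schema.json())

# On each run: rebuild the StructType from the stored JSON file.
with open("/dbfs/schemas/my_table_schema.json", "r") as f:
    stored_schema = StructType.fromJson(json.load(f))

# Pass the reconstructed schema to readStream via schema().
stream_df = (spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "parquet")
    .option("cloudFiles.schemaLocation", "<path_to_checkpoint>")
    .schema(stored_schema)
    .load("<path_to_source_data>"))

Is something like this the recommended way to handle a dynamic schema, or is there a better pattern?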