Data Mechanics Spark Docker image: example of how to use the connectors built into the image


I came across the Docker image for Spark below. The image also ships with connectors for some popular cloud services. An example of how to use one of the built-in connectors (say, Azure Data Lake Storage Gen2) in a PySpark application would be a great help.

Link to the Docker Hub image: https://hub.docker.com/r/datamechanics/spark

I looked into the example below, but it didn't help much in understanding how to use the connectors that come with the default image: https://github.com/datamechanics/examples/blob/main/pyspark-example/main.py


1 Answer


There is some more documentation at https://docs.datamechanics.co/docs/docker-images, but it is not very helpful for understanding how to actually use the images. The fact that there is no Dockerfile, and no response to reported issues, makes it difficult.

https://g1thubhub.github.io/docker.html looks helpful, although the image versions it uses are older.
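Since the image advertises bundled cloud connectors, a PySpark job should be able to reach ADLS Gen2 by setting the standard Hadoop ABFS configuration keys that the hadoop-azure connector reads. Here is a minimal sketch assuming account-key authentication; the storage account, container, key, and dataset path are placeholders I made up, not values from the image or its docs:

```python
# Sketch of reading ADLS Gen2 data via the ABFS scheme that the
# hadoop-azure connector registers. All Azure names are placeholders.

STORAGE_ACCOUNT = "mystorageaccount"   # placeholder storage account name
CONTAINER = "raw"                      # placeholder container (filesystem) name
ACCOUNT_KEY = "<storage-account-key>"  # placeholder; inject via env var / secret store


def abfss_path(container: str, account: str, path: str) -> str:
    """Build an abfss:// URI for ADLS Gen2 in the standard layout:
    abfss://<container>@<account>.dfs.core.windows.net/<path>"""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{path}"


def main() -> None:
    # Imported here so the URI helper above works without a Spark install.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("adls-gen2-example")
        # Account-key auth: the hadoop-azure jar reads this Hadoop config
        # key when it resolves the abfss:// scheme at read time.
        .config(
            f"fs.azure.account.key.{STORAGE_ACCOUNT}.dfs.core.windows.net",
            ACCOUNT_KEY,
        )
        .getOrCreate()
    )

    # Read a (placeholder) Parquet dataset from the container and preview it.
    df = spark.read.parquet(abfss_path(CONTAINER, STORAGE_ACCOUNT, "data/events"))
    df.show()


# Run main() inside the container, e.g. `spark-submit main.py`.
```

These `fs.azure.account.key.*` settings are plain Hadoop configuration, so they can equally be passed with `--conf spark.hadoop.fs.azure.account.key...` on `spark-submit` instead of in code, which keeps the key out of the application source.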