When I try to import Python libraries at the Spark pool level by applying an uploaded requirements.txt file and custom packages, I get the following error with no further details:
CreateOrUpdateSparkComputeFailed Error occured while processing the request
It was working fine a few days ago; the last successful upload was on 12/3/2021.
Also, the SystemReservedJob-LibraryManagement application job is not getting triggered.
Environment Details:
- Azure Synapse Analytics
- Apache Spark pool - 3.1
We tried the following:
- increased the vCore size up to 200
- uploaded the same packages to a resource in a different subscription, where they work fine
- increased the Spark pool size
Any suggestions would be appreciated.
Thank you
Make sure you have the required packages in your requirements.txt.
Before updating it, check which packages are installed and which are missing. You can list all installed packages by running a few lines of code, compare the output against your requirements, and then add whatever is missing:
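For instance, the installed packages can be listed from a notebook cell with a short Python snippet (a minimal sketch; it relies on `pkg_resources`, which ships with setuptools and is preinstalled in Synapse Spark pools):

```python
# List every installed Python package (name and version), sorted
# alphabetically, so it can be compared against requirements.txt.
import pkg_resources

installed = sorted(
    (dist.project_name, dist.version) for dist in pkg_resources.working_set
)
for name, version in installed:
    print(f"{name}=={version}")
```

Any package that appears in your requirements.txt but not in this output is one the pool failed to install.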
Install the missing libraries by uploading the updated requirements.txt to the Spark pool.
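For reference, a requirements.txt for a Spark pool pins one package per line in `package==version` format. The package names and versions below are placeholders only; substitute the ones your jobs actually need:

```text
# Hypothetical example; pin the exact versions your code requires.
numpy==1.21.2
pandas==1.3.3
```

Pinning exact versions keeps pool updates reproducible and avoids surprise upgrades when the pool is restarted.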
I faced a similar use case and found good information and a step-by-step procedure in the Microsoft Docs on managing workspace libraries; have a look at it.