Spark: While using Speculation property, application is failing randomly with Failed to CREATE_FILE


I have enabled speculative execution in my job. With the property enabled, 8 out of 10 runs complete without any issue, and they complete faster as well. But the other 2 times the application fails with the error message below. The failures are random.

Even though speculative execution is giving us a benefit, because of the random failures I am not in a position to use this property in production. Could you please help me fix this problem?
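For context, this is roughly how speculation is turned on in my job (the script name and tuning values here are illustrative, not the exact production settings):

```shell
# Enable speculative execution: Spark launches duplicate copies of slow
# tasks and keeps whichever finishes first.
spark-submit \
  --conf spark.speculation=true \
  --conf spark.speculation.quantile=0.90 \
  --conf spark.speculation.multiplier=2 \
  my_job.py   # hypothetical job script
```

With speculation on, two attempts of the same task can be running at once, which is why both task attempts in the error below appear on the same host.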

ERROR MESSAGE:

"Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException): 
Failed to CREATE_FILE hdfs://.../.spark-staging-c103afe4-4b1d-4105-8e1a-b67ad2347aef/calendar_date=2021-02-28/iteration=2/part-02528-c103afe4-4b1d-4105-8e1a-b67ad2347aef.c000.snappy.orc 
for DFSClient_NONMAPREDUCE_-321811341_100 on 172.22.48.145 because this file lease is currently owned by DFSClient_NONMAPREDUCE_-537211732_100 on 172.22.48.145"
