Getting an error while writing to Elastic search from spark with custom mapping id


I'm trying to write a DataFrame from Spark to Elasticsearch with a custom mapping id, and when I do I get the error below.

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 14.0 failed 16 times, most recent failure: Lost task 0.15 in stage 14.0 (TID 860, ip-10-122-28-111.ec2.internal, executor 1): org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: [DataFrameFieldExtractor for field [[paraId]]] cannot extract value from entity [class java.lang.String] | instance

Below is the configuration I'm using to write to ES.

var config = Map(
  "es.nodes" -> node,
  "es.port" -> port,
  "es.clustername" -> clustername,
  "es.net.http.auth.user" -> login,
  "es.net.http.auth.pass" -> password,
  "es.write.operation" -> "upsert",
  "es.mapping.id" -> "paraId",
  "es.resource" -> "test/type")

df.saveToEs(config)

I'm using version 5.6 of ES and 2.2.0 of Spark. Let me know if you have any insight on this.
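In case it helps reproduce, here is a minimal sketch of the full write path as I understand it. The SparkSession setup and the two sample rows are my own illustration (the real data is larger); `node`, `port`, `login`, and `password` are placeholders, as in the config above. The key assumption is that `paraId` is a plain top-level string column of the DataFrame, since the error's "cannot extract value from entity [class java.lang.String]" makes me suspect the rows reaching the connector are raw Strings rather than structured rows like these.

```scala
import org.apache.spark.sql.SparkSession
import org.elasticsearch.spark.sql._ // adds saveToEs to DataFrame

val spark = SparkSession.builder().appName("es-upsert-repro").getOrCreate()
import spark.implicits._

// paraId is a top-level column here, which is what es.mapping.id
// needs in order to extract the document id from each row.
val df = Seq(("p1", "first para"), ("p2", "second para"))
  .toDF("paraId", "text")

val config = Map(
  "es.nodes" -> node,
  "es.port" -> port,
  "es.net.http.auth.user" -> login,
  "es.net.http.auth.pass" -> password,
  "es.write.operation" -> "upsert",
  "es.mapping.id" -> "paraId",
  "es.resource" -> "test/type")

df.saveToEs(config)
```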

Thanks!
