I'm trying with both the DataFrame API and the RDD API:
val map = collection.mutable.Map[String, String]()
map("es.nodes.wan.only") = "true"
map("es.port") = "redacted"
map("es.net.http.auth.user") = "redacted"
map("es.net.http.auth.pass") = "redacted"
map("es.net.ssl") = "true"
map("es.mapping.date.rich") = "false"
map("es.read.field.include") = "data_scope_id"
map("es.nodes") = "redacted"
val rdd = sc.esRDD("index name", map)
rdd.take(1)
But whatever I try, I get this error:
EsHadoopIllegalArgumentException: invalid map received dynamic=strict
I've tried limiting the fields being read with es.read.field.include, but even if I choose a single field that I'm sure doesn't have any variants, I still get this error.
How can I work around this? I'd be glad for any advice.
Versions:
- eshadoop-7.13.4
- client Spark 3.1.2
- Scala 2.12
Clarification
This is about reading from Elasticsearch in Spark, not indexing.
So if I understand correctly, your aim is to index the values in map to "index name".

TL;DR: Update the mapping of your index to allow new fields to be indexed. As of now the value of dynamic is strict, which does not allow new fields and throws an exception.

To understand: the issue is with the mapping of your index. There is a setting called dynamic (https://www.elastic.co/guide/en/elasticsearch/reference/current/dynamic.html) on the mapping of your index. I bet it is set to strict, which rejects any document containing a field that is not already present in the mapping and throws an exception. So, my understanding is, you have one or many fields that are new in your document.

Either:
- set dynamic to true, false or runtime according to your needs, or
- explicitly add the new fields to your index mapping.
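As a sketch of the first option: assuming your index is called my-index (a placeholder, substitute your actual index name), the dynamic setting can be changed on an existing index with the update mapping API:

```console
PUT my-index/_mapping
{
  "dynamic": "false"
}
```

With "false", new fields are silently ignored instead of rejected; use "true" if you want them indexed, or "runtime" to map them as runtime fields.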