I'm trying to use a "spark_conf {}" block in my databricks cluster resource in Terraform. This block accepts keys and values, and I would like to add multiple of them. I was able to do it hardcoded, like this:
spark_conf = {
(var.spark_configs[0].key) : var.spark_configs[0].value,
(var.spark_configs[1].key) : var.spark_configs[1].value,
(var.spark_configs[2].key) : var.spark_configs[2].value,
(var.spark_configs[3].key) : var.spark_configs[3].value,
}
Where the variable "spark_configs" is a list of objects.
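For context, the variable is declared along these lines (the attribute names key and value are taken from the hardcoded block above; the default entries here are just illustrative examples, not my actual values):

```hcl
# Each element pairs one Spark setting name with its value.
variable "spark_configs" {
  description = "Spark configuration entries for the Databricks cluster"
  type = list(object({
    key   = string
    value = string
  }))
  # Example values only, for illustration.
  default = [
    { key = "spark.speculation", value = "true" },
    { key = "spark.sql.shuffle.partitions", value = "200" },
  ]
}
```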
This works, but is not really dynamic, so I'm hoping to get a solution that will just loop over the list of objects.
I tried a dynamic block, but that generates a separate "spark_conf" block for every key-value pair in the list. That's not the intention: they should all end up in one block (I think; correct me if I'm wrong).
Any ideas on how to attack this?
Thanks!
Yes, you can use a for expression with a map constructor for this. Note that there is a misconception in the question: spark_conf is not a block; it is an argument that accepts a map type. Therefore a dynamic block could not be used in this situation regardless. The expression would appear like:

spark_conf = { for config in var.spark_configs : config.key => config.value }

Note that if you optimally restructured your variable spark_configs to be a map instead of a list(object), e.g.:

variable "spark_configs" {
  type = map(string)
}

then this would simplify the argument assignment to:

spark_conf = var.spark_configs
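Putting it together, a minimal sketch of the cluster resource with the for expression; the cluster name, Spark version, node type, and other argument values here are assumptions for illustration, not from the question:

```hcl
resource "databricks_cluster" "this" {
  cluster_name            = "example"          # assumed name
  spark_version           = "13.3.x-scala2.12" # assumed runtime
  node_type_id            = "i3.xlarge"        # assumed node type
  num_workers             = 2
  autotermination_minutes = 20

  # Build the whole map in one expression instead of
  # hardcoding each index of var.spark_configs.
  spark_conf = {
    for config in var.spark_configs : config.key => config.value
  }
}
```

If two list elements ever carry the same key, Terraform will raise a duplicate-key error from the for expression, which is usually the behavior you want for configuration maps.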