I am working on an Ansible playbook where I want to read a file on each host and set an environment variable for that host based on some text in that file. I need that environment variable to be available during the entire playbook execution on that host.
From what I have read, if I define `environment:` under a task, it applies only to that task and not to subsequent tasks. Is that correct?
```yaml
- name: Modify server properties
  hosts: kafka_broker
  vars:
    ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"
    ansible_host_key_checking: false
    contents: "{{ lookup('file', '/etc/kafka/secrets/masterkey.txt') }}"
    extract_key: "{{ contents.split('\n').2.split('|').2 | trim }}"
  environment:
    CONFLUENT_KEY: "{{ extract_key }}"
```
This is how I am trying to get the info from each host; I want to set the environment variable per host, but applicable to the entire playbook for that host:
```yaml
- name: Slurp hosts file
  slurp:
    src: /etc/kafka/secrets/masterkey.txt
  register: masterkeyfile

- debug: msg="{{ masterkeyfile['content'] | b64decode }}"

- name: Set masterkeyfilecontent
  set_fact:
    masterkeyfilecontent: "{{ masterkeyfile['content'] | b64decode }}"

- name: Set masterkeyval
  set_fact:
    masterkeyval: "{{ masterkeyfilecontent.split('\n').2.split('|').2 | trim }}"
```
And then I want to set the environment variable per host:
```yaml
CONFLUENT_KEY: "{{ masterkeyval }}"
```

```yaml
- debug:
    var: masterkeyval
```
Can that be done? How can I define my task / ansible script that will allow me to achieve this?
Thank you
Solution 1: local facts
This solution is IMO the easiest one, but it requires placing a file on each target server.
Let's imagine you put the following executable file in `/etc/ansible/facts.d/kafka.fact` on each target. This is only a dummy example; adapt it to your exact needs. I'm using `jq` to output a proper JSON string. You can `echo` directly if you trust that the key content will not cause problems, and you can use any other executable you like (python, ruby, perl...) as long as it outputs a JSON structure.

Once this is done, you can see that the facts are available for the given host. I'll only demonstrate on localhost here, but this will work with any host having this local facts script.
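A sketch of what such a `kafka.fact` script could look like, assuming the same `masterkey.txt` layout as in the question (key in the third `|`-separated field of the third line); this variant emits the JSON directly rather than through `jq`:

```shell
#!/bin/sh
# /etc/ansible/facts.d/kafka.fact (must be executable)
# Extract the key from the secrets file: 3rd line, 3rd '|'-separated field,
# matching the jinja2 expression in the question. Emit it as a JSON fact.
# Pipe through jq instead if the key may contain characters needing JSON escaping.
key=$(sed -n '3p' /etc/kafka/secrets/masterkey.txt | cut -d'|' -f3 | tr -d '[:space:]')
printf '{"confluent_key": "%s"}\n' "$key"
```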
You are now ready to use this fact wherever you need it. Note that you must of course gather facts for this value to be available (local facts are returned under `ansible_local` during fact gathering). In your case, we can test with a playbook along those lines.
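A sketch of what such a test playbook could look like, assuming the local fact file is named `kafka.fact` and returns a `confluent_key` key (both names are illustrative):

```yaml
- name: Test local facts
  hosts: localhost

  # Local facts land under ansible_local.<fact file name> after fact gathering
  environment:
    CONFLUENT_KEY: "{{ ansible_local.kafka.confluent_key }}"

  tasks:
    - name: Show the variable as seen by a task
      command: printenv CONFLUENT_KEY
      register: printed
      changed_when: false

    - debug:
        var: printed.stdout
```

Since `environment:` is set at play level here, every task in the play sees `CONFLUENT_KEY`, which is exactly the per-host, whole-playbook behavior asked for in the question.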
Solution 2: custom facts module
This solution is a bit more complex, but it probably has better management capabilities in the long term and does not require placing files on the target servers.
To be as straightforward as possible, I placed my demo facts module in a `library` folder adjacent to my playbook. Placing such a module inside a collection or a role is preferable for a production project but goes beyond this (already long) answer; for more on these subjects, refer to the Ansible documentation on developing modules and on using collections.

Create a `library/kafka_facts.py` file. You will have to adapt it to your exact situation. In this case, I decided that the key would be placed on a single line in `/tmp/keyfile.txt`, and I hardcoded this in my module. Note that the fact is not returned if the file does not exist. I added all documentation strings as advised in the module development documentation.

As in my previous solution, the following demo will be done on localhost only but will work on any target server.
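A sketch of what such a `library/kafka_facts.py` module could look like (the fact name `confluent_key` is an assumption, the hardcoded `/tmp/keyfile.txt` path follows the description above, and the documentation strings are abbreviated):

```python
#!/usr/bin/python
from ansible.module_utils.basic import AnsibleModule

DOCUMENTATION = r'''
---
module: kafka_facts
short_description: Return the kafka master key as an Ansible fact
description:
  - Reads /tmp/keyfile.txt and returns its content as the confluent_key fact.
'''

EXAMPLES = r'''
- name: Gather kafka facts
  kafka_facts:
'''

RETURN = r'''
ansible_facts:
  description: Facts to add to ansible_facts.
  returned: when /tmp/keyfile.txt exists
  type: dict
'''

KEY_FILE = '/tmp/keyfile.txt'


def main():
    module = AnsibleModule(argument_spec={}, supports_check_mode=True)
    facts = {}
    try:
        with open(KEY_FILE) as f:
            # The key is expected alone on a single line
            facts['confluent_key'] = f.read().strip()
    except FileNotFoundError:
        # The fact is simply not returned when the file does not exist
        pass
    # Anything returned under ansible_facts is merged into the host's facts
    module.exit_json(changed=False, ansible_facts=facts)


if __name__ == '__main__':
    main()
```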
First we create a key in the expected file
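For instance, with an arbitrary dummy value:

```shell
# Write a dummy key on a single line in the file the module reads
echo "myverysecretmasterkey" > /tmp/keyfile.txt
```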
And we can now call the module from a playbook and read back the returned fact.
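A sketch of such a demo playbook, assuming the module returns a `confluent_key` fact as described above:

```yaml
- name: Test kafka_facts module
  hosts: localhost
  gather_facts: false

  tasks:
    - name: Gather our custom facts
      kafka_facts:

    - name: Use the fact as an environment variable for a task
      command: printenv CONFLUENT_KEY
      environment:
        CONFLUENT_KEY: "{{ confluent_key }}"
      register: printed
      changed_when: false

    - debug:
        var: printed.stdout
```

Because the module returns its result under `ansible_facts`, the `confluent_key` fact is injected into the host's facts, so it can also back a play-level `environment:` exactly as in solution 1.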