Airflow 1.10.10 remote S3 logs


I am trying to enable Remote Airflow logs, to do that I followed these steps:

Install apache-airflow:

pip install apache-airflow[crypto,postgres,ssh,s3,log]==1.10.10

Airflow.cfg file:

remote_logging = True
remote_base_log_folder = s3://bucket-name/logs
encrypt_s3_logs = False
logging_level = INFO
fab_logging_level = WARN
remote_log_conn_id = MyS3Conn
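One thing worth checking is which section these keys live in: in Airflow 1.10.x the remote logging settings are read from `[core]` (they moved to `[logging]` only in 2.x). A minimal sketch, using a hypothetical `airflow.cfg` fragment, of verifying the values Airflow will actually parse at startup:

```python
import configparser

# Hypothetical airflow.cfg fragment mirroring the settings above;
# in Airflow 1.10.x these keys belong under [core].
cfg_text = """
[core]
remote_logging = True
remote_base_log_folder = s3://bucket-name/logs
encrypt_s3_logs = False
logging_level = INFO
fab_logging_level = WARN
remote_log_conn_id = MyS3Conn
"""

parser = configparser.ConfigParser()
parser.read_string(cfg_text)

# Sanity-check the values the scheduler/workers will pick up.
assert parser.getboolean("core", "remote_logging") is True
assert parser.get("core", "remote_base_log_folder").startswith("s3://")
print(parser.get("core", "remote_log_conn_id"))
```

If `remote_logging` were accidentally placed in another section, Airflow would silently keep writing local logs only.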

I have Airflow running in a Docker container in an AWS ECS Blue/Green deployment. I read that if Airflow is hosted on an EC2 server, you should create the connection leaving every field blank in the configuration except the connection type, which should stay as S3.

The S3Hook will default to boto, and boto will default to the role of the EC2 server you are running Airflow on. Assuming this role has rights to S3, your task will be able to access the bucket.
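One way to define such a "blank" connection without touching the UI (an assumption on my part that this fits your deployment) is Airflow's `AIRFLOW_CONN_<CONN_ID>` environment-variable convention: a bare scheme-only URI like `s3://` yields an S3-type connection with no credentials, so the hook falls through to boto's default credential chain. A sketch of how such a URI parses:

```python
from urllib.parse import urlparse

# Airflow reads connections from env vars named AIRFLOW_CONN_<CONN_ID>,
# e.g. export AIRFLOW_CONN_MYS3CONN='s3://'. A scheme-only URI leaves
# login/password/host blank, so the S3Hook falls back to boto's default
# credential chain (instance profile or ECS task role).
conn_uri = "s3://"

parsed = urlparse(conn_uri)
assert parsed.scheme == "s3"    # becomes the connection type
assert parsed.username is None  # no access key embedded in the URI
assert parsed.hostname is None  # no host/endpoint override
print("blank S3 connection, type:", parsed.scheme)
```

The env-var route is convenient in Docker because the connection travels with the task definition instead of living in the metadata database.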

So I applied this, but I don't know whether it works as intended when running under Docker.

If I run a DAG, I see the logs that are created in the /usr/local/airflow/logs folder inside the Docker container, but no new files appear in the specified folder in S3.
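One difference between plain EC2 and ECS that could explain this: inside an ECS task, boto3 resolves the task role through a credentials endpoint that the ECS agent advertises via an environment variable, not through the EC2 instance-metadata role. A quick check you could run inside the container (a diagnostic sketch, not a fix) to see whether the task-role endpoint is actually visible to the Airflow process:

```python
import os

# On ECS, boto3 resolves the task IAM role through this env var, which
# the ECS agent injects into each container of the task. If it is
# missing, boto3 falls back to other providers (env keys, instance
# profile), which may not carry the S3 permissions you expect.
cred_uri = os.environ.get("AWS_CONTAINER_CREDENTIALS_RELATIVE_URI")
if cred_uri:
    print("task role credentials endpoint:", cred_uri)
else:
    print("no ECS task-role endpoint; boto3 will try other providers")
```

If the variable is set for your shell but not for the worker process (for example because the entrypoint sanitizes the environment), the S3Hook would fail to obtain credentials and silently leave only local logs.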
