Set awslogs log driver on docker container


I want to send my container logs to AWS CloudWatch.

I'm not able to set the AWS credentials in Docker Desktop for Mac.

Docker version: 19.03.5
macOS version: 10.14.6

I have created a ~/.aws/credentials file with the AWS credentials:

[default]
aws_access_key_id = XXXXXXX
aws_secret_access_key = XXXXXXXX
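
(For reference, the same file can be generated with the AWS CLI, which prompts for the keys and writes this INI format:)

aws configure
# prompts for the access key id, secret access key, default region and
# output format, then writes ~/.aws/credentials and ~/.aws/config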

I tried to run the container as follows:

docker run --name flask -v ${HOME}/.aws/:/root/.aws/:ro -d --log-driver=awslogs --log-opt awslogs-region=XXXX --log-opt awslogs-group=XXXX --log-opt awslogs-create-group=true flask-image

The policy configured for the IAM user is:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "logs:CreateLogStream",
                "logs:PutLogEvents"
            ],
            "Effect": "Allow",
            "Resource": "*"
        }
    ]
}
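
(A side note: since the run command above passes awslogs-create-group=true, the policy presumably also needs logs:CreateLogGroup — though the error below is about missing credentials, not permissions. The Action list would then be:)

"Action": [
    "logs:CreateLogGroup",
    "logs:CreateLogStream",
    "logs:PutLogEvents"
]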

But I got the following error:

docker: Error response from daemon: failed to initialize logging driver: failed to create Cloudwatch log stream: NoCredentialProviders: no valid providers in chain. Deprecated.
    For verbose messaging see aws.Config.CredentialsChainVerboseErrors.

I have also added the AWS logging options to the Docker daemon configuration file (daemon.json):

{
    "log-driver": "awslogs",
    "log-opts": {
        "awslogs-region": "xxxx",
        "awslogs-group": "xxxxx",
        "awslogs-stream": "xxxxx"
    }
}

From investigation I found that the AWS credentials need to be available to the Docker daemon itself; having them on the Docker host is not enough. As you can see, I tried to do this with a volume mount, but with no success.
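
As far as I understand, the reason is that the awslogs driver runs inside the daemon, and on Docker Desktop for Mac the daemon runs in a Linux VM rather than on the macOS host, so a volume mounted into the container never reaches it. A quick check that the daemon is not the macOS host:

docker info --format '{{.OperatingSystem}}'
# prints something like "Docker Desktop", i.e. the VM, not macOS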

I'm running Docker on my Mac. Does anyone have an idea how to solve this issue?


There are 3 answers below.


I am having this very same issue myself.

As you suggested, it is not enough to provide the AWS credentials to the client. As per the Docker documentation:

You must provide AWS credentials to the Docker daemon

Now, running Docker on a Mac there are 3 different environments:

  • macOS environment (needless to say, credentials are not being exported from here)
  • VM environment (Oracle VirtualBox running boot2docker)
  • Container environment (I tried mounting .aws/credentials here, but it did not work)

I will try mounting the credentials in the VM environment to see if that helps.

If I am not wrong, the Docker daemon runs in the VM environment, so making the AWS creds available there could potentially fix it.
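
A sketch of what that could look like with a docker-machine/boot2docker VM (the machine name "default" and the profile path are the boot2docker conventions — treat this as an untested assumption):

# SSH into the VM that actually runs the Docker daemon
docker-machine ssh default

# Inside the VM: boot2docker sources /var/lib/boot2docker/profile when the
# daemon starts, so export the credentials there
sudo sh -c 'cat >> /var/lib/boot2docker/profile <<EOF
export AWS_ACCESS_KEY_ID=XXXXXXX
export AWS_SECRET_ACCESS_KEY=XXXXXXXX
EOF'

# Restart the daemon so it picks up the new environment
sudo /etc/init.d/docker restart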

UPDATE:

A working temporary fix with a 3rd-party library: https://github.com/nearform/docker-cloudwatch/blob/master/index.js


You can try to add an environment variable to the docker.service file pointing to your credentials file.

Invoke this command to find the docker.service file:

# systemctl status docker.service
 docker.service - Docker Application Container Engine
   Loaded: loaded (/lib/systemd/system/docker.service; disabled; vendor preset: enabled)

Edit the /lib/systemd/system/docker.service file.

In the [Service] section add:

[Service]
Environment=AWS_SHARED_CREDENTIALS_FILE=<path_to_aws_credentials_file>
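
A cleaner variant of the same idea is a systemd drop-in override, which survives package upgrades (the same Environment= line goes into the override):

# Create an override instead of editing the vendor unit file; add the
# [Service] / Environment= lines in the editor that opens
sudo systemctl edit docker.service

# Reload systemd and restart the daemon so it picks up the variable
sudo systemctl daemon-reload
sudo systemctl restart docker.service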

It looks like you are missing the -t option in your docker run command. Look at this: pushing logs to CloudWatch.
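
Applied to the original command, that would be (just the suggestion above, everything else unchanged):

docker run --name flask -t -v ${HOME}/.aws/:/root/.aws/:ro -d --log-driver=awslogs --log-opt awslogs-region=XXXX --log-opt awslogs-group=XXXX --log-opt awslogs-create-group=true flask-image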