I am running a Docker container locally but it is unable to connect to my AWS Services like Secrets Manager or DynamoDB.
When you are developing an application locally that connects to AWS services like Secrets Manager or DynamoDB, the application can use your local AWS credentials to authenticate with AWS and access those services.
However, if you want to containerise the application and run it in Docker locally, the application won't be able to access the AWS credentials on your machine, because it runs inside the container and has no access to files outside the container.
The easy but very bad solution is to pass your AWS credentials into the container as environment variables. This has all kinds of security implications, such as leaking long-lived credentials through logs or committed compose files, and is bad practice.
The better solution is to vend the credentials into the container at runtime. You can do this by running the application locally with docker-compose, together with a docker-compose.override file provided by AWS. The override runs a service that acts as a proxy between your local PC and the application running in the Docker container, vending credentials into your application's container.
We will assume that you have created a Dockerfile for your application that containerises your application.
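As an illustration, a minimal Dockerfile for a Python application might look like the sketch below. The base image, file names, and entry point are assumptions; adjust them for your own application.

```dockerfile
# Example only: assumes a Python app with an app.py entry point.
FROM python:3.12-slim

WORKDIR /app

# Copy and install dependencies first to take advantage of layer caching.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the application source.
COPY . .

CMD ["python", "app.py"]
```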
Then create a docker-compose.yml file that will start your application.
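A minimal docker-compose.yml might look like this. The service name "app" and the port mapping are assumptions for illustration; use whatever matches your application.

```yaml
version: "3.8"

services:
  # "app" is a placeholder service name; the override file below must
  # reference the same service name.
  app:
    build: .
    ports:
      - "8080:8080"
```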
Then create a docker-compose.override.yml file, which will vend the credentials to your container at runtime.
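The sketch below is based on the docker-compose override published by AWS for its amazon-ecs-local-container-endpoints container, which vends your local credentials over the link-local address that the AWS SDKs already know to query. The service name "app", the AWS profile, and the region are assumptions; adapt them to your setup.

```yaml
version: "3.8"

# A dedicated network on the 169.254.170.0/24 subnet, where the AWS SDKs
# expect to find the container credentials endpoint.
networks:
  credentials_network:
    driver: bridge
    ipam:
      config:
        - subnet: "169.254.170.0/24"
          gateway: 169.254.170.1

services:
  # The credential-vending proxy provided by AWS.
  ecs-local-endpoints:
    image: amazon/amazon-ecs-local-container-endpoints
    volumes:
      - /var/run:/var/run
      # Mount your local AWS config/credentials into the proxy container.
      - $HOME/.aws/:/home/.aws/
    environment:
      HOME: "/home"
      AWS_PROFILE: "default"   # assumption: the profile you want to use
    networks:
      credentials_network:
        ipv4_address: "169.254.170.2"

  # "app" must match the service name in your docker-compose.yml.
  app:
    depends_on:
      - ecs-local-endpoints
    networks:
      credentials_network:
        ipv4_address: "169.254.170.3"
    environment:
      AWS_DEFAULT_REGION: "eu-west-2"   # set the correct region here
      # Tells the AWS SDKs to fetch credentials from the proxy container.
      AWS_CONTAINER_CREDENTIALS_RELATIVE_URI: "/creds"
```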
Remember to set the correct AWS region in the docker-compose.override.yml file.
Then simply run your application using the terminal command:
docker-compose up --build
This will start both your application container and the credential vendor service.
Your application will now be able to access your AWS services from within the Docker container.
This is a really useful setup for local container development.
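No code changes are needed in the application itself: the AWS SDKs' default credential chain checks the AWS_CONTAINER_CREDENTIALS_RELATIVE_URI environment variable set by the override file. As a sketch, assuming a Python application using boto3 and a hypothetical secret named "my-app/secret":

```python
import boto3

# boto3's default credential chain automatically picks up the credentials
# vended via AWS_CONTAINER_CREDENTIALS_RELATIVE_URI; no keys are configured here.
client = boto3.client("secretsmanager", region_name="eu-west-2")

# "my-app/secret" is a placeholder secret name for illustration.
response = client.get_secret_value(SecretId="my-app/secret")
print(response["SecretString"])
```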