How to create a single Dockerfile for multiple containers without using docker-compose.yml?


Here I have 2 cases:

Case 1: I created a Django project called Counterproj, which uses the default SQLite database. I dockerized this project with a Dockerfile and pushed it to Azure ACI, and it worked well in both cases, locally and in the cloud.

Case 2: The problem started when I migrated from SQLite to PostgreSQL. I dockerized the Django project using a Dockerfile plus a docker-compose.yml with two services, web and db. These work fine locally, but when I push to Azure ACI, the counterproj_web:test and postgres:test containers are not able to communicate with each other and the instance is terminated.

My question: can we containerize a Django project that uses PostgreSQL as its database with a single Dockerfile, without any docker-compose.yml? If so, please suggest the best way to do it so that it runs both locally and in the cloud.

Below are my Dockerfile, docker-compose.yml and database settings for your reference.

Dockerfile:

# syntax=docker/dockerfile:experimental

# Python base image
FROM python:3

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

# Set working directory
WORKDIR /app

# Upgrade pip before installing dependencies
RUN pip install --upgrade pip

# Copy the requirements file first so the dependency layer is cached
COPY ./requirements.txt /app/requirements.txt

# Install all the requirements to run the project,
# caching pip downloads between builds
RUN --mount=type=cache,target=/root/.cache/pip pip install -r requirements.txt

# Copy the rest of the project into the image
COPY . /app
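Note that the RUN --mount cache requires BuildKit, so the image has to be built with BuildKit enabled, for example (tag taken from the docker-compose.yml below):

DOCKER_BUILDKIT=1 docker build -t counterproj_web:test .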

docker-compose.yml:

version: '3.3'
services:
  db:
    image: postgres:test
    volumes:
      - ./db/:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=Counterproj
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=12345
         
  web:
    image: counterproj_web:test
    build: .
    command: python manage.py runserver 0.0.0.0:80
    ports:
      - "80:80"
    volumes:
      - .:/app    
    depends_on:
      - db

settings.py:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'Counterproj',
        'USER': 'postgres',
        'PASSWORD': '12345',
        'HOST': 'db',
        'PORT': 5432,
    }
}

There are 2 answers below.

Answer 1:

You should always use a docker-compose file. It makes it much easier to maintain all the parameters required to run your containers properly.

Otherwise, you have to pass those parameters directly when you run each container.
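For example, wiring up the two containers from the question by hand would look roughly like this (image names and credentials are taken from the compose file above; the user-defined network name is my own, and it is needed so the web container can reach the database by name):

# Create a network so the containers can resolve each other by name
docker network create counterproj-net

# Start the database with the same environment the compose file sets
docker run -d --name db --network counterproj-net \
  -e POSTGRES_DB=Counterproj \
  -e POSTGRES_USER=postgres \
  -e POSTGRES_PASSWORD=12345 \
  -v "$(pwd)/db:/var/lib/postgresql/data" \
  postgres:test

# Start the web container; 'db' resolves through the shared network
docker run -d --name web --network counterproj-net \
  -p 80:80 \
  counterproj_web:test \
  python manage.py runserver 0.0.0.0:80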

Answer 2:

One container, one purpose. Putting your DB and your application into the same container breaks this best practice. Always have two containers: one for the DB and one for your application. Using a docker-compose file will be enough for your containers to communicate.

A possible issue that I'm seeing is that you're mounting some local directories - are you sure the files exist at those paths on the remote machine?
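One way to keep a single image portable between local and cloud is to stop hard-coding the host 'db' in settings.py and read the connection details from environment variables instead. A minimal sketch (the DB_HOST and DB_PORT variable names are my own, not from the question; on Azure ACI, containers in the same container group can typically reach each other over localhost):

import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        # Defaults below match the values from the question
        'NAME': os.environ.get('POSTGRES_DB', 'Counterproj'),
        'USER': os.environ.get('POSTGRES_USER', 'postgres'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD', '12345'),
        # 'db' matches the compose service name; set DB_HOST=localhost
        # (or the appropriate address) when deploying to ACI
        'HOST': os.environ.get('DB_HOST', 'db'),
        'PORT': os.environ.get('DB_PORT', '5432'),
    }
}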