I have two Docker containers. The first one runs pypiserver and serves a package I've created. The second one is my Flask app, which should install that package from the pypiserver. I build the containers with docker-compose, then go into the app container and install the package manually; that works fine. However, when I try to install the package in the Dockerfile, while the app image is being built, it does not work.
This is my docker-compose.yaml file:
    version: '3.9'
    services:
      test-pypiserver:
        image: pypiserver/pypiserver:latest
        ports:
          - 8090:8080
        volumes:
          - ./pypiserver/packages:/data/packages
        networks:
          - test-version-2-network
      test-flask:
        build: ./dashboard/.
        container_name: test-flask
        ports:
          - 5000:5000
        volumes:
          - ./dashboard:/code
        depends_on:
          - test-pypiserver
        networks:
          - test-version-2-network
This is my Dockerfile for my flask app:
    FROM python
    WORKDIR /code
    ENV FLASK_APP=app.py
    ENV FLASK_RUN_HOST=0.0.0.0
    ENV FLASK_RUN_PORT=5000
    ENV FLASK_DEBUG=1
    COPY requirements.txt requirements.txt
    RUN pip install -r requirements.txt
    RUN pip install --trusted-host test-pypiserver --extra-index-url http://test-pypiserver:8080 osh
    COPY . .
    EXPOSE 5000
    CMD [ "flask", "run" ]
When I comment out this line in the Dockerfile

    pip install --trusted-host test-pypiserver --extra-index-url http://test-pypiserver:8080 osh

and run it inside the running app container instead, it works properly.
Is there any way to make this work? Or what is the proper way to install my package?
The `docker-compose up` command first builds the images and only then, after all of them are built, starts the containers. While your Flask image is being built, the pypiserver container is not running yet (and the compose network is not available to the build), so the package installation fails.
You can instead install the package when the container starts, at which point pypiserver is up and reachable by its service name.
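One way to sketch that is a small wrapper script that runs the install at startup and then hands off to Flask. The file name `entrypoint.sh` is an assumption, not something from your setup:

```shell
#!/bin/sh
# entrypoint.sh -- hypothetical wrapper for the test-flask service.
# At container start the compose network exists and test-pypiserver is
# resolvable, so the install that fails at build time succeeds here.
set -e

# Install the private package from the pypiserver service.
pip install --trusted-host test-pypiserver \
    --extra-index-url http://test-pypiserver:8080 osh

# Hand off to the original command so flask replaces this shell as PID 1.
exec flask run
```

In the Dockerfile you would then drop the failing `RUN pip install ... osh` line, add `COPY entrypoint.sh .`, and change the last line to `CMD [ "sh", "entrypoint.sh" ]`. Note that `depends_on` only orders container startup, not readiness, so if the install ever races pypiserver's startup you may want a short retry loop around the `pip install`.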