I am running docker compose up,
which starts multiple containers, one of which runs Python 3.*, and all of the containers have volumes attached to them.
I have also already created a requirements.txt file.
I entered the Python container, installed some packages, and then ran
pip freeze > requirements.txt
I then stopped and restarted the containers, but the Python container didn't start and the log says module x is not found.
So what I did was delete the container and create a new one, and that worked.
My question is: is there any way to avoid deleting the container (I think that's overkill)
but somehow still be able to manage installing packages in the container?
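Roughly the sequence of commands I used was the following (the service name web is just a placeholder for my Python service, and <packages> stands for whatever I installed):

docker compose exec web bash      # enter the running Python container
pip install <packages>            # install the packages inside it
pip freeze > requirements.txt     # write the installed packages out
exit
docker compose stop               # stop the containers
docker compose up -d              # bring them back up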
Dockerfile
FROM python:3.6
RUN apt-get update
RUN apt-get install -y gettext
RUN mkdir -p /var/www/server
COPY src/requirements.txt /var/www/server/
WORKDIR /var/www/server
RUN pip install -r ./requirements.txt
EXPOSE 8100
ENTRYPOINT sleep 3 && python manage.py migrate && python manage.py runserver 0.0.0.0:8100
2 Answers
You should copy your project source files into the image during the build and, as part of that build, run
pip install -r requirements.txt
Below is an example to give you an idea:
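A minimal sketch of what such a Dockerfile could look like, assuming the layout from the question (source under src/, project served from /var/www/server); the exact paths and the run command are illustrative:

FROM python:3.6
RUN apt-get update && apt-get install -y gettext
WORKDIR /var/www/server
# install the dependencies first so this layer is cached between builds
COPY src/requirements.txt .
RUN pip install -r requirements.txt
# now copy the rest of the project source into the image
COPY src/ .
EXPOSE 8100
CMD python manage.py migrate && python manage.py runserver 0.0.0.0:8100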
Finally, you run docker-compose build <service> to build the service defined in docker-compose.yml, with its build context pointing to the Dockerfile.
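For example, assuming the service is named web and the Dockerfile sits next to docker-compose.yml (names are illustrative, not taken from the question):

services:
  web:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "8100:8100"

You can then rebuild and recreate just that service with:

docker-compose build web
docker-compose up -d web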
Broadly, set up your Dockerfile so that the work that changes least often and costs the most time comes first; Docker caches each layer and only rebuilds from the first instruction whose inputs changed.
As @coldly says in their answer, write your dependencies into a requirements file and install them during the image build!
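In practice that means the day-to-day way to add a package is to edit the requirements file and rebuild, rather than installing inside the running container. A sketch of that workflow, assuming the service is named web and using a hypothetical package pin:

# add the new dependency to the requirements file
echo "requests==2.27.1" >> src/requirements.txt
# rebuild: only the layers from the COPY of requirements.txt onward are rebuilt,
# the apt-get layers above it come from the cache
docker-compose build web
# recreate the container from the freshly built image
docker-compose up -d web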