Can’t run multiple Python scripts in a single container

I have multiple Python scripts that I want to run in a single Docker container. From the related question How to run multiple Python scripts and an executable files using Docker?, I found that the best way to do that is to have a shell file, run.sh, as follows:
#!/bin/bash
python3 producer.py &
python3 consumer.py &
python3 test_conn.py
and then call this file from the Dockerfile:
FROM python:3.9
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY requirements.txt /usr/src/app
RUN pip install --no-cache-dir -r requirements.txt
COPY . /usr/src/app
CMD ["./run.sh"]
However, the container logs show the following error: exec ./run.sh: no such file or directory. This makes no sense to me, since I copied everything in the current directory, run.sh included, to /usr/src/app in the container via COPY . /usr/src/app
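As a sanity check, one could list the app directory inside the built image (the image name here is an assumption; substitute whatever docker-compose actually built):
docker run --rm myapp ls -l /usr/src/app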
Please clone my repo, run docker-compose up -d in the root directory, and check the myapp container logs to help me.
https://github.com/Quilograma/IES_Project
Thank you!
4 Answers
For those encountering the same problem, adding
RUN sed -i -e 's/\r$//' run.sh
to the Dockerfile before CMD ["bash", "-c", "./run.sh"]
is what worked for me. See Bash script – "/bin/bash^M: bad interpreter: No such file or directory" for further details.
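For context, a sketch of where that line sits in the question's Dockerfile (assuming the same layout as above):
COPY . /usr/src/app
RUN sed -i -e 's/\r$//' run.sh
CMD ["bash", "-c", "./run.sh"]
The sed expression strips Windows carriage returns. With a CRLF-encoded script, the kernel looks for an interpreter literally named /bin/bash\r, which is what produces the misleading "no such file or directory" error.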
You should explicitly specify which shell interpreter should be used to run your script. Changing the last line to
CMD ["bash", "-c", "./run.sh"]
might solve your issue.

If you need to run three separate long-running processes, do not try to orchestrate them from a shell script. Instead, launch three separate containers. If you're running this via Compose, this is straightforward: have three containers all running the same image, but override the
command:
to run different main processes (see the sketch below).
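A minimal docker-compose.yml sketch of that layout (service names here are illustrative, not taken from the question's repo):
services:
  producer:
    build: .
    command: ["python3", "producer.py"]
  consumer:
    build: .
    command: ["python3", "consumer.py"]
  test_conn:
    build: .
    command: ["python3", "test_conn.py"]
All three services share one image; only the main process differs, so each gets its own logs, restart policy, and lifecycle.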
Make sure the scripts are executable (run
chmod +x producer.py
on your host system, and commit that to source control) and begin with a "shebang" line
#!/usr/bin/env python3
as the very first line.

You need to chmod run.sh to be executable:
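chmod +x run.sh
Run that on the host and commit the permission bit, or (presumably equivalent) add a RUN chmod +x run.sh step to the Dockerfile before the CMD line.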