
Hello community!

I developed a few services in Python that are supposed to communicate via RabbitMQ. The Python scripts look correct. Even after I moved them to Docker containers, they continue to work (the 'Waiting for messages' line is printed by the payload script):

grafalex@debian:~/Work/partitioner/worker$ docker run -it --network partitioner_pocnetwork worker:latest
 [*] Waiting for messages. To exit press CTRL+C

The issue is that once I run the same image through docker-compose, I can no longer see any logs. Sometimes part of the log appears in the console, but only a part.

grafalex@debian:~/Work/partitioner/worker$ docker-compose up worker1
Starting partitioner_worker1_1 ... done
Attaching to partitioner_worker1_1

Here is the relevant docker-compose.yml snippet:

version: "3.9"
services:
  worker1:
    image: worker:latest
    restart: always
    networks:
      - pocnetwork
...
networks:
  pocnetwork:

What could be the problem with the missing logs? Could it be some kind of stdout buffering (and if so, how can I fix it)?

2 Answers


  1. Chosen as BEST ANSWER

    The problem is not related to networking; it is buffered output. As @DavidMaze pointed out in the comments, the solution is well described in Python app does not print anything when running detached in docker.

    I just had to add the '-u' flag to the Python interpreter to make the output unbuffered, so that it appears in the logs immediately.
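
    For example, assuming the container runs a script named worker.py (a hypothetical name; adjust to the actual entrypoint), the Dockerfile command can pass the flag directly, or the same effect can be achieved without rebuilding the image by setting the PYTHONUNBUFFERED environment variable in docker-compose:

        # Dockerfile: run the interpreter with -u so stdout/stderr are unbuffered
        # ("worker.py" is a placeholder for the actual script)
        CMD ["python", "-u", "worker.py"]

        # docker-compose.yml: equivalent fix via environment variable
        worker1:
          image: worker:latest
          environment:
            - PYTHONUNBUFFERED=1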


  2. Since you seem to be using a pre-existing Docker network, the equivalent of your docker run command in docker-compose should look like this:

      worker1:
        image: worker:latest
        restart: always
        networks:
          - pocnetwork
      networks:
        pocnetwork:
          external: true
          name: partitioner_pocnetwork
    

    You need to declare the external network first. Does this solve your problem?
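
    Note that an external network must already exist before docker-compose up is run. If it was not created by another compose project, a minimal sketch, assuming the network name from the question:

        # create the external network if it does not exist yet
        docker network create partitioner_pocnetwork
        # confirm it is present
        docker network ls | grep partitioner_pocnetwork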
