At work we use Docker containers to run our scripts on a custom-made Linux machine with a touch-screen display. I'm the QA engineer there, and I want to create automation tests that perform virtual touch clicks.
My problem is that the Linux machine has only basic packages installed, and installing new ones is really tricky. Also, to avoid messing with the OS itself, I want to run my automation script inside a Docker container as well. That would also give me the opportunity to experiment with different packages.
I want to use pyautogui to simulate inputs on the touch screen (I am aware that there is a reason why that is not as easy as it sounds). At the moment I am stuck at giving the Docker container access to the screen. When I run a simple test script, I get these two errors:
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/Xlib/support/unix_connect.py", line 76, in get_socket
    s.connect('/tmp/.X11-unix/X%d' % dno)
ConnectionRefusedError: [Errno 111] Connection refused
Xlib.error.DisplayConnectionError: Can't connect to display ":0": [Errno 111] Connection refused
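Before digging into X11 internals, two quick checks inside the container can narrow the error down. A sketch assuming a POSIX shell; the socket path matches what the compose file below mounts:

```shell
# Quick checks inside the container when X connections are refused.
echo "DISPLAY=$DISPLAY"

# ":0.0" -> "0": drop the screen part, then the leading colon
num=$(echo "${DISPLAY:-:0}" | cut -d. -f1 | cut -d: -f2)

# The host's X socket must actually be mounted into the container:
if [ -S "/tmp/.X11-unix/X${num}" ]; then
    echo "X socket X${num} is mounted"
else
    echo "X socket X${num} missing: check the /tmp/.X11-unix volume and xhost on the host"
fi
```

If the socket is there but the connection is still refused, the problem is usually authorization (xhost/xauth) rather than the mount.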
So I know that the problem lies in the X11 restrictions. I tried out different solutions I found online, but none of them worked. My next step would be to learn more about X11 and how it works, but that could take some time.
Maybe some of you have a solution or a better way to do this. I apologize if I have confused some of the names related to these topics.
My Dockerfile looks like this:
FROM ubuntu:latest AS notebook
WORKDIR /automation_test
COPY /src /automation_test/src
RUN apt-get update
RUN apt-get -y install python3 python3-pip
RUN pip install --upgrade pip
RUN pip install -r src/requirements.txt
RUN apt-get -y install wget
RUN wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | apt-key add - \
    && echo "deb http://dl.google.com/linux/chrome/deb/ stable main" >> /etc/apt/sources.list.d/google.list
RUN apt-get update && apt-get -y install google-chrome-stable
RUN apt-get -y install x11-apps
RUN apt-get -y install xauth
RUN apt-get -y install openbox
RUN touch /root/.Xauthority   # .Xauthority must be a file, not a directory
# Note: each RUN starts a fresh shell, so none of the variables below survive
# into the next instruction (or into the running container):
RUN HOST=cctv
RUN DISPLAY_NUMBER=$(echo $DISPLAY | cut -d. -f1 | cut -d: -f2)
RUN AUTH_COOKIE=$(xauth list | grep "$(hostname)/unix:${DISPLAY_NUMBER} " | awk '{print $3}')
RUN xauth add ${HOST}/unix:${DISPLAY_NUMBER} MIT-MAGIC-COOKIE-1 ${AUTH_COOKIE}
...
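The four trailing RUN lines each start a fresh shell at build time, so HOST, DISPLAY_NUMBER and AUTH_COOKIE are lost immediately. A minimal entrypoint sketch that does the same work at container start-up instead; the cctv hostname is carried over from the question, and the AUTH_COOKIE environment variable is an assumption (it would have to be passed in from the host):

```shell
#!/bin/sh
# entrypoint.sh -- sketch: resolve the display and register the X cookie
# at container start-up instead of at image build time.

HOST="${HOST:-cctv}"            # assumed X server hostname, as in the Dockerfile
DISPLAY="${DISPLAY:-:0}"        # normally passed in by docker-compose

# ":0.0" -> "0": drop the screen part, then the leading colon
DISPLAY_NUMBER=$(echo "$DISPLAY" | cut -d. -f1 | cut -d: -f2)

# Register the cookie only if xauth exists and a cookie was handed in, e.g.
#   docker compose run -e AUTH_COOKIE="$(xauth list :0 | awk '{print $3}')" notebook
if command -v xauth >/dev/null 2>&1 && [ -n "$AUTH_COOKIE" ]; then
    touch "$HOME/.Xauthority"   # .Xauthority is a file, not a directory
    xauth add "${HOST}/unix:${DISPLAY_NUMBER}" MIT-MAGIC-COOKIE-1 "$AUTH_COOKIE"
fi

exec "$@"                       # hand over to the container's command
```

Saved as src/entrypoint.sh, this could replace the four RUN lines with a COPY plus `ENTRYPOINT ["/automation_test/src/entrypoint.sh"]`.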
and my docker-compose.yaml like this:
version: '3.6'

x-base: &base
  ipc: host
  pid: host
  privileged: true
  network_mode: host
  volumes:
    - ./src/results:/automation_test/src/results
    - ./src/tmp:/automation_test/src/tmp
    - /tmp/.X11-unix:/tmp/.X11-unix
  environment:
    - DISPLAY=${DISPLAY}
  extra_hosts:
    - "host.docker.internal:host-gateway"

services:
  notebook:
    <<: *base
    image: notebook
    container_name: notebook
    build:
      context: .
      target: notebook
    command: ["/bin/bash"]
Disclaimer:
Most of the stuff in these files consists of solutions I tried out that did not work, which is why it looks kind of ugly and messy. I am using an Ubuntu base image for the moment; in the future I would like to use a plain Python image as a base.
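For completeness, a hypothetical launch sequence on the host for the compose file above. Note that `xhost +local:` is a blunt instrument that trusts every local user, so revoking it afterwards is advisable:

```shell
# On the host (the machine that owns the X server / touch screen):
xhost +local:                     # allow local Unix-socket clients, e.g. the container
docker compose build notebook
docker compose run --rm notebook  # DISPLAY and /tmp/.X11-unix come from the x-base anchor

# Afterwards, revoke the blanket permission again:
xhost -local:
```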
2 Answers
So I have come to the conclusion that this project of mine is too complicated to solve for now. I also noticed that it would not have worked in the first place, because the Linux machine on which I wanted to run the repository does not even have X11 (I could have checked that in the first place, honestly). A coworker gave me a better approach to this topic: he found this repository, which works directly with the kernel itself, and this.
I think this is a much better solution than trying to get X11 running in a Docker container.
Anyway, thanks for the help and your time. I appreciate it.
You can do X forwarding like this:
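The command snippet appears to be missing from this answer; a minimal sketch of what such X forwarding usually looks like, assuming the host runs a local X server and the image was built as `notebook`:

```shell
# On the host: allow local Unix-socket clients (such as containers)
# to talk to the X server.
xhost +local:

# Run the container against the host's display.
docker run --rm -it \
  -e DISPLAY="$DISPLAY" \
  -v /tmp/.X11-unix:/tmp/.X11-unix \
  notebook xeyes                  # xeyes (from x11-apps) is a handy smoke test
```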
Once this works, you can integrate it in your docker-compose workflow.