I have a PrestaShop database running in a Docker container that I intend to back up. The thing is, I need to put the e-commerce store into maintenance mode before I run the backup. Ideally, I wanted to run the script from a remote machine – having it log in via SSH and then interact with the container – but it seems I may first need to log in via SSH and then run the script on the server itself (since each command starts a new shell, etc.).
These are the parts of the script that I have so far, but I'm just not sure how to put them together so that the commands are strung along once inside mysql:
#!/bin/bash
# Log in to docker@debVM server via SSH
ssh docker@debVM
# Run the following Docker Container
docker exec -it ps1-7-mariadb-1 /bin/bash
# Login to the database bitnami_prestashop
mysql -u bn_prestashop -pbitnamiUSER bitnami_prestashop
# Run update - store into maintenance mode
UPDATE ps_configuration SET value='0' WHERE name='PS_SHOP_ENABLE';
#Exit the mysql and docker exec
Exit
#run the backup-volumes-container.sh script
echo running the backup-volumes-container.sh script
cd
cd scripts/
./backup-volumes-container.sh ps1-7-mariadb-1
./backup-volumes-container.sh ps1-7-prestashop-1
echo Backup-volumes complete
Hope someone can help – thanks!
2 Answers
If your goal is to run a command on the OS inside your Docker container, check out docker exec. You can install a mysql client in the Dockerfile for your container, and then just use it via docker exec like you would on a host machine.
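For the maintenance-mode step in the question, that could look something like the following – a sketch using the container name and credentials from the question, built as an array and echoed so it stays inert here (drop the echo to actually run it on the Docker host):

```shell
# Sketch: run the maintenance-mode UPDATE non-interactively via
# docker exec + mysql -e, with no interactive shells in between.
# Container name and credentials are taken from the question.
sql="UPDATE ps_configuration SET value='0' WHERE name='PS_SHOP_ENABLE';"
cmd=(docker exec ps1-7-mariadb-1
     mysql -u bn_prestashop -pbitnamiUSER bitnami_prestashop -e "$sql")
# Printed so this sketch is inert; replace the echo with "${cmd[@]}" to run it.
echo "${cmd[*]}"
```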
If you need to get data out of the container, you can map a log file or something to your container as a file volume and have your container write output to that file.
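For instance, a bind mount on container startup – the paths and image name here are placeholders, not from the question:

```shell
# Sketch: bind-mount a host directory into the container so anything the
# container writes under /backups is visible on the host.
# /srv/backups and my-image are hypothetical placeholders.
vol=(docker run -d -v /srv/backups:/backups my-image)
# Printed so this sketch is inert; replace the echo with "${vol[@]}" to run it.
echo "${vol[*]}"
```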
There’s an example of what I’m talking about here.
A shell script is a sequence of shell commands, which can be a different thing from what you're typing in the terminal. Your script runs ssh; then when that completes, it (locally) runs docker exec; then when that completes, it (locally, outside a container) runs mysql; and so on. You can in principle provide an input to a command using pipe syntax, but this gets complicated as you step down into several layers.

Some commands have a way to provide their inner command as a command-line argument. For example, mysql -e 'SELECT ...' will run a single SQL call, or docker exec container-name command arg1 will run a single command.

But: I think you can run this completely locally, without any nested commands or docker exec at all. The one bit of setup you do need to do is to make the database accessible from outside the container. When you start the container, make sure you publish a port.
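With docker run, that might look like the following sketch – the image name and other options are placeholders (keep whatever you already use to start the container, and add the -p option):

```shell
# Sketch: publish the container's MariaDB port on the host's loopback only.
# 12345 is an arbitrary free host port; 3306 is MariaDB's standard port.
# Image name and the rest of the command line are placeholders.
run=(docker run -d --name ps1-7-mariadb-1
     -p 127.0.0.1:12345:3306 bitnami/mariadb)
# Printed so this sketch is inert; replace the echo with "${run[@]}" to run it.
echo "${run[*]}"
```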
If you use Compose, its ports: option has the same syntax. If you remove the 127.0.0.1 at the front, the container will be fully network-accessible, which I'm guessing you don't want. The 12345 can be any port number not otherwise in use, but the last number must be 3306 to match the standard database port.

Having done this, on the target system you can now access the database on host 127.0.0.1, port 12345,
directly from the host, without using Docker tools. (Remember that IP address is very context-specific, and it means something different on the local system, the remote system, and in the container.)

You can similarly use ssh to set up a port forward.
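Such a forward might look like this sketch – docker@debVM is the host from the question, and -N keeps the session open for forwarding only, without a remote shell:

```shell
# Sketch: forward local port 23456 to port 12345 on the docker@debVM host.
# 23456 is an arbitrary free local port; 12345 is the published port on the
# server from the docker run / Compose setup.
fwd=(ssh -N -L 23456:127.0.0.1:12345 docker@debVM)
# Printed so this sketch is inert; replace the echo with "${fwd[@]}" to run it.
echo "${fwd[*]}"
```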
23456 is any otherwise-unused local port, and 12345 matches the 12345 from above. Now port 23456 on the local system forwards to port 12345 on the remote system, which forwards to port 3306 in the container.

With this setup, you can now directly run mysql locally, and similarly run your backup script locally, pointing at the port-forwarded database connection.
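Pulling the local end together, a sketch of that connection – credentials from the question, and it assumes the ssh port forward is running:

```shell
# Sketch: with the ssh port forward up, connect from the local machine.
# -h 127.0.0.1 -P 23456 goes through the forward to the container's MariaDB.
mysql_cmd=(mysql -h 127.0.0.1 -P 23456
           -u bn_prestashop -pbitnamiUSER bitnami_prestashop
           -e "UPDATE ps_configuration SET value='0' WHERE name='PS_SHOP_ENABLE';")
# Printed so this sketch is inert; replace the echo with "${mysql_cmd[@]}" to run it.
echo "${mysql_cmd[*]}"
```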