We want to use the same (staging) database for our project within our team, so we can develop on our local machines with DDEV and share the database without having to export it and import it on another team member's machine. The DDEV project is also stored in a Git repo, so each team member can pull the newest settings of the DDEV container onto their local machine.
I can access the external database if I modify AdditionalConfiguration.php, but when the DDEV container is restarted it overwrites the AdditionalConfiguration.php file. I haven't found a way to set the database connection within the DDEV container config so that, when the container starts, it writes the correct database connection (to the external database) into AdditionalConfiguration.php.
Does anyone know how to set a custom database connection in a DDEV container?
Thanks for the help.
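For reference, the AdditionalConfiguration.php I end up with looks roughly like this; host, port, database name and credentials are placeholders for our staging server:

<?php
// Point TYPO3 at the shared staging database instead of the local DDEV db container.
$GLOBALS['TYPO3_CONF_VARS']['DB']['Connections']['Default'] = array_merge(
    $GLOBALS['TYPO3_CONF_VARS']['DB']['Connections']['Default'] ?? [],
    [
        'driver'   => 'mysqli',
        'host'     => 'staging-db.example.com',
        'port'     => 3306,
        'dbname'   => 'project_staging',
        'user'     => 'project',
        'password' => 'secret',
    ]
);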
My Environment and TYPO3 Version:
- TYPO3 v10.4.20
- Windows 10 (WSL)
- Docker Desktop 3.5.2
- DDEV-Local version v1.17.7
- architecture amd64
- db drud/ddev-dbserver-mariadb-10.3:v1.17.7
- dba phpmyadmin:5
- ddev-ssh-agent drud/ddev-ssh-agent:v1.17.0
- docker 20.10.7
- docker-compose 1.29.2
- os linux
- router drud/ddev-router:v1.17.6
- web drud/ddev-webserver:v1.17.7
2 Answers
There are many ways to manage your settings files in DDEV, and you can tell DDEV not to manage your settings files at all. Set
disable_settings_management: true
in your project config.yaml; then you and you alone are responsible for the settings files. Alternatively, you can remove the #ddev-generated line from the settings file and check it in or whatever; then DDEV won't touch it. For more detail and nuance, see the docs on this subject: https://ddev.readthedocs.io/en/latest/users/topics/cms-specific-help/
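As a minimal sketch, assuming the standard project layout, the setting goes into the project-level config file:

# .ddev/config.yaml
disable_settings_management: true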
I think you could also override the settings in your TYPO3 setup after the inclusion/execution of AdditionalConfiguration.php.
It's possible, as Randy Fay said, but I would avoid a shared database for development instances.
Say team member A creates a new extension with some DB tables and spends some time entering test data.
Then team member B runs a database schema upgrade on his instance; since his codebase doesn't contain that extension yet, he might destroy a lot of that work.
And there are a lot of similar cases where local configuration and database go hand in hand.
Our team has established the following process:
Code runs from dev to prod.
And data runs from prod to dev.
So a script creates a nightly DB & fileadmin backup and then builds a developer snapshot from it by scrubbing sensitive and cache data (a rough sketch of such a job is shown below).
This development dump is put on an SFTP server.
The latest dump is then imported nightly into pre-production,
once a week into QA (so the QA team has a week to accept a change if it requires creating records such as new pages),
and on demand into our DDEV instances. In practice we developers only update the DB every one or two months, because most content changes are irrelevant to us.
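A rough sketch of such a nightly job, with placeholder hosts, paths and table names (the scrub statements are only examples; the real script is specific to the project):

#!/usr/bin/env bash
set -euo pipefail

# Nightly backup of database and fileadmin (names and paths are placeholders).
mysqldump --single-transaction typo3_prod > /backup/typo3_prod.sql
tar czf /backup/fileadmin.tar.gz -C /var/www/html fileadmin

# Build the developer snapshot: append scrub statements that run after the data
# is imported, truncating log data and anonymizing user records.
cp /backup/typo3_prod.sql /backup/typo3_dev.sql
cat >> /backup/typo3_dev.sql <<'SQL'
TRUNCATE TABLE sys_log;
UPDATE fe_users SET email = CONCAT('user', uid, '@example.com'), password = '';
SQL
gzip -f /backup/typo3_dev.sql

# Publish the snapshot on the SFTP server.
scp /backup/typo3_dev.sql.gz backup@sftp.example.com:/dumps/latest.sql.gz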
We provided a custom ddev script which downloads the latest dump and imports it.
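Our script is tied to our infrastructure, but a DDEV host command along these lines (saved as .ddev/commands/host/pull-staging-db) does essentially the same thing; the SFTP host, credentials and path are placeholders, and curl needs to be built with SFTP support (plain sftp/scp works as well):

#!/usr/bin/env bash

## Description: Download the latest developer dump and import it into this project
## Usage: pull-staging-db

set -euo pipefail

DUMP="/tmp/latest.sql.gz"

# Fetch the newest scrubbed dump from the SFTP server.
curl -u "$SFTP_USER:$SFTP_PASS" -o "$DUMP" "sftp://sftp.example.com/dumps/latest.sql.gz"

# Import it into this project's DDEV database.
ddev import-db --src="$DUMP"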