I created a MERN application a while ago and deployed it on a local server with the pm2 package, running the API and the React app as separate services. Since I wanted to dockerize everything, I created a Dockerfile for the React app and for the API, and then created this docker-compose.yaml:

version: "3.9"

services:
  mongo:
    image: mongo:latest
    ports:
      - 27017:27017
    volumes:
      - ./mongo-db:/var/lib/mongo/data
    networks:
      - project-network
  api:
    container_name: project_api
    restart: unless-stopped
    image: project_api:1.0.0
    build:
      context: back-end
      dockerfile: Dockerfile
    ports:
      - 4001:4001
    networks:
      - project-network
    depends_on:
      - mongo
  client:
    container_name: project_client
    restart: unless-stopped
    image: project_client:1.0.0
    build:
      context: front-end
      dockerfile: Dockerfile
    ports:
      - 3000:3000
    networks:
      - project-network
    depends_on:
      - api

networks:
  project-network:

Everything works fine and there are no errors when running this. The problem is that my local mongodb already holds a large amount of collected data that is of course not present in the dockerized application (since mongodb now runs as its own container service). How can I get my data into the application? I see 3 options:

  1. Somehow copy the data into the container
  2. Run the mongodb service outside of the compose setup
  3. Manually re-enter all the data (the dataset is very large, so this is not a good option for me)

Each option raises some questions. If the best option is 1, how can I get the data into the container? And am I going to lose all the data when I re-create some part of the application and run docker compose build again?

If the best option is 2, how can I access my local mongodb from inside the containers (mongodb://localhost:27017/db_name)? Probably with a shared network?
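
I imagine something like the following in docker-compose.yaml, although I am not sure it is the right approach (pointing the containers at the host through host.docker.internal instead of localhost):

  api:
    extra_hosts:
      # my guess: host-gateway resolves to the host's IP (Docker 20.10+)
      - "host.docker.internal:host-gateway"

and then connecting with mongodb://host.docker.internal:27017/db_name instead of mongodb://localhost:27017/db_name.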

I would appreciate any advice on how to proceed in this situation, where I decided to dockerize the application some time after it was already in use and had already collected a large amount of data.

3 Answers


  1. Do a mongodump of the local database and restore it into the dockerized service.

    How to do this is explained in the MongoDB documentation on backup and restore.

    When restoring the database into your dockerized version, it is a good idea to start only the mongo service and bring up the other services only after a successful restore.
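
    For the compose setup in the question, that could look something like this (a sketch, not a definitive recipe; db_name and ./dump are placeholders for your actual database name and a scratch directory, and docker compose cp requires Compose v2):

      # dump the local instance (mongod still running on the host)
      mongodump --db db_name --out ./dump
      # bring up only the mongo service
      docker compose up -d mongo
      # copy the dump into the container and restore it there
      docker compose cp ./dump mongo:/tmp/dump
      docker compose exec mongo mongorestore /tmp/dump
      # only now start the api and client services
      docker compose up -d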

  2. You can follow these steps to restore data into a dockerized MongoDB.

    Taking a dump from the local MongoDB (by default, mongodump writes into a ./dump directory; --collection is optional if you want the whole database):

    • mongodump --db database_name --collection collection_name

    To restore the dumped data into the docker container:

    1. Copy dumped files to the docker container

      • docker cp dump_files_path container_id:/tmp/.
    2. Opening a bash shell in the docker container

      • docker exec -it container_id bash
    3. Restoring the data to MongoDB in the docker container from that shell

      • mongorestore --db database_name /tmp/dumped_files

    This will restore the data to the MongoDB instance running in the docker container.

    Note:

    If the MongoDB running in the docker container already contains some of the same data as your local dump, those documents will be skipped during the restore and the rest will be copied.
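
    If you instead want the dump to fully replace whatever is already in the container, mongorestore's --drop flag drops each collection from the target database before restoring it (same placeholder names as above):

      • mongorestore --drop --db database_name /tmp/dumped_files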

  3. Using mongodump is a good solution. You can also just copy the raw data files. After shutting down your existing Mongo instance, copy all of the files from its data directory (/data/db by default, or /var/lib/mongodb for a typical Ubuntu package install) into your volume directory, ./mongo-db.

    Then when you start the container, your data should be there. One caveat: I believe your volume definition is wrong and should be - ./mongo-db:/data/db, since /data/db is where the mongo image keeps its data. Once that's fixed, you will likely need to destroy and recreate the container (rather than just restarting it).
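
    In practice that might look like the following (a sketch; the host data directory depends on your install, and you may need to adjust file ownership afterwards so the container's mongodb user can read the files):

      # stop the host instance so the data files are in a consistent state
      sudo systemctl stop mongod
      # copy the raw data files into the bind-mounted directory
      sudo cp -a /var/lib/mongodb/. ./mongo-db/
      # recreate the container with the corrected ./mongo-db:/data/db mount
      docker compose up -d --force-recreate mongo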
