I’m new to Docker and Node.js and I’m trying to set up a Node.js application using Docker. I prefer using Docker over nvm to manage my Node.js version. Here’s the issue I’m facing:

Question

How can I properly set up my Node.js application with Docker so that the node_modules folder is created and used inside the container without it being stored locally?

Thank you for your help!

What I Tried:

  1. Initial Dockerfile Setup:

    • Created a basic Dockerfile with the following content:
      FROM node:14
      USER node
      WORKDIR /app
      
    • Built and ran the container using:
      docker build -t my-node-app .
      docker run -v ${PWD}:/app -it my-node-app
      
    • Opened a shell in the running container:
      docker exec -it <container_id> /bin/sh
      
    • Ran:
      npm init --yes && npm i mongoose
      

      This created a node_modules folder both in the container and in my local directory.
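
      This is exactly what a bind mount does: writes under /app inside the container land directly in the host directory, and vice versa. For instance (the file name here is purely illustrative), anything created in the container shows up locally:

      # Inside the container shell from the previous step:
      touch /app/created-in-container.txt

      # Back on the host, the file is in the project directory:
      ls created-in-container.txt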

  2. Modified Dockerfile:

    • Updated the Dockerfile to:
      FROM node:14
      WORKDIR /app
      COPY package*.json ./
      RUN npm install
      COPY . .
      EXPOSE 3000
      CMD ["node", "index.js"]
      
    • Rebuilt and ran the container. This setup worked, but the node_modules folder was still present in my local directory due to the volume mapping.
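
      Worth noting: because this Dockerfile runs COPY . ., a leftover host node_modules would also be sent to the image build context; a .dockerignore along these lines is the usual safeguard (a minimal sketch):

      # .dockerignore: keep host-side artifacts out of the build context
      node_modules
      npm-debug.log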
  3. Alternative Approach:

    • Wrote a package.json file directly with the required dependencies.
    • Created a standard Dockerfile:
      FROM node:14
      WORKDIR /app
      COPY package*.json ./
      RUN npm install
      COPY . .
      EXPOSE 3000
      CMD ["node", "index.js"]
      
    • Rebuilt and ran the container with:
      docker build -t my-node-app .
      docker run -v ${PWD}:/app -it my-node-app
      
    • This time, the node_modules folder was not created in my local directory, but it also wasn’t available in the container. I suspect this is due to mounting the volume with -v ${PWD}:/app, which hides the node_modules that npm install created inside the image during the build.
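
      A quick way to confirm this (nothing here beyond the standard docker CLI) is to list /app with and without the bind mount:

      # Without the mount, the node_modules baked in at build time is visible:
      docker run --rm my-node-app ls /app

      # With the mount, the host directory shadows the image's /app:
      docker run --rm -v ${PWD}:/app my-node-app ls /app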

What I Expected:

I expected the node_modules folder to be created and used inside the Docker container without being present in my local directory. This way, I can manage dependencies within the container, keeping my local workspace clean. I am looking for a solution to achieve this.

2 Answers


  1. You are right: node_modules should not be stored in your repository, just like build outputs in other languages:

    • /bin (DLLs) for C#
    • /target (JARs) for Java

    Your approach #3 is correct; just don’t mount the volume.

    Keep it as clean as possible

    Your Dockerfile could be as simple as this:

    FROM node:22
    WORKDIR /opt/
    COPY . /opt/
    RUN npm install
    RUN npm run build
    EXPOSE 8080
    ENTRYPOINT ["npm","run","start"]
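
    To build and run it (a minimal sketch; the image tag and the host side of the port mapping are arbitrary):

    docker build -t my-node-app .
    docker run --rm -p 8080:8080 my-node-app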
    

    Don’t mount anything

    You can do whatever you need on your localhost, but for real Node.js applications on servers your container should be portable and disposable, so the container’s disk should not be used as a data source (JSON, XML, data files) or as a file repository (CSVs, reports, PDFs, logs, etc.); see the sketch after this list.

    • If you need to store files, use a specialized service like AWS S3, Azure Blob Storage, or GCP Cloud Storage.
    • For data you have plenty of engines (MySQL, Postgres, etc.).
    • For logs: AWS CloudWatch, GCP Cloud Logging (formerly Stackdriver), etc.
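
    For example (DATABASE_URL and S3_BUCKET are hypothetical names used purely for illustration), configuration and state are injected at run time instead of living on the container’s disk:

    docker run --rm \
      -e DATABASE_URL="postgres://user:pass@db-host:5432/mydb" \
      -e S3_BUCKET="my-report-bucket" \
      -p 8080:8080 \
      my-node-app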

  2. You’re launching your container with:

    docker run -v ${PWD}:/app -it my-node-app
    

    This means that the bind mount replaces the entire contents of the /app folder in the container with whatever is in your current working directory on the host.

    That makes it impossible to map just the source files (e.g. /app/app.js) while leaving the node_modules folder (/app/node_modules) unmapped, because node_modules sits inside the very folder you have mounted.

    Restructure your folders

    Instead, I suggest restructuring your folders so that the source files you want to share between host and container live in their own folder, and then mounting only that source folder.

    This should be fine, because modifying package.json requires a new npm install inside the container anyway, whereas changes to source .js files can be picked up on the fly, especially if you use nodemon to restart automatically on changes (see the sketch at the end of this answer).

    Old structure:

    │   app.js
    │   package-lock.json
    │   package.json
    │
    └───node_modules
        │   .package-lock.json
    

    New structure:

    │   package-lock.json
    │   package.json
    │
    ├───node_modules
    │       .package-lock.json
    │
    └───src
            app.js
    

    Launch like this:

    docker run -v ${PWD}/src:/app/src -it my-node-app
    

    Of course, you will have to change your main property in your package.json to reflect the new location of the source files. For example:

    From:

    "main": "app.js"
    

    To:

    "main": "src/app.js"
    
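    If you go the nodemon route mentioned above, a minimal sketch of the relevant package.json fields (the exact nodemon version is an assumption):

    {
      "main": "src/app.js",
      "scripts": {
        "dev": "nodemon src/app.js"
      },
      "devDependencies": {
        "nodemon": "^3.0.0"
      }
    }

    Since npm install inside the image also installs devDependencies by default, you can override the container command to start the dev loop:

    docker run -v ${PWD}/src:/app/src -it my-node-app npm run dev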