
I have a Node.js application that uses ioredis to connect to Redis, publish data, and do other Redis-y things.

I am trying to write a component test against Redis. I was able to create a Jest setup/teardown script that runs Redis via Docker on a random port (docker run -d -p 6379 --rm redis) and tears it down when the tests are done (docker stop {containerId}).

This works great locally, but we have the tests running in a multi-stage build in our Dockerfile:
RUN yarn test

When I build it via docker build ., everything goes fine until it gets to the tests, and then it fails with the following error: /bin/sh: docker: not found

So Docker is unavailable inside the docker build process, which is why the tests can't run?

Is there a way to give docker build the ability to spin up sibling containers during the build?

2 Answers


  1. This smells to me like a “docker-in-docker” situation.

    You can’t run a full nested Docker daemon without extra privileges, but you can control the host’s daemon from inside a container (the containers you start then actually run as siblings on the host), by doing some tricks (you might need to do some googling to get it right):

    • install the Docker CLI binaries in the “host container”
    • mount the Docker socket from the actual host inside the “host” container, like so: docker run -v /var/run/docker.sock:/var/run/docker.sock ...

    But you won’t be able to do this in the build step — there is no socket to mount while docker build is running — so it won’t be easy for your case.

    I suggest you prepare a dedicated build container capable of starting containers, which would basically emulate your local env, and use that in your CI. Still, you might need to refactor your process a bit to make it work.
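    One way to set up such a build image (a sketch; base image and package names are assumptions) is to ship the Docker CLI inside it and move yarn test out of the build step:

```dockerfile
# Hypothetical build image: ships the Docker CLI and talks to the host's daemon.
FROM node:18
# Install the docker CLI only; the daemon stays on the host.
RUN apt-get update && apt-get install -y docker.io && rm -rf /var/lib/apt/lists/*
WORKDIR /app
COPY package.json yarn.lock ./
RUN yarn install --frozen-lockfile
COPY . .
# Note: no `RUN yarn test` here -- the daemon is not reachable at build time.
```

    You would then run the tests as a container step, e.g. docker run -v /var/run/docker.sock:/var/run/docker.sock your-build-image yarn test, so the Redis containers the tests start become siblings on the host.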

    Good luck 🙂

  2. In my experience, tests shouldn’t be concerned with initializing the database, only with how to connect to it, so you just pass your DB connection details via environment variables.

    The way you are doing it won’t scale: imagine that you need many more services for your application — it will be difficult and impractical to start them all from the tests.

    When you are developing locally, it’s your responsibility to have the services running before doing the tests.

    You can have docker compose scripts in your repository that create and start all the services you need when you start developing.
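    For example, a minimal compose file along these lines (service names and image tags are assumptions) that developers bring up before running the tests:

```yaml
# Hypothetical docker-compose.yml for local development dependencies
services:
  redis:
    image: redis:7
    ports:
      - "6379:6379"
```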

    And when you are using CI in the cloud, you would still use Docker containers and run the tests in them (a Node container with your tests, a Redis container, a MySQL container, etc.), and again just pass the appropriate connection details via environment variables.
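    A sketch of that approach (variable names are illustrative): the test code only builds a connection config from the environment, and whoever runs the tests is responsible for having Redis up and setting the variables:

```javascript
// Build an ioredis-style connection config from environment variables,
// falling back to local defaults for development.
function redisConfigFromEnv(env) {
  return {
    host: env.REDIS_HOST || '127.0.0.1',
    port: Number(env.REDIS_PORT || 6379),
  };
}

// In a test file, with the ioredis package the question already uses:
// const Redis = require('ioredis');
// const redis = new Redis(redisConfigFromEnv(process.env));
```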
