I've liked the idea of Docker since it first appeared, back when it wasn't as popular as it is now. For me it was a great replacement for Vagrant for local development. But I can't understand why people suggest using Docker on a production server.
To me it seems obvious that Docker is mostly for emulating the production server environment on the local machine, so that every developer involved in the project has the same environment as the production server. So why do people use Docker for ordinary things like PHP and Node.js on production servers?
Why? Are they buying servers at random without knowing which OS will be on them, or can't they install PHP with "apt install"? Does it make sense to run a virtual machine (Docker) inside another virtual machine (a VPS)? The funniest thing to me is when somebody gets an Ubuntu server and installs Docker with Ubuntu and mysql+php+nginx…
Could anyone explain some cases that I probably don't know about? What about the additional processor-time cost of Docker containers? Etc.
Thanks a lot!
2 Answers
The problem here is that you seem not to have seen IT at scale. Installing PHP is easy on a single server, but when you've got twenty thousand of them it becomes problematic. And updating them all to keep them in sync… Of course, we don't often need 20k PHP servers (in fact, we don't even need one, but we can save that for another rant). We have a couple of thousand Java servers, and a couple of thousand running Python, some more with other proprietary software (perhaps Exchange, Active Directory, maybe SharePoint), others running security software, logging and monitoring, and so on. Keeping all of this up to date and working is a full-time job for teams of people. And that's without even going into the dev/test/prod categories and trying to keep those environments in sync.
How do we manage access? How do we keep machines in sync if anyone can log on and install anything? How do we keep track of what's on each machine? How do we patch all machines with, say, PHP on them? Do we even know which machines have PHP on them?
I'll talk about Docker, but this applies to containers in general rather than just Docker specifically.
The industry has evolved from manual installs, through bespoke scripts and more formal scripting mechanisms, towards a declarative approach built around containers. The key benefit of containers is standardisation. Now that we have a standard interface for all software operations, we can build standardised tools to manage it. That's what Kubernetes is: a standardised tool for scheduling (deploying and running) software.
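To make "standard interface" concrete, here is a rough sketch using public Docker Hub images; the point is that the same handful of commands manages completely different pieces of software:

```sh
# The same verbs work regardless of what is inside the image:
docker pull php:8.2-apache      # a PHP + Apache runtime
docker pull redis:7             # a completely different piece of software
docker run -d --name web php:8.2-apache
docker run -d --name cache redis:7
docker ps                       # one standard way to see what is running
docker logs cache               # one standard way to read logs
docker stop web cache           # one standard way to stop things
```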
Once we had a standardised model for deployment (the container) we could start to automate the operations of all containers in the same way and we could start to refactor ‘platform’ behaviour out of our applications and into the platform where it belongs. Logging, monitoring, security, encryption, circuit-breaking and other error-handling, events, etc. are now all provided by the platform so that your app can focus on business logic. This enables the capability we call microservices and brings in an era of truly distributed computing.
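As a small sketch of what "provided by the platform" can look like even with plain Docker (Kubernetes goes much further): the restart policy, health check, resource limits and log capture below are declared to the platform rather than coded into the application. The image and health command are just examples:

```sh
# Restart policy, health check, resource limits and log capture are handled
# by the platform, not by the application code.
# (The health check assumes curl exists in the image.)
docker run -d --name web \
  --restart=unless-stopped \
  --memory=256m --cpus=0.5 \
  --health-cmd='curl -f http://localhost/ || exit 1' \
  --health-interval=30s \
  php:8.2-apache

docker logs web                 # stdout/stderr captured by the platform
docker stats --no-stream web    # resource usage exposed the same way for every app
```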
Overall, the use of Docker on the desktop is trivial compared to its benefits on servers. Not that the impact on developers is trivial: I run Ubuntu sometimes and Windows sometimes, but my development environment doesn't change between the two because all my work is in Git and my tools are in containers in my Docker Hub account. I have a consistent work environment because of containers. And that's the feature you're referencing in your question.
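For example, from a POSIX shell the project's tools can be run out of a container instead of being installed on the host (the image and command here are just an illustration):

```sh
# The host OS stops mattering: the same toolchain runs identically
# on Ubuntu and on Windows (via Docker Desktop / WSL2).
docker run --rm -it \
  -v "$PWD":/work -w /work \
  node:20 \
  npm test
```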
Your question misses an important point, though. What makes you think that having PHP installed in a container on your desktop makes it the same as the server you're using for prod? They are literally different environments. Overnight, an engineer could have logged onto that server and patched the PHP version, so you're now out of date with it. They may have had good security-related reasons for doing so, but you're still out of sync.
The solution to this is to run the same container in prod that you're using on your desktop. If the way it's patched is for the ops guy to update the container definition and push it to the company repo, which rebuilds the image and pushes it to the company Docker registry, then all you have to do to stay in sync with the server is pull the latest container, which Docker will do for you automatically in most environments. Now imagine that process across, say, a thousand servers running hundreds of different pieces of software (application servers, monitors, log shippers, security agents, and even the operating system) that must all be patched consistently across all machines, and you're starting to get an idea of why we put Docker on the server too.
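A minimal sketch of that sync step, assuming a hypothetical company registry at registry.example.com and a :latest tag (in practice many teams pin a version tag or digest instead):

```sh
# Whatever ops patched overnight, this gets you the same bits prod is running:
docker pull registry.example.com/team/php-app:latest
docker run -d --name php-app -p 8080:80 registry.example.com/team/php-app:latest
```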
'Software Engineer' answered the question while I was writing this, and probably answered it better than I did, but here are a few reasons that I can think of off the top of my head:
docker run ...
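For instance, a single (entirely hypothetical) command like the one below pulls the image if it isn't present, isolates the process, wires up networking, injects configuration and sets a restart policy in one go:

```sh
docker run -d --name api \
  -p 8080:3000 \
  --env-file ./prod.env \
  --restart=unless-stopped \
  registry.example.com/team/node-api:1.4.2
```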
There are even more benefits when you deploy to an orchestrator like Kubernetes, e.g.
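To give a flavour of what the orchestrator adds, here is a sketch using standard kubectl commands; the deployment name, image and registry are hypothetical:

```sh
# Declare the desired state and let Kubernetes keep it true:
kubectl create deployment php-app --image=registry.example.com/team/php-app:1.0 --replicas=3
kubectl expose deployment php-app --port=80 --target-port=8080      # load-balanced service
kubectl set image deployment/php-app php-app=registry.example.com/team/php-app:1.1   # rolling update
kubectl rollout status deployment/php-app
kubectl scale deployment php-app --replicas=10                      # scale out with one command
```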
Let me know if you want me to go into any of these in more depth…