Docker for local web development, introduction: why should you care?
Last updated: 2021-07-11 :: Published: 2020-03-04 :: [ history ]
In this series
- Introduction: why should you care? ⬅️ you are here
- Part 1: a basic LEMP stack
- Part 2: put your images on a diet
- Part 3: a three-tier architecture with frameworks
- Part 4: smoothing things out with Bash
- Part 5: HTTPS all the things
- Part 6: expose a local container to the Internet
- Part 7: using a multi-stage build to introduce a worker
- Part 8: scheduled tasks
- Conclusion: where to go from here
Subscribe to email alerts at the end of this article or follow me on Twitter to be informed of new publications.
Who this series is for
Among developers, exposure to Docker ranges from having vaguely heard of the technology to using it on a daily basis, the latter category singing its praises while the former is sometimes still struggling with the sheer concept of containers.
Wherever you are on your journey, as a developer there are many reasons why you might want to delve into this technology, including, but not limited to:
"I read a few things about Docker and I am now looking for something more hands-on"
"I use a Vagrant-based solution like Homestead and it starts to show its limits"
"I already use a turnkey solution like Laradock and I want to understand how things work under the hood"
"I want greater control over my development environment"
"I want to better understand application architecture"
"My current project uses microservices"
Whatever your motive is, you might notice the emergence of a theme, revolving around understanding and being in control. In my opinion, Docker is about empowering the developer, from providing a safe environment to try out technologies easily – without messing up your local environment – to opening the gates to the world of Ops.
The aim of this tutorial series is to demystify a technology that can feel daunting at times, all the while providing most of the tools a developer might need to build a web project locally with Docker. It strictly focuses on the development environment and does not cover deploying Docker applications whatsoever, even though the end result can provide the foundation for a deployable setup.
No prior knowledge of Docker is required, but being comfortable around a terminal will help, as well as being familiar with Git over SSH.
We will start our journey with a basic LEMP stack and gradually increase the environment's complexity as we go, covering more and more use cases along the way. While the focus is mostly on PHP on the server side, the overall principles can be applied to any language.
You might find yourself stopping at a certain point because all your needs have been covered already, only to come back later when you need more. Wherever that is, each part of the series comes with its own Git branch, which you can fork or download and use as a starting point for your own projects.
That's it for the elevator pitch. Convinced already? Great, let's move on to the first part. If not, let's try to address some of the doubts you may still have, starting with a little background check.
"Why should I listen to you?"
I am a backend developer with more than a decade of experience, having worked with various types of companies in three different countries. I witnessed the evolution of the development environment landscape over time, from the early days of using WAMP and Notepad++ to migrating to full-fledged IDEs and custom virtual machines, before moving on to Vagrant and eventually, Homestead.
Back in 2015, I explored the possibility of using Docker locally as a development environment and wrote about it in a couple of articles that were picked up by the Docker newsletter at the time, but eventually moved away from it mostly because of performance issues.
I was forced to revisit the idea at the beginning of 2018, however, as Vagrant started to show its limits on a project I was working on back then (more on that in the next section). As I looked into Docker again, I realised progress had been made both in terms of performance and adoption.
After tinkering about for a few days, I came up with a decent development environment capable of handling the application's complexity and, as I soon realised, greatly simplifying the onboarding of new developers.
Since then, I successfully implemented and improved upon a similar setup in several companies.
"I already use Vagrant/Homestead"
Homestead is great. I used Laravel's pre-packaged virtual machine (VM) for years, both for personal and client projects. It features pretty much everything you need for a PHP application, and allows you to spin up new websites very quickly.
So why did I make the switch?
The short answer is microservices.
Back at the beginning of 2018, I was tasked with exposing an API from a legacy monolith to serve a new Single Page Application, and to progressively phase out the monolith by extracting all its business logic into microservices. Happy days.
All of a sudden, I had to manage some legacy code running on PHP 5.x on the one hand, and some microservices running on PHP 7.x on the other.
I initially got both versions of the language running on the same VM, but it involved some dirty workarounds that made the overall user experience terrible. Besides, I would eventually need to add other microservices with different stacks, and managing them all under the same VM wasn't a realistic endeavour.
I briefly tried to run separate Vagrant boxes for each microservice, but running the whole setup was far too heavy for my machine and managing things like VM-to-VM communication felt very cumbersome.
I needed something else, and that something else was Docker. But how does it help the situation?
One of the promises of Docker is to provide isolated environments (containers) running on a single virtual machine, starting in a few seconds. In my case, that meant replacing all of my heavy Vagrant virtual machines with a single, super-fast virtual machine featuring Docker, and running all my microservices on top of it, each in its own isolated container.
Using Docker's very own logo in an attempt to illustrate this, it would be like going from each crate requiring its own dedicated whale to carry it, to a single whale carrying all the crates, as shown in the picture.
Now revisit the previous sentence and replace "whale" with "virtual machine" and "crate" with "microservice" - you should get the idea.
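To make the one-whale setup a little more concrete, here is a minimal sketch of what it could look like with Docker Compose. The service names, image tags and ports are hypothetical, and this is not the configuration the series builds (that starts in Part 1); it simply shows two PHP versions coexisting as isolated containers on a single Docker host:

```yaml
# docker-compose.yml — hypothetical sketch, not the series' actual setup
version: '3'

services:
  # The legacy monolith, pinned to an old PHP version
  legacy:
    image: php:5.6-apache
    ports:
      - '8080:80'

  # A newer microservice, running a modern PHP version side by side
  api:
    image: php:7.4-apache
    ports:
      - '8081:80'
```

A single `docker-compose up` would start both containers in seconds, each with its own PHP runtime, on the same machine – no second virtual machine required.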
While being an overly simplistic explanation, this right there is what made it click for me. As a developer, this use case made perfect sense and impacted the way I worked on a day-to-day basis, way more than trying to understand how virtualisation or shared operating systems work.
Does that mean you should use Docker locally only if your application involves microservices?
The answer is yes and no. The more complex the application – the more moving parts it is composed of – the more likely you will need a solution like Docker.
But even if your application is rather simple, starting and stopping some containers is way faster than booting and halting a virtual machine, and should your application evolve in unexpected ways (as applications always do), adopting Docker from the get-go gives you the confidence that your setup is future-proof.
Besides, instead of using a pre-packaged virtual machine like Homestead, which features far more tools than any given application will ever use, using Docker the way I suggest ensures that you only ever install what you actually need, when you need it. You regain control of your environment.
"How about [insert favourite Docker-based solution]?"
There is already a number of Docker-based development environments out there, and it seems that a new one pops up every other week.
Laradock was arguably the first to get some traction, and is to Docker what Homestead is to Vagrant: a pre-packaged environment for PHP applications. While I don't personally use it, I have heard a lot of good things and it might just be enough for your needs. Their motto is "Use Docker First – Then Learn About It Later", which is an excellent approach in my opinion.
The purpose of this series, however, is to give you a deeper understanding of Docker so you can build your own setup, one that perfectly matches any given project's requirements. If all you want for the moment is to use a Docker-based environment without the hassle of setting it up yourself, by all means give a solution like Laradock a try, and feel free to come back whenever you want to create your own, tailored setup.
All in all I see this explosion of Docker-based environments as a good sign – a confirmation that this is the right tool for the job, one that will be embraced more widely in the future.
"Serverless is the way to go"
Some developers argue that serverless computing will eventually make containers obsolete; others counter that the comparison is moot because the two technologies serve completely different purposes.
As far as local development is concerned, the comparison is mostly irrelevant – serverless or not, you will need a local environment to build your projects, and whatever their specifications are Docker will fit the bill.
"Performance is terrible"
It used to be terrible. Now it's largely OK.
Performance on Linux was always fine, and since the release of WSL 2 it's fine on Windows, too.
That leaves us with macOS, where speed remains a pain point depending on the context, due to the poor performance of the underlying file-sharing implementation (gRPC-FUSE). There are ways to make it better, however, which are mentioned throughout this series.
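One commonly cited mitigation, for instance, is to keep heavy, write-intensive directories out of the shared file system altogether by masking them with a named volume, which lives inside Docker's VM rather than going through the slow host/guest file sharing. The sketch below assumes a PHP project whose `vendor/` directory is the hot spot (paths and service name are hypothetical):

```yaml
# Hypothetical fragment illustrating the named-volume trick on macOS
services:
  php:
    image: php:7.4-fpm
    volumes:
      - ./:/var/www/html               # bind mount: goes through slow file sharing
      - vendor:/var/www/html/vendor    # named volume: stays inside the VM, much faster

volumes:
  vendor:
```

The trade-off is that the masked directory is no longer directly visible on the host, which is usually acceptable for generated content like dependencies.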
While you're unlikely to get the same kind of speed as Homestead or something like Valet, it will be acceptable.
"It is not my job to deal with Docker"
Finally, some developers simply dismiss Docker as not being their problem. This is a fair point, as Docker sits somewhere beyond the realm of pure application development. If anything, a sysadmin or DevOps engineer should set it up for you, right?
While you are entitled to feel that way, DevOps not only refers to individuals with an interest in both system administration and coding, effectively acting as bridges between the two; it can also be viewed as people from both sides meeting halfway.
In all fairness, you don't need to know Docker to be a good developer. My point is simply that by ignoring it, you are missing out on an opportunity to get better.
It won't improve your syntax or teach you new design patterns, but it will help you understand how the code you write fits into the bigger picture.
It is a bit like playing an instrument without learning to read sheet music: you might sound good, but you are unlikely to write a symphony. Taking a step back to reflect on the application's architecture to understand how the different pieces fit together will give you invaluable perspective that will influence the technical choices you make for your application.
Whatever your specialty – backend, frontend or fullstack – and to whatever extent you feel concerned with the DevOps movement, I promise that learning about the cogs that make your applications tick is worth your time, and that you will be a better developer for it.
It's all about the journey
Ultimately, you may not fully embrace the final result of this series. Be it for performance reasons as mentioned earlier, or because you feel some aspects of the setup are too complicated or don't work for you, this development environment may not entirely satisfy you in its suggested form.
But that is not the true purpose of this series. While you can take my approach at face value, my main objective is to give you enough confidence to integrate Docker into your development workflow in any way you see fit – to provide enough practical knowledge to adapt it to your own use cases.
And I can guarantee that by the end of this series, you will have plenty of ideas as to how to use Docker for your own benefit.