Docker for local web development, introduction: why should you care?
Last updated: 2020-03-05 :: Published: 2020-03-04 :: [ history ]
In this series
- Introduction: why should you care? ⬅️ you are here
- Part 1: a basic LEMP stack
- Part 2: put your images on a diet
- Part 3: a three-tier architecture with frameworks
- Part 4: smoothing things out with Bash (coming soon)
- Part 5.1: HTTPS all the things with a self-signed certificate (coming soon)
- Part 5.2: HTTPS all the things with a trusted Certificate Authority (coming soon)
- Part 6: expose a local container to the Internet (coming soon)
- Part 7: using a multi-stage build to introduce a worker (coming soon)
- Conclusion: where to go from here (coming soon)
- Appendix A: Docker on Windows
Subscribe to email alerts at the end of this article or follow me on Twitter to be informed of new publications.
Who this series is for
The first version of Docker was released in 2013 and, since then, it has worked its way up to eventually becoming the industry standard for containers. Among developers, exposure to Docker ranges from having vaguely heard of the technology to using it on a daily basis, the latter category singing its praises while the former is sometimes still struggling with the sheer concept of containers.
Wherever you are on your journey, as a developer there are many reasons why you might want to delve into this technology, including, but not limited to:
"I've read quite a bit about Docker and I am now looking for something more hands-on";
"I use a Vagrant-based solution like Homestead and it starts to feel like it's getting in the way";
"I already use a pre-configured Docker environment like Laradock and I want a deeper understanding of how things work under the hood";
"I want greater control over my development environment";
"I want to better understand application architecture";
"The project I am working on uses microservices";
"I am supposed to migrate a WordPress website and I need a distraction".
Whatever your motive, you might notice a theme emerging, one revolving around understanding and being in control: in my opinion, Docker is about empowering the developer, from providing a safe environment in which to try out technologies without messing up your local machine (yes, even WordPress), to opening the gates to the world of Ops.
The aim of this tutorial series is to demystify a technology that can feel daunting at times, all the while providing most of the tools a developer might need to build a web project locally with Docker. It strictly focuses on the development environment and does not cover deploying Docker applications whatsoever, even though the end result can be used as part of a deployment process.
No prior knowledge of Docker is required, but being comfortable around a terminal will help.
We will start our journey with a very basic LEMP stack and gradually increase the environment's complexity as we work our way through the different parts of the series, covering more and more use cases along the way. While the focus is mostly on PHP on the server side, the overall architecture can be adapted to any language.
You might find yourself stopping at a certain point because all of your needs have been addressed already, only to come back later when you need more. Wherever that is, each part of the series comes with its own branch from this repository, which you can fork or download and use as a starting point for your own projects.
That's it for the elevator pitch. Convinced already? Great, let's move on to the first part (or head over here if you intend to run Docker on Windows). If not, let's try to address some of the doubts you might still have, starting with a little background check.
"Why should I listen to you?"
I am a backend developer with more than a decade of experience, having worked with various types of companies in three different countries. I witnessed the evolution of the development environment landscape over time, from the early days of using WAMP and Notepad++ to migrating to full-fledged IDEs and custom virtual machines, before moving on to Vagrant and eventually, Homestead.
Back in 2015, I explored the possibility of using Docker locally as a development environment and wrote about it in a couple of articles that were picked up by the official Docker newsletter, but eventually dropped the idea, mainly because of performance issues.
At the beginning of 2018, however, I was forced to revisit that thought because of Vagrant's shortcomings vis-à-vis the growing complexity of the project I was working on at the time (more on that in the next section), and realised progress had been made both in terms of performance and adoption.
After a few days of tinkering and successive iterations, I ended up with a decent development environment running on Docker, capable of meeting the application's new scope and greatly simplifying the onboarding of new developers.
Since then, I have successfully implemented and improved upon a similar setup at another company.
While the approach taken in this series is certainly not the only one and is most likely perfectible, it has proven to be reliable and a significant improvement compared to previous Vagrant-based setups.
"I am already using Vagrant/Homestead"
Homestead is great. I used Laravel's pre-packaged Vagrant virtual machine (VM) for years, both for personal and clients' projects (if you have no clue what Vagrant is, please take a quick look at this post first, especially this short section). Homestead features pretty much everything you need for a PHP application, and allows you to spin up new websites very quickly. So why did I make the switch?
The short answer is microservices.
Back at the beginning of 2018, I was tasked with exposing an API from a legacy monolith to serve a new SPA, and to progressively phase out the monolith by extracting all of its business logic into microservices.
All of a sudden, I had to manage some legacy code running on PHP 5.x on the one hand, and some microservices running on PHP 7.x on the other. I initially got both versions of the language running on the same VM but it involved some dirty workarounds that made the overall user experience terrible. Besides, I would eventually end up with multiple microservices with different stacks, and managing them all on the same VM wasn't a realistic long-term solution.
I briefly tried giving each microservice its own Vagrant box, but running everything together was far too heavy for my machine, and managing things like inter-VM communication felt very cumbersome.
I needed something else, and that something else was Docker. But how does it help the situation?
One of the promises made by Docker is to provide isolated environments (containers) running on a single virtual machine, which starts in about five seconds. In my case, that meant replacing all of my heavy, Vagrant-based virtual machines with a single, super-fast virtual machine featuring Docker, and running all of my microservices on top of it, each in its own isolated container.
Using Docker's very own logo in an attempt to illustrate this, it would be the equivalent of having a single whale per container instead of the same whale carrying all of the containers. If you replace "whale" with "virtual machine" and "container" with "microservice" in the previous sentence, you should get the idea.
Overly simplistic as that explanation is, this is what actually made it click for me: as a developer, this use case made perfect sense because it related to the work I do every day, way more than trying to understand how virtualisation or shared operating systems work.
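To make the single-whale-per-container idea a little more concrete, here is a purely illustrative sketch of what such a setup might look like with Docker Compose (a tool this series introduces later). The service names, image tags and directory paths are assumptions for the sake of the example, not part of any real project:

```yaml
# docker-compose.yml - hypothetical example: two isolated PHP containers
# running different language versions side by side on the same Docker host
version: "3.7"
services:
  legacy-monolith:
    image: php:5.6-fpm          # the legacy code stays on PHP 5.x
    volumes:
      - ./legacy:/var/www/html
  orders-service:
    image: php:7.4-fpm          # each microservice gets its own runtime
    volumes:
      - ./orders:/var/www/html
```

A single `docker-compose up` would start both containers in seconds, each isolated from the other — no dirty workarounds to cram two PHP versions into one VM.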
Does that mean you should only use Docker locally if your application involves microservices?
The answer is yes and no. The more complex the application and the more moving parts it is composed of, the more likely you are to need a solution like Docker. But even if your application is rather simple, starting and stopping a few containers is way faster than booting and halting a virtual machine, and should your application evolve in unexpected ways (as applications always do), adopting Docker from the get-go gives you the confidence that your setup is future-proof.
Besides, instead of using a pre-packaged virtual machine like Homestead, featuring way more tools than any given application has any use for, using Docker the way I suggest ensures that you only ever install what you actually need, in a proactive way. You regain control of your environment.
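To illustrate that "only install what you actually need" point, here is a minimal, hypothetical sketch of a Dockerfile for a plain PHP application; the extension list is an assumption and would vary from project to project:

```dockerfile
# Hypothetical Dockerfile: start from a minimal official image
# and add only what the application actually uses
FROM php:7.4-fpm-alpine

# Install just the extensions the project needs, nothing more
RUN docker-php-ext-install pdo_mysql opcache
```

Compare that with a pre-packaged VM shipping every database driver and tool under the sun: here, every line is something you chose.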
"How about Laradock?"
Laradock is to Docker what Homestead is to Vagrant: a pre-packaged environment for PHP applications. While I don't personally use it, I have heard a lot of good things and it might just be enough for your needs. Their motto is Use Docker First - Then Learn About It Later, which is an excellent approach in my opinion.
That being said, the purpose of this tutorial series is precisely to get a deeper understanding of how to make Docker work for you. If all you are interested in for the moment is using a Docker-based environment without the hassle of setting it up by yourself, by all means give Laradock a try, and feel free to come back whenever you want to create your own, tailored environment.
"Serverless is the way to go"
Some claim that serverless is the future and that containers will soon be obsolete; others argue that the comparison is moot because the two technologies serve completely different purposes.
I don't know what the future holds. When it comes to technology, I like to consider myself a pragmatist: being a developer requires an awful lot of effort to stay up to date, and while I would love to explore the many technologies that arise pretty much daily, I simply don't have the time to do so. While the serverless movement has certainly been gaining momentum, it is also still very much in its infancy, and I would like to see it mature further before I consider taking the plunge.
If serverless somehow ends up "eating the stack" and that is where all the jobs go, fine, I will make the switch. But I don't see such a market shift happening for another few years (if ever), and for the time being I'd rather focus on what is likely to make my CV attractive today.
Companies are already slow to migrate to containers despite high interest, and the industry only recently rallied behind Kubernetes, which is poised to become the standard for container orchestration. Serverless represents an even greater paradigm shift that is nowhere near such consolidation: thinking it will flip the market overnight is foolish in my opinion.
That doesn't mean I won't use a Lambda function where it seems relevant, or that serverless shouldn't be on your watchlist. On that note, if you wish to explore the subject further, I found this article by ZDNet to be informative.
"I'm on Windows"
Ah mate, I feel you. You work on Microsoft's operating system, and you read somewhere that running Docker on it is a bit of a pain? Well, the bad news is, this is still largely true. The good news, however, is that most of what is covered in this series has been successfully tested on Windows 10 at some point, even though it does require a few tweaks here and there.
Nevertheless, running Docker on Microsoft's OS is tricky enough that I put together a dedicated appendix to help you get started. I recommend reading it after this introduction and before proceeding further; I also refer to it regularly throughout the articles.
A caveat: I work on a Mac and don't have regular access to a Windows machine, so if any Microsoft-related trickery seems to be at play I'm afraid I won't be of much assistance (this is also valid for Linux distributions, although from experience these don't tend to cause any trouble).
I would also like to make a full disclosure: while it is true that everything covered in this series will also work on Windows, the overall experience is still likely to feel a bit brittle, especially as soon as Bash is involved. There is always a way to mitigate those frictions, but it generally takes a little extra TLC.
"It is not my job to deal with Docker"
Finally, some developers simply dismiss Docker as not being their problem. This is a fair argument, as Docker sits somewhere beyond the realm of pure application development. If anything, a sysadmin or DevOps engineer should set it up for you, right?
Well, yes, you are entitled to feel that way. But if DevOps can refer to individuals with an interest for both system administration and coding, effectively acting as bridges between the two, it can also be interpreted as people from both sides meeting halfway.
In all fairness, you don't need to know Docker to be a good developer. My point is simply that by ignoring it, you are missing out on an opportunity to get better. It won't improve your syntax or teach you new design patterns, but it will help you understand how the code you write fits into the bigger picture. It is a bit like playing an instrument without learning to read sheet music: you might sound good, but you are unlikely to write a symphony. Taking a step back to reflect on the application's architecture and understand how the different pieces fit together will give you invaluable perspective that will influence the technical choices you make.
Whatever your specialty – backend, frontend or fullstack – and to whatever extent you feel concerned with the DevOps movement, I promise that learning about the cogs that make your applications tick is worth your time, and that you will be a better developer for it.
Now. Shall we get started?