
@toolness
Last active March 26, 2019 15:21
Reflections on Docker-based development

Note that these reflections are specifically tailored to a conversation we're having at 18F about Docker, and as such they rest on a few assumptions about that context.

Advantages

Dev/prod parity ... sort of

Unlike Vagrant, a big promise of Docker is that it's not intended only for development--it's also intended for deployment, because containers are so good at process isolation and resource management. In theory, this means we can have great dev/prod parity. In practice, things are a bit more complicated, especially since cloud.gov currently calls its Docker support an experimental feature.

Obviates an entire class of environment tooling

Most development tools are built to assume that they are a "singleton" in the context of the system they're installed on.

But as soon as you have two projects that require different versions (or configurations) of a tool, you start needing another tool that manages the version or configuration of the first tool for you, so that each project can use the version or configuration it needs. This is how tools like nvm (for Node), rvm (for Ruby), virtualenv/pyenv (for Python) and such come into existence. This adds a lot of cognitive overhead to the development process.

Containers get rid of this problem entirely--but not without introducing new cognitive overhead that developers need to understand. At least one benefit of the new overhead, though, is that it's generic enough to apply to all kinds of problems, rather than being specialized to a particular type of development tool.
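For instance, two projects can each pin their own runtime version in their own Dockerfile, with no nvm-style version manager on the host at all. A minimal sketch (the image tags and project names here are illustrative, not taken from any 18F repo):

```dockerfile
# project-a/Dockerfile -- this project gets Node 6, isolated in its container
FROM node:6
WORKDIR /app
COPY package.json .
RUN npm install
COPY . .
CMD ["npm", "start"]
```

A sibling `project-b/Dockerfile` could start from `FROM node:7` (or `ruby:2.3`, or `python:3.5`) and the two would never conflict, because each version lives only inside its own image.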

Reduces setup time

Installing Docker on OS X is easy, and as the CALC docker instructions attest, setup largely boils down to git clone followed by docker-compose up, peppered with a few manual tasks.
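The setup flow described above boils down to something like the following (the repository URL is a placeholder, and real projects like CALC document a few extra manual steps):

```shell
# Clone the project -- the URL here is illustrative, not CALC's actual repo
git clone https://github.com/18F/example-project.git
cd example-project

# Build images and start every service defined in docker-compose.yml
docker-compose up
```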

Another nice thing about docker-compose up is that it starts all services in a single terminal window and prefixes their output with their container name. This is already a lot more convenient than manually opening a separate terminal window for every dependent service, which is what non-Docker setups often make developers do.
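The interleaved, prefixed output from `docker-compose up` looks something like this (the service names are hypothetical):

```
app_1  | Listening on port 8000
db_1   | database system is ready to accept connections
```

Because everything streams into one terminal, a single Ctrl-C also stops all the services at once.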

Ease of one-off deployment to the cloud

Because we're not allowed to use tools like ngrok to expose our development instances to coworkers at 18F, being able to conveniently deploy our work to a temporary Amazon EC2 instance becomes important. Fortunately, thanks to docker-machine, this isn't hard; see CALC's guide to deploying to cloud environments for more details.
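A rough sketch of that docker-machine workflow, assuming AWS credentials are already configured in the environment (the machine name `sandbox` is arbitrary; see CALC's guide for the full details):

```shell
# Provision a temporary EC2 instance configured as a remote Docker host
docker-machine create --driver amazonec2 sandbox

# Point the local Docker client at the new remote host
eval "$(docker-machine env sandbox)"

# Build and run the app on the EC2 instance, detached
docker-compose up -d

# Tear the instance down when you're done sharing your work
docker-machine rm sandbox
```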

References

  • Atul's January 2017 Docker presentation

msecret commented Jan 31, 2017

Changing python/node/ruby dependencies can be cumbersome--have you ever tried to use npm link with Docker? For instance, the cloud.gov dashboard runs its front-end build watch in Docker, but for its cloudgov-style dependency in package.json, we need to npm link to the local cg-style on our dev computers. Is this possible/hard?

@toolness (Author)

@msecret So that's possible, I think, but one thing to be very careful about is that you not build any binary dependencies outside of a container, i.e. on your host system--because those dependencies would then be built for OS X/Darwin, not Linux. This actually happened to some of us on CALC for a bit and it was suuuuper confusing.

But anyhow, TL;DR is that there are definitely ways to work around it... it's just annoying that it's a headache that isn't really present in non-Docker development.
