Unlike Vagrant, a big promise of Docker is that it's not just intended for development purposes--it's also intended for deployment, because containers are so good at process isolation and resource management. This means that, in theory, we can have great dev/prod parity. In practice, things are a bit more complicated, especially since cloud.gov currently calls its Docker support an experimental feature.
Most development tools are built to assume that they are a "singleton" in the context of the system they're installed on.
But as soon as you have two projects that require different versions (or configurations) of that tool, you start needing another
tool that manages the version or configuration of that tool for you, so that each project can use the version or configuration
that it needs. This is how tools like nvm (for Node), rvm (for Ruby), virtualenv/pyenv (for Python), and such come
into existence. This proliferation adds a lot of cognitive overhead to the development process.
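As an illustration of how containers replace per-language version managers, a project can pin the exact runtime it needs in its own Dockerfile (the specific image tag here is just an example, not a recommendation):

```dockerfile
# Each project declares its own runtime version; no host-level
# version manager (nvm, rvm, pyenv) is needed.
FROM node:18-alpine

WORKDIR /app

# Install dependencies first so Docker can cache this layer
# when only application code changes.
COPY package*.json ./
RUN npm ci

COPY . .
CMD ["npm", "start"]
```

Two projects that need different Node versions simply name different base images, and neither one touches the other's environment.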
Containers get rid of this problem entirely--but not without introducing new cognitive overhead that developers need to understand. At least one benefit of the new overhead, though, is that it's generic enough to apply to all kinds of problems, rather than being specialized to a particular type of development tool.
Thinking of "Changing python/node/ruby dependencies can be cumbersome": have you ever tried to use `npm link` with Docker? For instance, the cloud.gov dashboard runs its front-end build watch in Docker, but for its cloudgov-style dependency in package.json, we need to `npm link` to the local cg-style checkout on our dev computers. Is this possible/hard?
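One possible workaround (a sketch only -- the service name, paths, and package name below are assumptions, not taken from the actual dashboard configuration) is to bind-mount the local cg-style checkout into the container instead of using `npm link`. The symlink that `npm link` creates points at a host path, which doesn't exist inside the container's filesystem, so a mount is the more container-friendly equivalent:

```yaml
# docker-compose.yml (hypothetical sketch)
services:
  dashboard:
    build: .
    volumes:
      # Mount the app source so the build watch sees host edits.
      - .:/app
      # Overlay the installed dependency with the local checkout,
      # playing the role npm link would on a bare host.
      - ../cg-style:/app/node_modules/cloudgov-style
    command: npm run watch
```

With this setup, edits to the local cg-style files are visible to the watch process running in the container immediately, with no symlinks involved.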