Organization wants to get on the Docker train. Is it worth it?

I work for a private company that offers a Linux appliance as well as a custom suite of applications that run atop that appliance. The application suite consists of about 13 C++ applications: one is a Qt GUI, and another runs as a service and, on behalf of the GUI, controls all the other components via IPC. The Linux appliance is just a Debian install running on specialized hardware that meets the specific needs of our customers.

Currently the developers work on their own development machines in git, and when the application suite needs to be built for testing, we use Jenkins to clone the repo and build each application sequentially in a single pipeline. We are working on building the individual applications in parallel, but we still have to resolve some dependency issues (for example, one application must already be built before another can build) to make that work effectively. The sequential build takes over an hour, and if the second-to-last application fails, we've wasted all that time only to uncover a problem that has to go back to the developers. Not ideal. Once the suite is built, it's sent to QA, who load it onto their own appliances and execute their tests. There are no web applications; the suite is not public or even downloadable. Everything is provided to the customer by production/sales as the Linux appliance with the application suite already installed.
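
For illustration, the parallel layout we're aiming for might look roughly like the Jenkins declarative pipeline below; the stage names and make targets are made up, and the real dependency ordering still has to be worked out:

    pipeline {
        agent any
        stages {
            stage('Build shared libraries') {
                steps {
                    sh 'make -C libcommon'   // must finish before the applications build
                }
            }
            stage('Build applications') {
                parallel {
                    stage('gui')     { steps { sh 'make -C app-gui' } }
                    stage('service') { steps { sh 'make -C app-service' } }
                    // ...one stage per remaining application...
                }
            }
        }
    }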

Our company has started to take a more Agile approach, so we're really trying to improve the way we do business. Among other things, there are a lot of rumblings about incorporating Docker into our workflow to make things easier. I have been reading the docker.io docs and googling non-web-app use cases, but I can't see how Docker could realistically fit into our current environment.

There are a couple of commonly cited benefits that I don't think apply to our organization.

Standardization. Removes the 'works on my machine' problem.

The developers use their own development environment on top of the Linux appliance. When the application suite is built and sent to QA, they load it onto their own appliances, simulating what the end user would have. I don't think Docker would help here, because the appliances would need Docker Engine running in order to use the containers, and that is not how the appliance will be shipped to the customer.

Makes Deployment Faster/Easier

We don't actually deploy anything externally. We 'deploy' the application suite internally via Ansible to those who want it on their appliance, but we're not at such a scale that QA couldn't simply copy the suite and install it on their machines. We don't deploy to servers or the cloud; everything lives on the appliance itself.

The only real benefit I can see for Docker is during the build process, assuming we resolve the dependency issues that affect some of the applications. I'm all for trying new things, but there's usually an obvious benefit/ROI to justify them, and I just can't see that with Docker. Is there anything else I'm overlooking or just not seeing?

1 answer

  • answered 2018-07-11 05:22 Teemu Risikko

    This might be the wrong Stack Exchange site, but here are some thoughts from my own work. This is not a complete list, just some obvious benefits I have noticed.

    CI

    The build / CI process is where you would most clearly benefit from containers. I assume you have build servers to build your software. Suppose you introduce a new tool or library that needs to be available on those servers. You mention that you use Ansible, so you may already have a solution for this; if not, you could change just one Dockerfile, rebuild the image, and either deploy it to the build machines or use it directly during the build itself.
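
    As a sketch only (the base image and packages here are illustrative, not your real dependencies), such a build-environment Dockerfile can stay very small:

        FROM debian:stretch
        # Everything the build needs is declared in this one file, under version control.
        RUN apt-get update && apt-get install -y --no-install-recommends \
                build-essential \
                cmake \
                qtbase5-dev \
            && rm -rf /var/lib/apt/lists/*

    Jenkins could then run each build step inside that image, for example with docker run --rm -v "$PWD":/src -w /src build-env make, where build-env is whatever you tag the image as.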

    Dependency issues

    You say you have some dependency issues. One problem I have often seen with this kind of C++ setup is that few people know what the build environment actually needs. How is your Jenkins set up? Often it is a static piece of software that nobody wants to touch, because someone who left the company years ago provisioned the build servers and created the Jenkins jobs. With Dockerfiles in version control, you would have a complete history of, and documentation for, what the environment needs and how those requirements have changed.
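
    Concretely, once that Dockerfile lives next to the code, the history of the build environment is just the history of that file, for example:

        git log -p -- Dockerfile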

    Load balancing

    Docker also helps with scaling and load-balancing build servers. With C++ you often want to run QA tools over your code, something like Valgrind or a static analysis tool. Most of the time these applications hog the entire CPU or RAM capacity of the machine they run on. With Docker you could run, for example, ten identical replicas of the same image, and you can limit their resources relatively easily. This would help you parallelize your build process, and scaling the number of executors up and down would also be easy.
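
    As an illustration (the image name and command are placeholders), the resource limits are just flags on docker run:

        # Cap one analysis container at two CPUs and 4 GB of RAM
        docker run --rm --cpus=2 --memory=4g build-env valgrind ./app-gui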

    Setup initialization

    What happens when a new developer joins the company? Do they have to dig through dusty documentation, or wait for Bob to come back from vacation to explain what their machine needs before they can start building? You may have solved this problem with, say, virtual machines, but instead you could tell them to install the IDE of their choice plus Docker, and provide the Dockerfile along with the source code. That Dockerfile would pull in all the other dependencies, and they would immediately be able to build the software with the resulting image. Because the Dockerfile is version-controlled alongside the source code, you would also, most of the time, know exactly what had to be available at any given point in the application's history.
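
    With that in place, onboarding could be reduced to roughly the following (the repository URL and image tag are placeholders):

        git clone <repo-url> suite && cd suite
        docker build -t suite-build .            # build environment comes from the Dockerfile
        docker run --rm -v "$PWD":/src -w /src suite-build make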