What is Docker and why do you want to use it in your work?

  • 8 March 2020
  • Reading time: 4 min
  • Hosting

Over the past few years, Docker has become the leading standard in 'container technology' – a new method of packaging and deploying software, often software you have written yourself. This way, you no longer have to take specific hardware and configuration settings into account. In addition, you can use the physically available computing power much more efficiently. Does this sound interesting to you? Just keep in mind that this solution requires an entirely different way of working.

Thanks to the DevOps approach, we increasingly see software development and infrastructure management as a single entity. The settings for security, availability and performance are put together with your software in one box: the Docker container. Deploying software thus becomes very straightforward – a bit like serving a fully prepared meal, instead of cooking all the separate ingredients from scratch over and over again.


Why would you put software in a container?

When you write software yourself, you never really start from scratch. You can rely on existing elements, such as the operating system, libraries, drivers, plugins, runtime environments... Otherwise, it would be impossible to get your application to work.

In a conventional approach, you first need to install all these layers on top of the operating system itself – on your own computer, a server or a virtual server. Is your application finally ready, after a great deal of thinking, programming, testing and optimising? Then you get to repeat the whole installation process in the live environment. To sum up, the traditional method has a lot of drawbacks:

  • A lot of hassle for nothing: The installation of your infrastructure and everything that comes with it is repetitive work, which you as a software developer would prefer to avoid. Moreover, if you want a second or third server, for performance or reliability reasons, you have to do the job all over again.
  • Risk of errors: Minor version differences between the development, test and live environments can have a major impact. What seems to work flawlessly all the time can thus suddenly start causing problems at the moment of the go-live.
  • Limited resources: In a traditional ICT approach, it is the hardware that determines how fast or slow your system operates. If you want more computing power, you have to add resources. Do you have too many resources? Then they simply sit unused. Do you use virtualisation to run multiple virtual servers on the same hardware? Then every extra virtual server brings its own operating system, libraries and so on, which eats into your computing power.

Containers are a lot more compact and contain only the bare essentials. They use your resources much more efficiently. You can save the desired configuration as an 'image' – a kind of 'picture' of the entire installation. You can then deploy it once, three times or even dozens of times. It is quick and easy, and can even be done fully automatically using a container platform such as Kubernetes. By putting all the necessary things together in a container, not only do you eliminate the manual installation of your system, but you also eliminate the associated risk of human error.
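
As a rough sketch of that idea, the commands below pull a ready-made public image – nginx, standing in for your own application image – and start three identical containers from it; the container names and ports are made-up examples:

    # Pull the image once...
    docker pull nginx:1.25

    # ...and start as many identical containers from it as you need
    docker run -d --name web-1 -p 8081:80 nginx:1.25
    docker run -d --name web-2 -p 8082:80 nginx:1.25
    docker run -d --name web-3 -p 8083:80 nginx:1.25

Each container is an independent copy of the same 'picture'; a platform such as Kubernetes automates exactly this kind of repetition.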


How exactly does Docker work?


In your Docker container, you put everything your application requires to run – no more and no less. All the settings are included in the box. The result is a fully autonomous mini system, which you only have to start or stop.

Anything your application does not require can simply be left out. This, together with the shared use of underlying libraries and the host operating system, ensures very efficient use of your resources.

Building, packaging and publishing your software is completely different with Docker. You can fine-tune the code and the infrastructure settings until everything is completely to your liking. Then, you take a 'photo' (image). Just as you can print a photo as many times as you like, you can use the image in one or more containers. If you want to edit the code or the settings – even if only slightly – all you have to do is repeat the whole process:

  • Dockerfile: This simple text file serves as a 'blueprint' and describes what your desired Docker image will look like.
  • Docker image: As a developer, you can often build on an existing base image that contains the desired tools. On Docker Hub, for example, you will find ready-to-use environments for Ruby or NodeJS. If you like the result, you capture (build) your image and publish it.
  • Docker container: Based on your image, standard or self-built, you start up (run) a stand-alone system. This process is repeatable, and because the image itself never changes, the result is identical every time. A minimal sketch follows after this list.
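
To make those three building blocks concrete, here is a minimal sketch of a Dockerfile for a hypothetical NodeJS application; the file names, port and image tag are assumptions, not a ready-made recipe:

    # Dockerfile – the 'blueprint' that describes the image
    # Build on a ready-made NodeJS base image from Docker Hub
    FROM node:18-alpine

    # Copy the application into the image and install its dependencies
    WORKDIR /app
    COPY package*.json ./
    RUN npm install
    COPY . .

    # Describe how a container based on this image should start
    EXPOSE 3000
    CMD ["node", "server.js"]

Running docker build -t my-app:1.0 . turns this blueprint into an image (the name my-app is made up), and docker run -d -p 8080:3000 my-app:1.0 then starts a stand-alone container from it.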


What are the pros and cons of using Docker?

Do you install your application on multiple servers at once? Do you often deploy new versions? In that case, you have good reason to switch from a 'traditional' to an 'industrial' method. This, however, requires a different approach, which is typical of the DevOps philosophy. You will have to take some time to get used to it before you can start enjoying the benefits:

  • Manageable: Any system in a Docker container can be stopped, started and replicated right away. If a container crashes, it will not affect other systems, even if they are physically running together on the same host machine.
  • Scalable and fault-tolerant: If you put several containers next to each other, you can distribute the traffic better in the event of a sudden peak. If you also use orchestration, additional containers can be started automatically and a crashed container can be restarted on its own – the sketch after this list shows the manual equivalent.
  • Platform independent: Is your application dependent on specific versions, configurations or web services on another system? Interdependencies, especially in the case of larger applications, can turn a simple system upgrade into a real headache. With Docker, you are no longer dependent on any particular system. You can use the proper version in each container, so you do not have to upgrade everything at once.
  • High-performance: Docker puts each container directly on top of the host operating system – the only operating system involved. Containers built from the same image also share its read-only layers, so files used by multiple containers are stored only once. This reduces 'overhead' quite considerably.
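
As a small illustration of that manageability, the commands below – continuing the earlier made-up nginx example – stop and restart one container without touching the others, add an extra copy during a traffic peak, and ask Docker to bring a crashed container back up automatically:

    # Stop and restart one container; the others keep running
    docker stop web-2
    docker start web-2

    # Add an extra, identical container to absorb a traffic peak
    docker run -d --name web-4 -p 8084:80 nginx:1.25

    # Have Docker restart this container automatically if it crashes
    docker run -d --restart unless-stopped --name web-5 -p 8085:80 nginx:1.25

For automatic scaling across multiple machines, an orchestrator such as Kubernetes takes over this work.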

Docker works in a fundamentally different way than a virtual machine (VM): a VM behaves like conventional hardware, with its own processors, memory and storage. You can distribute the available resources beforehand across multiple VMs, and each of them then runs a full operating system with all its components. A hypervisor ensures that the VMs are properly isolated from each other. This allows you to divide your computing power into many small pieces. In short, VMs are less rigid than dedicated hardware, but they are also very resource-hungry and not as flexible as containers. Several studies report that containers require up to five times less computing power than VM technology for the same workload.

The biggest challenge in using Docker is that you have to be able to deal with it both conceptually and technically. If DevOps is not your cup of tea, and if you find microservices too complicated, Docker may not be your best choice after all.

  • Older applications: Most older applications are simply not designed to be run in containers.
  • Compact code: A Docker container contains only the bare essentials. To keep the contents of a container simple and easy to maintain, many people choose to partition a complex application into separate functional components called microservices. If you do not want to use microservices, or if you are stuck with architectural choices from the past, containers may not be your best option – the sketch after this list shows what such a split can look like.
  • Technical knowledge: DevOps and microservices are excellent choices when it comes to writing a new application based on contemporary architecture. Of course, you must also understand these concepts inside out.
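
Purely as an illustration of such a split, here is a minimal Docker Compose sketch that defines a hypothetical application as two cooperating services, a web front end and a database; the image names, port and password are assumptions:

    # docker-compose.yml – a hypothetical two-service application
    services:
      web:
        image: my-app:1.0            # assumed application image
        ports:
          - "8080:3000"
        depends_on:
          - db
      db:
        image: postgres:16           # standard database image from Docker Hub
        environment:
          POSTGRES_PASSWORD: example # placeholder, not a real secret

With docker compose up -d, both containers start together, each remaining a separate, independently replaceable system.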


Combell & Docker

Do you see containers as a possible building block of your ICT strategy? Do you think Docker could be an asset to your business? Or would you like to discuss your choice first? Combell specialists are available to provide you with tailored advice and offer you a wide range of managed container services.

Combell is your point of contact for high-performance hosting services, with security and availability guarantees. Our infrastructure services allow you to better focus on your new application or your online business.

Find out more about Combell's managed container services