Should you use Docker? (Benefits and Usages)

A lot of technologies emerge, but only a few stand out and prove useful to the IT community, and Docker is one of them. Docker was released in 2013 and six years later, it’s still going strong. More organizations and techies are beginning to see its importance.

As of 2014, Google was reported to be running two billion containers a week; imagine how much that number has grown by the time you are reading this article.

So, should you use Docker? Yes. Docker provides a consistent environment for software from development through testing to production, without requiring a lot of resources. That makes deployment easier for software developers and system administrators working with software packages.

In this article, you’ll find explanations on the concept of Docker to give you a better understanding of the technology. You’ll also learn about its components, use cases and the benefits of using it in your software development and deployment process.

What Is Docker?

Docker is container software made by Docker, Inc. That means Docker is not just a product but also the name of the company. When people mention Docker, they’re usually referring to the product rather than the company.

Docker, as container software (more on containers later), allows IT teams (software developers, system administrators, site reliability engineers and so on) to work efficiently with software applications.

It lets developers focus squarely on what they enjoy doing, which is writing code, without having to worry about the platform the code will run on, and it gives system administrators flexibility when managing the software in operation.

For the company, this is great because it makes it easier to ship software quickly and saves costs by reducing the number of computer systems needed to run applications.

Docker ensures that the software development life cycle has fewer issues, by making it easier to create, deploy, run and manage applications through the use of containers.

One of the issues the software development life cycle usually faces without Docker is stability. Applications usually depend on the configuration of the platform they run on, and this causes problems because configurations vary from platform to platform.

Docker takes away this dependency on platforms by using containers. An application in a Docker container will always run, regardless of the configuration of the platform.

Docker is also open source, so you can pick it up and extend it whenever you need it to fulfill custom requirements.

Docker Components

Docker is made up of different components. Understanding these components will help you have a better picture of how Docker works and how to use it.

These components are the:

  • Docker Engine
  • Docker Image
  • Docker Container

Docker Engine

The Docker engine is a client-server application, meaning it has a client side and a server side that interact with each other through an API (Application Programming Interface).

The three main parts of the Docker engine are the:

  • Docker CLI
  • Docker REST API
  • Docker server/Daemon

A common misconception is that the Docker engine and the Docker daemon are totally different things. In fact, the Docker daemon is just one part of the Docker engine.

Let’s take a look at the different parts of the Docker engine.

Docker Command-line Interface

A Command-line Interface (CLI) is a tool that gives the user access to a piece of software. Here, the Docker CLI is used to access the Docker software, which is the Docker server/daemon.

There are a lot of Docker CLI commands you can use to work with Docker. You can use them for administration tasks such as building or deleting a Docker image, or running or stopping a Docker container.
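
As a quick illustration, here are a few common commands for those tasks (the myapp image and container names are placeholders):

    # Build an image from the Dockerfile in the current directory
    docker build -t myapp:1.0 .

    # Run a container from that image, in the background
    docker run -d --name myapp myapp:1.0

    # Stop and delete the container, then delete the image
    docker stop myapp
    docker rm myapp
    docker rmi myapp:1.0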

Docker REST API

An API (Application Programming Interface) is an interface for interacting with another service, while REST is a common pattern for designing APIs.

In this case, the Docker REST API sits in the middle: it receives commands from the Docker CLI and relays them to the Docker daemon.

So the Docker CLI has to communicate with the Docker daemon, and it does this using the Docker REST API.
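
You can even talk to this API without the CLI. A minimal sketch, assuming the default Unix-socket setup on a Linux host with curl installed:

    # Ask the daemon for its version over the REST API
    curl --unix-socket /var/run/docker.sock http://localhost/version

    # List running containers, the same data `docker ps` shows
    curl --unix-socket /var/run/docker.sock http://localhost/containers/json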

Docker Server/Daemon

The Docker server runs as a daemon process, which is why it is referred to as the Docker server or the Docker daemon.

It is like the brain behind the whole process and runs on the host operating system. It listens to the Docker REST API, waits for commands from the Docker CLI and executes the actions for the commands received.

Since it runs on the host operating system, it is able to interact with the operating system. So when you run the commands for running or stopping a Docker container, or building or deleting a Docker image, it is the Docker server that executes the process, after receiving the message from the Docker CLI through the Docker REST API.
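
You can see the client/server split for yourself (the second command assumes a systemd-based Linux host):

    # Prints a Client section (the CLI) and a Server section (the daemon)
    docker version

    # On a systemd-based Linux host, the daemon runs as a service
    systemctl status docker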

Because of these blurred lines, when people say “Docker,” they could be referring to the Docker engine; and when they say “Docker engine,” they could be referring to the Docker server.

Docker Image

An image, generally speaking, is a packaged, self-contained piece of software, and a Docker image is no different.

A Docker image is an immutable file that combines a filesystem with parameters such as system libraries and tools, built up in layers. The layers that make up a Docker image are defined by a Dockerfile.

According to the Dockerfile reference:

A Dockerfile is a text document that contains all the commands a user could call on the command line to assemble an image.

There are specific commands you’ll use to create a Docker image from a Dockerfile, but a full treatment of them is beyond the scope of this article. When you need to create a Dockerfile of your own, the Dockerfile reference is a good place to start.
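
Still, a minimal sketch makes the idea concrete. Assuming a hypothetical Python project with an app.py and a requirements.txt (both placeholders for your own files), a Dockerfile could look like this:

    # Start from an existing base image
    FROM python:3.12-slim

    # Copy the dependency list first so this layer caches well
    WORKDIR /app
    COPY requirements.txt .
    RUN pip install -r requirements.txt

    # Then copy the rest of the application code
    COPY . .

    # The command the container runs when it starts
    CMD ["python", "app.py"]

Each instruction contributes a layer to the final image.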

A Docker image is built from a Dockerfile using the build command on the Docker CLI. Once built, a Docker image is stateless: it remembers nothing between invocations.

To build a Docker image, you can either build on an existing base image or create your own base image. You’ll often find that an existing base image is good enough.

As long as you do not make changes to the Dockerfile, the Docker image will produce the same result when built. This is one reason why many techies like Docker.

When a change is made, Docker only rebuilds the changed layer (and the layers after it) on the next build; unchanged layers are reused from cache.
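
You can inspect those layers with the docker history command (using the myapp:1.0 image name from the earlier sketch):

    # List the layers of an image, most recent first,
    # with the Dockerfile step that created each one
    docker history myapp:1.0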

Using Object Oriented Programming (OOP) terms, you can consider the Dockerfile to be the blueprint from which a class (the Docker image) is built.

Find out what the object is in the next section.

Docker Container

A Docker container is a standard unit of software created in an encapsulated environment, different from the main or host operating system’s environment.

It is lightweight when compared to the host operating system and doesn’t contain unnecessary files or packages.

Since it is based on the Docker image, it will only have the code and dependencies needed for the software in the container to run as seen in the Docker image.

While the Docker container’s configuration is determined by the Docker image, other configurations can also be added when starting the container.

It should be noted that the Docker image doesn’t keep state, but the Docker container has a writable filesystem.

When you start a Docker container from a Docker image, a small container layer is created on top of the Docker image, where newly created or modified files are stored. These changes only exist in the container where they were made, and they remain when the container is restarted, unless you use the --rm flag on the docker run command, which removes the container when it exits.

If a new Docker container is created from a Docker image, it starts afresh and none of the changes made in one Docker container will reflect in another.

To create a new Docker container, the run command is used; to restart a stopped container, the start command is used.
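
A short sketch of that lifecycle, using the public nginx and ubuntu images:

    # Create and start a new container from an image
    docker run -d --name web nginx

    # Stop the container; its filesystem changes are kept
    docker stop web

    # Restart the same container, changes intact
    docker start web

    # Or run a throwaway container that is removed when it exits
    docker run --rm -it ubuntu bash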

Using Object Oriented Programming (OOP) terms, as in the Docker image section, the Docker container can be regarded as an object, while the Docker image can be regarded as a class.

The same way you can create multiple objects with a class, you can also create multiple Docker containers with a Docker image.
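
For example, one public nginx image can back several containers (the port numbers are arbitrary):

    # One image, three containers, each mapped to its own host port
    docker run -d --name web1 -p 8081:80 nginx
    docker run -d --name web2 -p 8082:80 nginx
    docker run -d --name web3 -p 8083:80 nginx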

The Relationship

The Docker engine (CLI, REST API and server), the Docker image and the Docker container are all related. You use the Docker CLI to interact with the Docker server through the Docker REST API. That CLI + REST API + server combination builds the Docker image from a Dockerfile, and the Docker image is then used to create Docker containers.

Why Use Docker?

There are a lot of reasons to use Docker for your software development projects as a software developer or for managing packages as a system administrator. In this section, you’ll see what these reasons are.

You’ll learn about the:

  • Problem Being Solved
  • Benefits
  • Limitations

Problem Being Solved

Docker is being used to solve a lot of problems, especially during the software development life cycle. In this section you’ll see how Docker helps solve the following problems:

  • Heavyweight Virtualization
  • Environment Inconsistency

Heavyweight Virtualization

Virtualization can play a huge role in increasing the overall performance of an application, and virtual machines look like the go-to solution; but that is not always the case.

Virtual machines let you run multiple operating systems on your host computer, but virtual machines are large, often nearly as large as the host operating system itself.

This is because the operating system in a virtual machine is a full package, while a Docker container only holds the files and dependencies needed for a process to run. Hence, Docker is lightweight.

Environment Inconsistency

Have you ever heard of the “it works on my machine” problem, where software works on the developer’s computer but breaks when deployed to production? The developer’s machine has the packages and dependencies the application needs; the production environment lacks them, so the application breaks when it is moved there.

With Docker, all that is needed is for the production environment to be capable of running a Docker container. Since a Docker container is an encapsulated environment, all of the dependencies can be installed there through the Docker image.

This way, the application can run successfully on any machine, since the encapsulated environment (the Docker container) is the same everywhere.
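
You can observe this consistency with any public image; the official Python image, for instance, reports the same interpreter on every host that can run Docker:

    # Same image, same Python version, on any machine with Docker
    docker run --rm python:3.12 python --version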

Benefits

Aside from helping solve certain problems, Docker has benefits that come as side effects of the solutions it provides.

Docker is beneficial in the following areas:

  • Speed
  • Size
  • Scalability
  • Architecture
  • Costs
  • Deployment
  • Security

Speed

Docker containers have a very small startup time. A Docker container is not a full operating system, so it can start up in a couple of seconds; a virtual machine can take minutes to boot.

Size

The Docker container is lightweight, so it takes up little storage. This allows you to create a lot of Docker containers.

Scalability

With Docker, you can easily scale out your architecture. You can create many Docker containers to scale out, or assign more resources to existing containers to scale up. You’re in total control.

Architecture

Docker containers make it easy to build a microservices architecture. This means software can be broken down into smaller parts known as microservices, making it easier to maintain and deploy.

Costs

Because Docker encourages better architecture, starts up fast and makes scaling easy, it saves costs and helps a company make more revenue. Fewer infrastructure requirements translate to lower costs.

Deployment

Because a Docker container provides the same encapsulated environment everywhere, it is easier to build, test and deploy applications. Docker’s speed also helps make the deployment process a rapid one.

Security

The isolated nature of Docker containers contributes positively to an application’s security. A Docker container is an encapsulated environment, so no container can see the processes running in another container.

Limitations

Docker is a powerful tool, but it is not the right tool in every situation. As with everything, it has some downsides and limitations.

Docker has limitations in the following areas:

  • Persistent data storage: Docker containers are stateless; the container’s writable layer is lost when the container is removed. Docker volumes exist to overcome this issue, as the sketch after this list shows.
  • Graphical User Interface applications: Docker was built with server applications in mind, so it isn’t well suited for Graphical User Interface applications.
  • Monitoring: Being able to create many Docker containers is a positive, but it introduces the problem of managing and monitoring all of those containers.
  • Linux dependent: Docker depends on the Linux kernel. To run on other platforms such as Windows and macOS, it uses a virtual machine behind the scenes, though this is hidden from you.
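
Here is a sketch of the volume workaround mentioned in the first item (the mydata volume name and paths are placeholders):

    # Create a named volume that outlives any container
    docker volume create mydata

    # Mount it into a container; files written to /data persist
    docker run --rm -v mydata:/data ubuntu bash -c "echo hello > /data/greeting.txt"

    # A brand-new container sees the same data
    docker run --rm -v mydata:/data ubuntu cat /data/greeting.txt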

A Real-life Metaphor

The explanation of the concept of Docker should give you an understanding of how it works; still, a metaphor may provide a better point of view.

The metaphor in this case is not perfect, but it should do the job. Consider a human being (a woman) to be an operating system, which can run a Docker container (a baby). The baby in the mother’s womb is in an encapsulated environment.

Just as an operating system can interact with a Docker container through ports, the mother can interact with the baby or foetus in its early stages through the placenta. And just as you can run multiple Docker containers on the host operating system, a mother can have multiple babies in the womb (except that those babies share one isolated environment).

The difference here is that Docker containers are in isolated environments of their own, and two Docker containers do not share one isolated environment.

Also, the baby is only running certain processes in the womb (remember the Docker daemon?). It cannot be considered a full human (a complete operating system) yet, at least until it’s born.

The metaphor is not perfect, but hopefully it has made things clearer.

Docker vs Virtual Machine

Docker is often considered to be an alternative to the virtual machine, or the other way round. But this is not completely true.

Here’s why:

On one hand, a virtual machine is a full operating system whose operations are run by the hypervisor; on the other hand, Docker is not a full operating system, and its processes are run by the Docker server.

You can run multiple virtual machines on your host operating system, just like you can run multiple Docker containers. But a virtual machine takes up far more space, since it carries a full operating system.

In short, a virtual machine is an entire OS, while Docker provides an isolated process in an encapsulated environment. You can run a Docker container in a virtual machine, but not the other way round.

From this, you can tell that both serve different purposes. Docker is the way to go if you want to run isolated processes without spinning up a full operating system and dealing with the overhead it creates.

When To Use Docker

Docker makes use of containerization, so you can build lots of containers and run them without interference between one another. By now you’ll understand that Docker is not a virtual machine; instead, it is an encapsulated environment for running processes.

There are certain cases where you’ll find Docker to be helpful; let’s take a brief look at some of them.

Here are some examples of cases where you need Docker:

  • Multiple applications on a server: You may need to run multiple applications on one server. Doing this without Docker can lead to dependency issues, where the applications need different versions of a dependency. With Docker, you can create isolated environments and install each application’s dependencies without conflict, as the sketch after this list shows.
  • Smooth deployment: You can have different containers for the different stages of your application, such as development, testing and production. This allows you to deploy to each of those containers when needed, making the deployment process straightforward.
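
As a sketch of the first case, two applications needing different Node.js versions can coexist on one server (node:18 and node:20 are public images on Docker Hub):

    # Two containers, two Node.js versions, no conflict on one server
    docker run --rm node:18 node --version
    docker run --rm node:20 node --version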

Conclusion

Hopefully, this article helps you make the best choice about using Docker. You should now understand what Docker really is, the parts that make up the whole system, the reasons to use it, and the difference between a virtual machine and a Docker container.

The DevOps methodology is in use at a lot of organizations today, and Docker is at the center of its continuous deployment phase. In this stage, Docker makes it easy to move software into the production environment.

Docker makes deployment easier, giving people more time to focus on things that really matter to them.

If you are interested in going deeper into the world of Docker, I can recommend this course: Docker Mastery: The Complete Toolset From a Docker Captain. It is a hands-on course where you learn to create your own Dockerfiles and publish your own images.
