Docker: The Art of Containment


We often say that some technology is revolutionizing the way our industry works. In this case, the revolution came in a container. Let’s take a plunge into the technology of containerization. But before that, a brief history of what led to this technology in the first place, starting with microservices. What are microservices? The basic idea behind this kind of structure is that an application can be split into small, composable pieces that work together. This not only simplifies the development process but also makes it easier to monitor each piece and change it in isolation. Even if one constituent fails to respond, it will not bring down the entire application. In a traditional microservice deployment, virtual machines (VMs) are mounted on a host machine, and each VM carries the dependencies of one microservice.

Why Containers?

There is a problem here, though. A lot of resources such as processor time, RAM, and disk space are wasted in managing these VMs. The bigger the application, the more VMs you need and the greater the waste, to the point of being unreasonable. This is where we switch to containers. Containers are lightweight substitutes for VMs. They do not require you to allocate RAM or disk space up front; a container takes up resources according to its need. In such a containerized platform, the containers are mounted on a VM that in turn sits on top of a host machine.

But why do you need this VM when you are putting the application in containers? There are two good reasons. First, Docker’s container runtime is built on Linux kernel features, so using it on Windows requires a virtual Linux OS. Second, the VM encapsulates the entire application away from the host machine.

Docker Containers

Now you know what containers are and what led to them. Let us explore Docker containers further. You would agree that “it works fine on my machine” is the ultimate problem statement of software development. Docker addresses that issue quite efficiently. The heart of Docker is its containers, and as we know, these are encapsulated from the host machine. Hence, no latent dependency on the host machine can introduce bugs in the test or production environment. It is basically like an airtight container being shipped around: no matter what the environment is, functionality is not compromised.

Docker Containers and Their Structure

Next, I will take you through the structure of a Docker container. The structure commences with the host OS, above which we mount the Docker Engine. The Engine is the heart of Docker containers; it acts as a client-server application and creates the containers that applications run inside. To look closer at a Docker container, let us first get familiar with Docker Images. These can be considered the building blocks of a container; an image is a template used to create a Docker container. Once built, an image is a read-only template from which containers are created.
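To make this concrete, here is a minimal sketch of turning a template into an image. Everything named here (the Python base image, `hello.py`, and the `myapp:1.0` tag) is a hypothetical example, not something from a real project, and the commands assume a working Docker installation.

```shell
# Write a minimal Dockerfile: the template from which an image is built.
cat > Dockerfile <<'EOF'
# Start from an official Python base image (assumption: a Python app)
FROM python:3.12-slim
# Set the working directory inside the image
WORKDIR /app
# Copy the application code into the image
COPY hello.py .
# Default command executed when a container starts
CMD ["python", "hello.py"]
EOF

# Build a read-only image from the template and tag it
docker build -t myapp:1.0 .
```

Each instruction in the Dockerfile becomes a layer in the resulting read-only image.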

Next is the Docker container itself. So what is that? A Docker container is a ready application created from a Docker Image. You may call it a running instance of an image, holding the entire setup needed to run an application, which is the ultimate purpose of Docker.
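A sketch of that image-to-container step, continuing with the hypothetical `myapp:1.0` image (the commands assume a running Docker daemon):

```shell
# Create a running instance (a container) from an image
docker run -d --name myapp-instance myapp:1.0

# List running containers to see the instance
docker ps

# Stop and remove the container when done; the image is untouched
docker stop myapp-instance
docker rm myapp-instance
```

Note that stopping or removing a container never changes the image it was created from; you can start as many containers from one image as you like.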

While exploring Docker images, a question naturally arises: where are these images stored? There should be some place, or cloud, to which they are uploaded so they can be pulled later. Yes, such a provision exists: the Docker Registry. A Docker Registry is where Docker Images are stored, and it can be either a user’s local repository or a public repository. Talking about which brings us to Docker Hub, the public repository I mentioned earlier. Docker Hub allows multiple users to collaborate in building an application.
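The registry round trip can be sketched like this. The Docker Hub account name `alice` and the image name are hypothetical; the commands assume you have a Docker Hub account and a local Docker daemon.

```shell
# Tag the local image under a Docker Hub account namespace
docker tag myapp:1.0 alice/myapp:1.0

# Authenticate and push the image to the public registry
docker login
docker push alice/myapp:1.0

# Anyone, on any machine, can now pull the exact same image
docker pull alice/myapp:1.0
```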

Docker Networking and Architecture

Docker Networking is the aspect that explains how Docker containers communicate. When individual containers communicate with each other through a network to perform the required actions, that is Docker Networking. It can be defined as the communication arena through which all the airtight containers talk to each other across various environments.
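As an illustrative sketch, here is how two containers can be placed on a user-defined network and reach each other by name. The network and container names are made up for the example, and the commands assume a running Docker daemon.

```shell
# Create a user-defined bridge network
docker network create app-net

# Attach two containers to it; on a user-defined network,
# containers can resolve each other by container name
docker run -d --name db --network app-net \
  -e POSTGRES_PASSWORD=example postgres:16
docker run -d --name web --network app-net nginx:alpine

# From inside "web", the database container is reachable as "db"
docker exec web ping -c 1 db
```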

The architecture of Docker includes the Docker Registry, the Docker Client, and the Docker Host. We already know about the Registry, so let us move on to the Docker Client: it is what triggers Docker commands. The Docker Host is responsible for running the spirit of Docker, better known as the Docker Daemon. The Docker Daemon holds the Docker Images and the Docker containers.

Docker Container Functionality

Now that you are familiar with the Docker components and architecture, let’s have a look at how Docker functions.

  • The Docker Client issues a build command against a Dockerfile, a file written by developers to define and state the dependencies of an application.
  • Once the Dockerfile reaches the Docker Daemon on the Docker Host, the Daemon builds an image based on that code. The image is then stored or uploaded to a Registry, which may be a public or private repository. QA or production teams sometimes do not create an image at all; they just pull one from the public repository, Docker Hub, Docker’s very own cloud.
  • Finally, the Client issues a run command, which creates a running instance of your Docker Image. That running instance is a Docker container.
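The three steps above can be condensed into a single sketch of the build-ship-run workflow; the image and repository names are hypothetical, and the commands assume a Docker daemon and a Docker Hub account.

```shell
# 1. The Client issues a build; the Daemon turns the Dockerfile
#    in the current directory into an image
docker build -t alice/myapp:1.0 .

# 2. The image goes to a Registry (Docker Hub here); other teams
#    can pull it instead of rebuilding it themselves
docker push alice/myapp:1.0

# 3. The Client issues a run; the Daemon creates a running
#    instance of the image, i.e. a container
docker run -d --name myapp alice/myapp:1.0
```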


Here are a few points that will keep reminding you why you need Docker.

  • Containerization vouches for consistency at all levels of an application, and Docker is no exception.
  • Docker’s platform allows for highly portable workloads. Docker containers can run on a developer’s local laptop, on physical or virtual machines in a data center, on cloud providers, or in a mixture of environments.
  • Docker lets you run more workloads on the same hardware.
  • It lets you, your company, or, for that matter, anybody share software through images.
  • Last but not least, remember that a Docker container can always run its software; the underlying system is immaterial.

With so many options in the market, even a simple Docker course can leave you thoroughly confused. If you are in such a situation, or even if you are not, you can reach out to Nuvepro. We provide hands-on labs for a best-in-class learning experience, and we also assist in deploying these tools in an organization. Please reach out, whatever level you are at: beginner, intermediate, or even advanced. We will be happy to help.

