Today, Docker is popular largely because it has made it easy to access and use several features that were already present in the Linux kernel but were rarely utilised for application security and isolation.
Moreover, Docker has abstracted several layers of the Linux kernel and added components on top of them that make them more user friendly.
Containers were not designed as a replacement technology for virtual machines; rather, they were designed to run a single application inside each and every container. They provide a streamlined way to build, test, deploy and redeploy applications across multiple environments - from a developer’s local laptop to an on-premises data center and even to the cloud.
Find out what containers are and what their benefits are, here.
A Docker Container is meant to provide a standardized software environment for running apps, which will behave in the same way wherever it is deployed.
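As a minimal sketch of this idea (assuming Docker is installed and the daemon is running), the commands below start the same pinned image anywhere; the application inside sees an identical environment because the container’s filesystem comes from the image, not from the host:

```shell
# Pull a small image at a fixed version, so every host runs identical bytes.
docker pull alpine:3.18

# Run the same command inside the container. The output is the same on a
# laptop, an on-premises server, or a cloud VM, because the container reads
# /etc/os-release from the image's filesystem, not the host's.
docker run --rm alpine:3.18 cat /etc/os-release
```

Pinning the tag (`alpine:3.18` rather than `alpine:latest`) is what makes the environment reproducible across hosts and over time.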
Containers, therefore, bring the following benefits to DevOps:
1. Ease of distribution
2. Ease of management at scale
Read more about container technology with Docker.
As you might know, virtual machines contain entire operating systems, apps and files and run on top of a hypervisor that allocates virtual resources (CPU, RAM, etc.) to them from the physical infrastructure that hosts them. A virtual machine can therefore be managed as a standalone server.
Conversely, containers do not encapsulate an operating system (OS) – instead, they all run on a single OS installed directly on the physical infrastructure, and rely on kernel-level mechanisms from that OS to ensure app and resource compartmentalization and security.
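One way to see this kernel sharing in practice (assuming Docker on a Linux host – on Docker Desktop for Mac or Windows, containers run inside a helper VM, so the versions differ) is to compare the kernel version reported inside a container with the host’s:

```shell
# The kernel version of the host.
uname -r

# The kernel version seen from inside an Alpine container.
# On a Linux host it matches the host's output exactly, because the
# container shares the host kernel rather than booting its own OS
# the way a virtual machine would.
docker run --rm alpine:3.18 uname -r
```

A VM running the same test would report the kernel of its own guest OS instead.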
Read more about the difference between containers and virtual machines (VMs), here.
A Docker image is just a minimal set of files on your disk. It includes the code, libraries and dependencies – everything you require to run one very specific application.
Moreover, Docker images have layers (just like onions!) and each layer usually adds more content to the image and builds upon the previous one. All these layers in an image are read-only in order to preserve the integrity of the base image.
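As an illustrative sketch (the image name, package and file paths here are just examples), each instruction in a Dockerfile typically produces one such read-only layer, stacked on top of the previous one:

```dockerfile
# Base layer: a minimal OS filesystem pulled from the registry.
FROM alpine:3.18

# Each instruction below adds a new read-only layer on top of the last.
RUN apk add --no-cache python3        # layer: install a runtime
COPY app.py /app/app.py               # layer: add the application code
CMD ["python3", "/app/app.py"]        # metadata: default command to run
```

After building with `docker build -t myapp .`, running `docker history myapp` lists the layers, with the base image’s layers at the bottom – none of which are modified by the layers above them.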
Find out more about Docker images and the Layered Filesystems.
DevOps as a movement aims to bring together two traditionally separate types of teams: the developers who write the applications and the operations teams who run them.
Developers now need to understand how a cloud infrastructure works, so that they can package their applications in a compatible manner. Operations teams, on the other hand, need to understand that they are not simply providing infrastructure: their task is now to run real code and real applications, which they should be aware of.
Read more about the role of Docker and how it fits into the DevOps ecosystem, here.