Docker

Overview

If you’re in data centre or cloud IT circles, you’ve been hearing about containers in general and Docker in particular non-stop for a few years now. With the release of Docker 1.0 in June 2014, the buzz became a roar.

All the noise is happening because companies are adopting Docker at a remarkable rate.

Docker, an open-source technology, isn’t just the darling of Linux powers such as Red Hat and Canonical. Proprietary software companies such as Microsoft have also embraced Docker.

In a nutshell, here’s what Docker can do for you: It can get more applications running on the same hardware than other technologies; it makes it easy for developers to quickly create ready-to-run containerized applications; and it makes managing and deploying applications much easier. Put it all together, and Docker is riding the hype cycle faster than any enterprise technology in recent memory.

Moreover, for once the reality is living up to the hype. Frankly, it’s hard to find a company of any size that isn’t at least looking into moving its server applications to containers in general and Docker in particular.

What is Docker?

Docker is an open source software platform to create, deploy and manage virtualized application containers on a common operating system (OS), with an ecosystem of allied tools.

Who is Docker for?

Docker is a tool that is designed to benefit both developers and system administrators, making it a part of many DevOps (developers + operations) toolchains. For developers, it means that they can focus on writing code without worrying about the system that it will ultimately be running on. It also allows them to get a head start by using one of thousands of programs already designed to run in a Docker container as a part of their application. For operations staff, Docker gives flexibility and potentially reduces the number of systems needed because of its small footprint and lower overhead.

Features of Docker

  • Docker reduces the footprint of a deployment: containers share the host operating system, so each application ships without a full OS image of its own.
  • With containers, it becomes easier for teams across different units, such as development, QA and operations, to work seamlessly across applications.
  • You can deploy Docker containers anywhere: on physical or virtual machines, and even in the cloud.
  • Since Docker containers are lightweight, they are very easily scalable.
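To make these points concrete, here is a minimal sketch of running a container with the standard docker CLI; the image (nginx from Docker Hub) and the container name are illustrative choices, not requirements:

    # Pull a public image from Docker Hub and run it in the background,
    # publishing container port 80 on host port 8080
    docker pull nginx
    docker run -d --name web -p 8080:80 nginx

    docker ps         # list running containers
    docker stop web   # stop the container
    docker rm web     # remove it cleanly

The same two-command start works unchanged on a laptop, a VM or a cloud instance, which is what makes containers so easy to scale out.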

5 Key Benefits of Docker

Continuous Deployment and Testing

Docker is gaining a lot of traction in the development and DevOps world for its consistency across environments. Unless you have your own private repository environment with tight checks in place, there are always minor differences between environments in development and release lifecycles. These differences may be caused by different package versions or dependencies. Docker can close that gap by ensuring consistent environments from development to production: Docker containers maintain all configurations and dependencies internally. As a result, you can use the same container from development to production, making sure there are no discrepancies and no manual intervention.
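As a sketch of what maintaining all configurations and dependencies internally looks like in practice, here is a minimal Dockerfile; the application files and versions are hypothetical, but the pattern of pinning the base image and dependencies is the standard one:

    # Dockerfile: everything the app needs is declared here, so dev, QA
    # and production all build and run the identical image
    FROM python:3.9-slim
    WORKDIR /app
    COPY requirements.txt .
    # requirements.txt pins exact package versions
    RUN pip install --no-cache-dir -r requirements.txt
    COPY . .
    CMD ["python", "app.py"]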

With Docker containers, you can also ensure that developers don’t need an identical production environment set up. Instead, they can use their own system to run Docker containers on VirtualBox. The beauty of Docker is that you can run the same container on Amazon EC2 instances. If you need to perform an upgrade during a product’s release cycle, you can easily make the necessary changes to Docker containers, test them, and implement the same changes to your existing containers. This sort of flexibility is a key advantage of using Docker. Just like standard deployment and integration processes, Docker allows you to build, test and release images that can be deployed across multiple servers. Even if a new security patch is available, the process remains the same. You can apply the patch, test it and release it to production.
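That build-test-release flow maps directly onto the standard CLI; in this sketch the registry address, tag and test script are placeholders:

    # Build an image, run the test suite inside it, then publish it
    docker build -t registry.example.com/myapp:1.2.0 .
    docker run --rm registry.example.com/myapp:1.2.0 ./run_tests.sh
    docker push registry.example.com/myapp:1.2.0

    # Every target server then deploys the exact image that was tested
    docker pull registry.example.com/myapp:1.2.0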

Multi-Cloud Platforms

One of Docker’s greatest benefits is portability. Over the last few years, all major cloud computing providers, including Amazon Web Services (AWS) and Google Cloud Platform (GCP), have embraced Docker and added individual support for it. Docker containers can be run inside an Amazon EC2 instance, a Google Compute Engine instance, a Rackspace server or VirtualBox, provided that the host OS supports Docker. In that case, a container running on an Amazon EC2 instance can easily be ported between environments, say to VirtualBox, with the same consistency and functionality. This grants you a level of abstraction from your infrastructure layer. In addition to AWS and GCP, Docker works very well with various other IaaS providers such as Microsoft Azure and OpenStack, and can be used with configuration managers like Chef, Puppet and Ansible, to name a few.
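Portability in practice amounts to the same two commands on any host with a Docker engine, whether that host is an EC2 instance, a Google Compute Engine VM or a local VirtualBox guest (the image name below is a placeholder):

    # Identical on EC2, GCE, Rackspace or VirtualBox
    docker pull registry.example.com/myapp:1.2.0
    docker run -d -p 80:8000 registry.example.com/myapp:1.2.0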

Environment Standardization and Version Control

As discussed above, Docker containers ensure consistency across multiple development and release cycles, standardizing your environment. On top of that, Docker images work much like Git repositories, allowing you to commit changes to them and version control them. Suppose you perform a component upgrade that breaks your whole environment. It is very easy to roll back to a previous version of your Docker image, and the whole process can be tested in a few minutes. Compared to VM backup and image creation processes, Docker is fast, allowing you to quickly create replicas and achieve redundancy. Additionally, launching a Docker image is as fast as starting a machine process.
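A sketch of that commit-and-roll-back workflow, using the real docker commit and docker run commands (container and image names are hypothetical):

    # Snapshot a running container's filesystem as a new image version
    docker commit myapp-container myapp:v2

    # If v2 breaks the environment, rolling back is a one-line change
    docker rm -f myapp-container
    docker run -d --name myapp-container myapp:v1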

Isolation

Docker ensures your applications and resources are isolated and segregated. A few months back, Gartner published a report stating Docker containers are as good as VM hypervisors when it comes to isolating resources, but there is still work to be done in terms of management and administration.

Consider a scenario where you are running multiple applications on your VM. These applications might be team collaboration software (e.g., Confluence), issue tracking software (e.g., JIRA), a centralized identity management system (e.g., Crowd) and so on. Since all of these applications run on different ports, you would have to route them through Apache or Nginx as a reverse proxy. So far, everything is in good shape, but as your environment evolves, you will also need to add a content management system (e.g., Alfresco) to your existing environment. Bear in mind that it requires a different version of Apache Tomcat, which will cause a problem: you would have to either move your existing applications to another version of Tomcat or run your content management system (Alfresco) on your currently deployed version.

Fortunately, with Docker, you don’t have to do this. Docker makes sure each container has its own resources that are isolated from other containers, so you can have various containers for separate applications running completely different stacks. Aside from this, cleanly removing an application from a server is normally quite difficult and may leave conflicting dependencies behind. Docker, however, ensures clean app removal: since each application runs in its own container, if you no longer need an application, you can simply delete its container. It won’t leave any temporary or configuration files behind on your host OS.
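Assuming suitable images exist for each stack (the official tomcat tags below are illustrative), the Tomcat conflict described above dissolves into two independent containers:

    # Each application gets its own Tomcat version in its own container
    docker run -d --name legacy-apps -p 8080:8080 tomcat:8
    docker run -d --name alfresco    -p 8081:8080 tomcat:9

    # Removing an application cleanly is just removing its container
    docker rm -f alfresco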

On top of these benefits, Docker also ensures that each application only uses the resources (CPU, memory and disk space) assigned to it. A single application cannot hog all of your available resources, which would otherwise lead to performance degradation or complete downtime for other applications.
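On recent Docker versions, those caps are set per container at launch time; a minimal sketch with a hypothetical image name:

    # Limit this container to one CPU and 512 MB of RAM so it cannot
    # starve its neighbours on the same host
    docker run -d --cpus="1.0" --memory="512m" --name api myapp:1.2.0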

Security

Docker is evolving at a fast pace, which Gartner even acknowledges, as mentioned above. From a security standpoint, Docker ensures that applications that are running on containers are completely segregated and isolated from each other, granting you complete control over traffic flow and management. No Docker container can look into processes running inside another container. From an architectural standpoint, each container gets its own set of resources ranging from processing to network stacks.

As a means of tightening security, Docker mounts sensitive host OS paths (e.g., /proc and /sys) as read-only inside containers and uses a copy-on-write filesystem to make sure containers can’t read each other’s data. It also limits the system calls a container can make to your host OS and works well with SELinux and AppArmor. Additionally, Docker images available on Docker Hub can be digitally signed to ensure authenticity. Since Docker containers are isolated and their resources are limited, even if one of your applications is hacked, applications running in other Docker containers won’t be affected.
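Some of these protections are defaults, and others can be tightened further per container. A sketch using standard docker run security flags (the image name is hypothetical):

    # Drop all Linux capabilities except the one the app needs, make the
    # container's root filesystem read-only, and forbid privilege escalation
    docker run -d \
      --cap-drop=ALL --cap-add=NET_BIND_SERVICE \
      --read-only \
      --security-opt no-new-privileges \
      myapp:1.2.0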

Docker’s Key Use Cases

Here are some of the key use cases in which Docker’s enabling technology provides a consistent environment at low overhead.

1. Simplifying Configuration

The primary use case Docker advocates is simplifying configuration. One of the big advantages of VMs is the ability to run any platform with its own config on top of your infrastructure.

Docker provides this same capability without the overhead of a virtual machine. It lets you put your environment and configuration into code and deploy it. The same Docker configuration can also be used in a variety of environments. This decouples infrastructure requirements from the application environment.

The freedom to run your applications across multiple IaaS/PaaS without any extra tweaks is the ultimate dream that Docker can help you achieve.
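One common way to put an environment and its configuration into code is a Compose file (assuming Docker Compose is installed; the services and values below are illustrative):

    # docker-compose.yml: the whole environment, described declaratively;
    # `docker-compose up -d` brings up the identical stack anywhere
    version: "3"
    services:
      web:
        image: myapp:1.2.0
        ports:
          - "80:8000"
        environment:
          - DB_HOST=db
      db:
        image: postgres:13
        environment:
          - POSTGRES_PASSWORD=example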

2. Code Pipeline Management

The previous use case makes a large impact in managing the code pipeline. As the code travels from the developer’s machine to production, there are many different environments it has to go through to get there. Each of these may have minor differences along the way.

Docker provides a consistent environment for the application from dev through production, easing the code development and deployment pipeline.

The immutable nature of Docker images, and the ease with which they can be spun up, help you achieve zero change in application runtime environments across dev through production.
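A common pattern for achieving that zero-change guarantee is to build once and tag each image immutably, for example with the Git commit that produced it (a sketch; the registry address is a placeholder):

    # Build once, tag with the exact commit, promote the same artifact
    GIT_SHA=$(git rev-parse --short HEAD)
    docker build -t registry.example.com/myapp:$GIT_SHA .
    docker push registry.example.com/myapp:$GIT_SHA

    # Dev, staging and production all pull this exact, immutable image
    docker pull registry.example.com/myapp:$GIT_SHA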

3. App Isolation

There may be many reasons for which you end up running multiple applications on the same machine. An example of this is the developer productivity flow described earlier. But there are other cases, too.

A couple of such cases to consider are server consolidation for decreasing cost or a gradual plan to separate a monolithic application into decoupled pieces.

Let’s say, for example, you need to run two REST API servers, both of which use Flask, but each depends on a slightly different version of Flask and related libraries. Running these API servers in separate containers provides an easy way out of what is commonly called “dependency hell.”
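A sketch of that escape from dependency hell: each API server pins its own Flask version in its own image (the directory layout, app code and version numbers are hypothetical):

    # api-one/Dockerfile
    FROM python:3.9-slim
    RUN pip install flask==1.1.4
    COPY app.py .
    CMD ["python", "app.py"]

    # api-two/Dockerfile: identical apart from the pinned version
    FROM python:3.9-slim
    RUN pip install flask==2.0.3
    COPY app.py .
    CMD ["python", "app.py"]

Both servers then run side by side on the same host without conflict, e.g. docker run -d -p 5000:5000 api-one and docker run -d -p 5001:5000 api-two.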

4. Server Consolidation

Just as VMs are used to consolidate multiple applications, the application isolation abilities of Docker allow you to consolidate multiple servers to save on cost. But because Docker avoids the memory footprint of multiple operating systems and can share unused memory across instances, it provides far denser server consolidation than you can get with VMs.

The new breed of highly customizable PaaS offerings, such as Heroku, Elastic Beanstalk and App Engine, all use this powerful feature of containers, which is now at your disposal with Docker.

Moreover, open source projects like Deis, Kubernetes, cAdvisor, Panamax, and others make deploying and monitoring large numbers of containers representing a multi-tier application architecture manageable.

5. Debugging Capabilities

Docker provides many tools that are not necessarily specific to containers but work well with them and provide extremely useful functionality. These include the ability to checkpoint containers and container versions, as well as to diff containers. This can be immensely useful when fixing an application.
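The checkpoint and diff operations map onto the real docker commit and docker diff commands; a sketch with hypothetical container names:

    # Checkpoint the broken container before touching anything
    docker commit web web:broken

    # List exactly which files have changed inside the running container
    docker diff web

    # Spin up a disposable debug copy of the checkpoint
    docker run -it --name web-debug web:broken /bin/bash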

Flux7.com was run inside a Docker container. Our web developer told us that a crash resulted from a code change he’d pushed from the UI to the functions.php file. Within a minute, I was able to create a dev environment enabling the web developer to debug in a sandbox. When he gave us the green light a short time later, we were able to switch back to the latest version of the website, thanks to Docker and Linux Containers.

While the problem was solvable using other strategies, Docker was an efficient way to solve it. And it’s one we have implemented in a number of customer deployments where front-end functionality is business-critical.

6. Multi-tenancy

Yet another interesting use case for Docker is multi-tenant applications, where it can help you avoid major application rewrites.
