The Docker application container platform has been touted as one of the hottest technologies out there today, and a scan of recent news supports that claim. Netflix, which has released a lot of its production code as open source, announced recently that it will package most of its open source projects in Docker format. Netflix is in good company — a recent survey by Datadog found that real-world Docker usage has quintupled over the past year.
Docker is popular because it makes it easy to move applications among various systems without modifying any code. It packages an app and all the resources it needs in a lightweight, portable bundle known as a container, a model well suited to microservice architectures. Traditional virtualization requires a hypervisor to abstract resource management from the hardware. Because containers isolate workloads using features of the host operating system rather than a hypervisor, Docker enables distributed applications to be built and run on a shared operating system.
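To make the packaging idea concrete, here is a minimal, illustrative Dockerfile; the Python app, file names and port are hypothetical placeholders, not anything from the original post:

```dockerfile
# Illustrative Dockerfile for a hypothetical app. The base image supplies
# libraries and tools, while the kernel comes from the host at run time.
FROM python:2.7

# Copy the app and its declared dependencies into the image so the
# bundle carries everything it needs to run on any Docker host.
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY app.py .

# Document the listening port and define the startup command.
EXPOSE 8000
CMD ["python", "app.py"]
```

Building this with `docker build -t myapp .` produces an image that runs unchanged on a laptop, a test server or a production host.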
In addition to making applications more portable, Docker containers enable greater server density and higher resource utilization. Unlike virtual machines (VMs), containers don't each require their own copy of the operating system, which can reduce virtualization and licensing costs. Sharing one operating system also simplifies management and makes automated deployment easier.
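As a rough sketch of what that density looks like on a single host (the image and container names are hypothetical), several isolated instances can run side by side without a guest OS apiece:

```sh
# Build the image once, then run several isolated instances on one host.
docker build -t myapp .
docker run -d --name myapp-1 -p 8001:8000 myapp
docker run -d --name myapp-2 -p 8002:8000 myapp
docker run -d --name myapp-3 -p 8003:8000 myapp

# Each container has its own filesystem, ports and processes, but all
# three share the host kernel -- no per-instance operating system copy.
docker ps
```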
While Docker is an open source project, the recent launch of the Open Container Project (OCP), since renamed the Open Container Initiative, is expected to accelerate development of open industry standards for application container formats. Standardization would speed adoption of application container technology and enable vendors to develop supporting tools more quickly. The effort is overseen by the Linux Foundation and sponsored by industry leaders including Google, HP, IBM, Microsoft and VMware.
Because Docker has become so popular so quickly, some have suggested that organizations should abandon VMs in favor of application containers in order to maximize the cost benefits. However, organizations should avoid getting caught up in the hype and recognize that both VMs and containers will likely be needed for the foreseeable future.
It’s also important to be aware of some of the challenges associated with Docker containers. Gartner analysts have warned that Docker security is insufficient for today’s threat environment. And while Docker adoption is still in its early stages, some organizations have already found it difficult to scale their container architectures.
In addition, Docker has limited operational tools and support. Programmers have had to write custom scripts to deploy and manage containers, a time-consuming process that in some ways defeats the speed and agility containers are meant to provide. Just last month, however, Docker announced the purchase of Tutum, a popular cloud service that enables the provisioning, deployment and management of Docker containers. The purchase is expected to relieve some of Docker’s operational challenges.
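To illustrate the kind of hand-rolled glue that paragraph describes, a custom deploy script might have looked like the sketch below; the registry URL, image and host names are hypothetical, and real scripts also had to handle health checks and rollback:

```sh
#!/bin/sh
# Hypothetical deploy script of the sort teams wrote before managed
# tooling: pull the new image on each host, remove the old container,
# and start a new one.
IMAGE="registry.example.com/myapp:latest"   # placeholder registry/image
HOSTS="web-1 web-2 web-3"                   # placeholder host names

for HOST in $HOSTS; do
  echo "Deploying $IMAGE to $HOST"
  ssh "$HOST" "docker pull $IMAGE || exit 1
               docker rm -f myapp 2>/dev/null || true
               docker run -d --name myapp -p 80:8000 $IMAGE"
done
```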
So what can Docker containers bring to your data center? There are many benefits to be gained if the pitfalls can be avoided. Technologent has deep expertise in virtualization as well as the entire data center stack. Let us help you determine how Docker containers might best fit into your IT environment and strategy.