The Docker container virtualization technology has proved to be such a hit with its users that Amazon Web Services has created a new management tool for handling large-scale Docker deployments, though observers worry it could lock customers more tightly into the Amazon ecosystem.
Yesterday, Amazon Web Services debuted the EC2 Container Service, now available as a preview, which provides a way for AWS users to easily deploy and manage up to hundreds of thousands of Docker containers.
The EC2 Container Service is "a highly-scalable, high-performance, container management service," said Amazon Chief Technology Officer Werner Vogels, who introduced it at the company's annual Re:Invent user conference this week in Las Vegas.
A Docker container can be used to package applications so they can be easily moved across different servers. Introduced last year, the technology has already proved to be an immense success - the software has thus far been downloaded over 50 million times, according to the company Docker, which oversees the open source software of the same name.
"There are a lot of people who like to use the Docker container model, and it is becoming more popular to run Docker applications on AWS," said Ariel Kelman, AWS head of worldwide marketing. "So it was pretty natural for us to give them some better tools for automation."
Until now, customers had to write their own scripts to coordinate Docker-based operations on AWS.
The EC2 Container Service is provided by Amazon free of charge. It offers a set of APIs (application programming interfaces) for deploying a fleet of containers, as well as for coordinating their operations with other AWS services, such as CloudWatch monitoring, Elastic Load Balancing, and Identity and Access Management.
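To give a flavor of those APIs, the sketch below builds the kind of JSON payload a task-definition registration call might carry, pairing a Docker image with the resources it should reserve. The "web-stack" family, the "web" container name, and the resource figures are all invented for illustration, not taken from Amazon's documentation:

```python
# Illustrative sketch of a container task definition, assembled locally.
# The names and numbers here are hypothetical examples.
import json

task_definition = {
    "family": "web-stack",             # logical name for this group of containers
    "containerDefinitions": [
        {
            "name": "web",             # hypothetical container name
            "image": "nginx:latest",   # any public Docker image
            "cpu": 256,                # CPU units to reserve
            "memory": 512,             # MiB of memory to reserve
            "portMappings": [{"containerPort": 80, "hostPort": 80}],
        }
    ],
}

payload = json.dumps(task_definition, indent=2)
print(payload)
```

In practice a payload like this would be sent to the service's API endpoint; here it is only serialized and printed so the shape is visible.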
The service allows administrators to start and terminate large clusters of containers. It can automatically assign the most appropriate Amazon EC2 (Elastic Compute Cloud) virtual machine (VM) to run each container on. The software can ensure the containers run in different availability zones, for maximum reliability. It can schedule deployments of containers, or work with other Docker schedulers, such as Apache Mesos.
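The spreading-across-zones idea can be sketched with a toy round-robin placer. This is not how the service's scheduler is implemented, just a minimal illustration of the concept; the zone names are examples:

```python
# Toy round-robin placement: spread containers across availability zones
# for reliability, the way the paragraph above describes.
from itertools import cycle

def place(containers, zones):
    """Assign each container to a zone in round-robin order."""
    zone_cycle = cycle(zones)
    return {c: next(zone_cycle) for c in containers}

assignments = place(["api-1", "api-2", "api-3"],
                    ["us-east-1a", "us-east-1b"])
print(assignments)
# api-1 and api-3 land in us-east-1a, api-2 in us-east-1b
```

A real scheduler would also weigh current load and instance capacity, but even this naive policy guarantees no single zone holds every copy of a service.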
The service could be particularly useful for managing distributed applications, packaged in multiple Docker containers, that run on multiple EC2 VMs.
Different parts of a distributed application, each running in its own container, may have different requirements: one container may need a VM with more memory, while another may require more computational muscle. The EC2 Container Service can assign each container to the most appropriate EC2 VM. As a workload increases, the service can be scripted to add more VMs to the job, or reduce the number of VMs should the workload lighten.
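The matching described above can be sketched as a greedy best-fit search: for each container, pick the smallest instance type that satisfies both its CPU and memory needs. The selection logic is a simplified stand-in for whatever the service actually does, though the vCPU and memory figures for these EC2 instance types match Amazon's published specifications of the time:

```python
# Hedged sketch of container-to-VM matching: choose the smallest
# instance type that meets both CPU and memory requirements.

INSTANCE_TYPES = [             # (name, vCPUs, memory in GiB), smallest first
    ("t2.small", 1, 2),
    ("m3.medium", 1, 3.75),
    ("c3.large", 2, 3.75),
    ("r3.large", 2, 15.25),
]

def pick_instance(vcpus_needed, mem_needed):
    """Return the first (smallest) type meeting both requirements."""
    for name, vcpus, mem in INSTANCE_TYPES:
        if vcpus >= vcpus_needed and mem >= mem_needed:
            return name
    raise ValueError("no instance type fits")

print(pick_instance(1, 1))    # a light container fits the smallest type
print(pick_instance(2, 8))    # a memory-hungry one needs an r3.large
```

Scaling out under load then amounts to repeating this choice for each new container and launching the instances the answers call for.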
"We're giving the developers and operations professionals fine grained control over the environment but in an automated tool so they don't have to manually manage everything, instance by instance," Kelman said.
Users should be wary of growing too reliant on all of these advanced features, lest they tie their workloads too closely to AWS, warned Bob Quillin, CEO of StackEngine, in an email. StackEngine offers its own software for managing Docker containers.
With the EC2 Container Service, AWS is one of a growing number of cloud providers offering tools for managing containers, joining competitors such as Google, Digital Ocean and Rackspace. Offerings from such companies can be "lightly-veiled attempts to pull developers into their cloud services," Quillin wrote. These tools, once incorporated into an application's workflow, can make it difficult to move that workflow to another cloud provider, or to an on-premises operation, should the need arise.
The EC2 Container Service is "very Amazon specific," said Alex Polvi, CEO of CoreOS, which offers a Linux distribution optimized for running Docker containers. The company plans to support the EC2 Container Service so "CoreOS runs out of the box," with the EC2 Container Service, Polvi said.
CoreOS itself supports another technology for managing containers, the open source Kubernetes project, which was started by Google to manage Docker deployments on its own cloud.
"One of the nice benefits of an open source project like Kubernetes is that you can run it wherever you want. You can make it work on EC2. You can make it work on Google. Or you can make it work on bare metal servers back at home," Polvi said.