Docker has appeared seemingly out of nowhere, not quite fully formed but interesting enough to gather around it a massive ecosystem of supporting technology companies, millions of fans and huge dollops of venture capital. For those who have missed the buzz, it makes containerisation of software a doddle, with the result that applications can be deployed in seconds on any supporting platform.

Traditional virtualised applications each require a separate copy of the operating system; applications in Docker containers don't. Each container holds an application and its dependencies, and many containers can sit on top of the Docker Engine which, in turn, sits on top of the operating system (Linux for now). Each container runs one primary application, though there's nothing to stop that being a supervisor for multiple processes. The primary benefits are savings in compute resources and in development and operational time, because each container runs unchanged on multiple computing platforms, right down to bare metal.

Everyone who's anyone wants to join the party. At a mere 15 months old, Docker ran a well-supported two-day DockerCon conference in San Francisco in June this year. Platinum sponsors were IBM, Rackspace and Red Hat. To give an idea of the acceleration of interest in Docker, the installation package had been downloaded 21 million times by September, up from three million in June.

Competitors and complementary tools vendors alike queue to be touched by the Docker magic. Microsoft's Azure cloud system already supports Docker containers, but the two companies have also entered into a collaborative development agreement to implement the Docker Engine in Windows Server in 2015. The result will be containers for Windows apps, more or less doubling Docker's reach.

So what is Docker exactly?

For a start, the Docker project is open source, which is one of the reasons it has garnered so much support. Many developers are happy to share their Docker application images in a public registry (called Docker Hub), from which anyone can pull them and re-use them, or combine them in containers with their own images. All manner of images are tested and ready to use, including MySQL, Ubuntu and WordPress, for example, and Docker's official "standard library" of images is stored there too. However, many commercial organisations are unwilling to share what they consider to be valuable IP, and that's okay too: Docker has open-sourced its registry code so that they can keep such images in their own private registries.

If you're of a (slightly) technical bent, you can create a Docker image quickly by clicking the 'Try it!' tab on Docker's website. It will give you an insight into what is possible with Docker, albeit on a very small scale. You'll download an image from the Hub, run it, extend it with another image, and then publish the result. This reveals the potential speed of application assembly, letting you mix your own code with prewritten code simply and quickly. You can save the build instructions as an editable Dockerfile, which serves as a macro for speedy future builds. Once your container image is ready, it can be deployed wherever you need it: in data centre virtual machines, on your own servers or laptops, and on various types of cloud platform.
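To give a flavour of what such build instructions look like, here is a minimal Dockerfile sketch. The base image tag, the `app.py` file and the package installed are all illustrative, not taken from Docker's tutorial:

```dockerfile
# Start from an official base image published on Docker Hub
FROM ubuntu:14.04

# Layer your own dependencies on top of the base image
RUN apt-get update && apt-get install -y python

# Add your own code to the image
COPY app.py /opt/app/app.py

# The one primary process this container will run
CMD ["python", "/opt/app/app.py"]
```

Running `docker build -t myapp .` assembles the image from these instructions, and `docker run myapp` starts a container from it; the same image then runs unchanged on a laptop, a data centre VM or a cloud host.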

Docker containers are fast-loading and take up a fraction of the space of their traditional VM equivalents. They can be activated and de-activated at the drop of a hat – in seconds rather than the minutes a virtual machine might take. This all adds to the value of using Docker: if containers can be de-activated that easily, space is quickly released for more applications to run. In effect, Docker packs more applications into any given time slot as well as into any given space. Developers can roll out test containers, run them safely and dispose of them immediately. Upgrades can be implemented by running new container images in parallel with their predecessors and switching off the old ones when it's safe to do so.

If you are a Linux shop with distributed micro-service type applications, Docker might look very attractive. The truth is, as we hinted at the start, that Docker still needs to add more to its native capabilities, but any shortcomings are being addressed either internally or through the massive and still-growing developer ecosystem.


In a fast-moving world, it's probably unwise to dwell unduly on Docker's shortcomings because they're likely to be resolved fairly quickly. In any event, many things can be achieved using a combination of third party tools and services. However, Docker still wants to make life easier for developers by providing all of the essential features itself.

A good way to see the latest concerns is to look at the roadmap discussions of the Advisory Board meetings. The resulting 'Docker Project Statement of Direction' is published online, so anyone can take a look at what's on the Docker community's collective mind. The members of this board are:

  • the internal maintainers of the project, including company founder and chief maintainer Solomon Hykes;
  • the four top non-trivial contributors to the repository;
  • four companies that met strict collaboration criteria – Google, IBM, Rackspace and Red Hat; and
  • four users, elected by votes from significant contributors from a shortlist – Atlassian, eBay, Spotify and Tutum.

Here are the main considerations of the October Advisory Board meeting (paraphrased to provide a sense rather than exhaust you with detail):

  • Orchestration of multi-container applications and their interactions;
  • Networking beyond containers residing on the same host;
  • Dynamic migration of stateful containers between hosts, for greater resilience, load-balancing and so on;
  • Microsoft Windows Server developments (mentioned earlier);
  • Provenance – digital verification of third party images;
  • A plug-in API that would help address many of the issues raised; and
  • Expanded architecture support, including ARM, Joyent SmartOS and Microsoft Windows.


Ecosystem players

Apart from Microsoft, plenty of ecosystem players are bringing something to the Docker party.

A picture will serve better than any number of words to illustrate the partner ecosystem that surrounds Docker. The one below was published by Docker in June:

You will see many familiar names there (even a few duplicates, but in different contexts). However, further big names have either signed up or deepened their commitment since, including Amazon Web Services' EC2 Container Service, Dell's Cloud Marketplace beta program and VMware. Other big names, such as IBM, have also stated their support of, and commitment to, Docker technologies.

Docker container partner ecosystem

Google is probably worth a special mention: it launches something like two billion containers a week. Internally, it uses its own container orchestration and management system, called Omega. For the Docker community, however, the company has open-sourced Kubernetes, which it describes as "a lean yet powerful container manager that deploys containers into a fleet of machines, provides health management and replication capabilities, and makes it easy for containers to connect to one another and the outside world". More recently it announced the alpha release of Google Container Engine, which takes care of deploying Docker containers to logical computing clusters.

Docker has acquired two companies – a British company called Orchard and an American one called Koality – to strengthen the core product and bring more development talent in-house. Apart from its talent, which is now in charge of the 'developer experience' (DX), Orchard's main contribution was Fig, a tool for creating and distributing isolated development environments using Docker. Koality's technology and talent, meanwhile, have been folded into Docker Hub Enterprise which, Docker says, "will allow enterprise IT teams working on distributed applications to collaborate on their modular components, while maintaining them in a private software repository behind their firewall".
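To give a flavour of what Fig brings, a fig.yml file describes a multi-container development environment in a few lines. The sketch below, with illustrative service names and ports, defines a hypothetical web application linked to a database:

```yaml
web:
  build: .          # build the application image from the local Dockerfile
  ports:
    - "8000:8000"   # expose the application on the host
  links:
    - db            # let the web container reach the database by name
db:
  image: postgres   # use the official PostgreSQL image from Docker Hub
```

A single `fig up` command then starts both containers together, giving every developer an identical, disposable environment.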

A glance through the programme for the upcoming DockerCon Europe shows that none of the three Platinum sponsors appears on the June partner ecosystem chart, yet they clearly have an interest in seeing the event succeed. They are Atlassian (software development and collaboration tools); Intel, which needs no introduction; and ING, the largest online bank in Europe and the Netherlands' biggest employer of IT professionals. Part of the attraction, perhaps, is the chance to get in front of the event's high-calibre and influential attendees.

Gold sponsors include IBM, AWS, Microsoft and VMware, all organisations that must have thought long and hard before choosing whether to do battle with Docker or join forces. The trouble is, it's very hard for an established commercial organisation to fight a social/open source movement and it would be just about impossible for any one of them to have created a community like Docker has done. They all know that Docker has created something special and perhaps they, and other prominent members of the ecosystem, are hoping to position themselves favourably regardless of whether Docker succeeds or stumbles in the long term.

What's in it for you?

Docker and its ecosystem are heralding a change in the way that applications are developed and deployed to the cloud. It is well-supported by all the major players and, before long, it will spread beyond its natural home, the Linux world. It brings great potential advantages to the CIO through more efficient use of compute resources, simpler and more reliable deployments, more agile application development and testing and, in all probability, a reduction in the number of meetings that centre on technology and security issues. Once a system has been established for containerising applications, they can be deployed to any Docker-supported environment with the peace of mind that comes with knowing they will just work.