Docker has been hailed by DevOps teams as a cycle-shrinking, cost-reducing panacea, though ROI is far from proven for this nascent technology. So do companies have more than hype or guesswork to go on?
Whilst businesses continue to embrace containerisation, uncertainties and misconceptions about Docker linger – not least the assumption that it always reduces costs and saves time. Here are the pros and cons of Docker, and what companies need to consider before casting their vote of confidence and investing in this space.
What should companies consider before adopting Docker?
Docker is still in a stage of relative infancy and there are few proven use cases available today. With that in mind, enterprises should ensure the business case for adoption is clear – will Docker containers deliver cost efficiencies or improved processes, for example? Next, they should think about the framework they have in place to effectively define, orchestrate and manage a container environment. That insight is vital to understanding which containers are running, which containers talk to each other, and the processes that are needed for effective management.
What are the benefits and challenges of Docker?
Docker brings many advantages. There is no ‘full-fat’ VM requirement, removing the hypervisor and guest OS resource overheads. Because containers all run on the same operating system, they make more efficient use of CPU, RAM and disk. Docker is open source: it runs on all major Linux distributions and on Microsoft operating systems, and enjoys broad infrastructure support. It is also very scalable – Docker containers can be spun up or down in a matter of seconds, allowing peak customer demand to be satisfied.
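As a sketch of that fast spin-up and tear-down, the standard Docker CLI can start and remove a web container in seconds (image and container names here are illustrative, not from the article):

```shell
# Spin up a lightweight web container in seconds - no guest OS to boot,
# only the shared host kernel plus the application image
docker run -d --name web1 -p 8080:80 nginx:alpine

# Tear it down just as quickly once it is no longer needed
docker stop web1 && docker rm web1
```

Compare this with provisioning a full VM, where the hypervisor must allocate virtual hardware and a complete guest OS must boot before the application can start.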
Not all applications require the benefits of scalability and fast deployment, of course. Take an email solution like Exchange, for example. Exchange rarely needs rapid scalability or fast deployment; it is a solution that should be correctly sized and designed for longevity, with a gradual increase in resource requirements anticipated in the design. It is not an application whose demand fluctuates quickly. But if you’re looking at the web front-end of an e-commerce solution, which has to scale rapidly to meet a surge in demand, Docker looks very promising.
There are also some downsides. First up is the potential lack of true isolation. VMs currently provide strong isolation, as each VM’s resources are virtualised through the hypervisor. With Docker there is less isolation: containers share an OS kernel and components, making it much easier for issues such as malware or crashes to propagate from one container to another.
What types of applications are suitable for Docker containers?
While a Docker container is not that dissimilar from other container technology, it does have the advantage of being able to group key application components into a single container, meaning that there are three distinct types of applications that will thrive in this environment:
1. Applications that need to run on more than one cloud
2. Applications that use microservices
3. Applications that need to autoscale to deal with bursts in demand
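The third case above can be sketched with Docker Compose. Assuming a hypothetical docker-compose.yml that defines a stateless front-end service named `web`, replicas can be added or removed on demand:

```shell
# Scale a stateless 'web' service to five replicas to absorb a burst
# in demand (assumes a docker-compose.yml defining the service)
docker compose up -d --scale web=5

# Scale back down to a single replica once the surge has passed
docker compose up -d --scale web=1
```

The service must be stateless (or keep its state in an external store) for this to work cleanly, which is why the web front-end of an e-commerce site fits the pattern while something like Exchange does not.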
Will Docker replace virtualisation?
The answer to this is probably no. Just as virtualisation has not replaced the need to buy physical servers, Docker will not replace the need for virtualisation. It is a technology that will be used more and more for the development of applications, and so it is certainly a technology that Ops teams will need to study, understand and be able to deploy within their environments.
What is the role of DevOps in Docker implementations?
DevOps is a way of thinking rather than a set of processes implemented in a certain way – with the end aim of improving the quality and speed with which innovation is delivered.
Docker’s ability to streamline the packaging and delivery of applications, while improving collaboration between development and operations, fits perfectly with the DevOps mindset.
However, while development teams are excited about Docker and the new options it provides in the development and delivery of apps and software, operations teams still need clarity on how to look after the technology and are often unaware of the issues it may pose. DevOps needs to bring these teams together to ensure Docker delivers what it sets out to for an organisation.
How can IT departments prove Docker ROI for the wider business?
Docker is still a relatively new technology, so organisations do not yet have the experience or knowledge on how to introduce Docker to achieve business value.
The key drivers of business value are likely to be increased delivery speed and reduced change-based outages. These come about through Docker’s ability to keep the delivery component consistent from development right through to production, thereby minimising the chances of a mismatch somewhere along the line. The challenge will be to ensure that this advantage is not wiped out by increased operational costs arising from the operational management challenges.
What is the risk of over-provisioning container estates and how can this be avoided?
Containers deploy even faster than VMs, which still require some configuration after they are switched on. With that speed comes a responsibility: when environments are scaled up rapidly, they must also be scaled back down, and retired containers removed, so that performance is not affected by over-utilisation of the underlying hardware. For an organisation to have a clear view of its IT estate and to understand when this needs to happen, it should consider putting a capacity management solution in place.
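As a sketch of that "scale back down" housekeeping, the Docker CLI already offers commands for finding and removing retired containers (shown as illustrative commands, to be adapted to local policy):

```shell
# List stopped containers that still occupy disk space on the host
docker ps -a --filter "status=exited"

# Remove all stopped containers in one step
docker container prune -f

# Optionally reclaim unused images, networks and build cache as well
docker system prune -f
```

CLI housekeeping like this covers a single host; a capacity management solution is what provides the estate-wide view of when scaling down should happen.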
What is the most effective way for monitoring and managing the Docker solution?
As with the introduction of VMware in 2005, the Docker ecosystem is currently lacking in monitoring tools, making important tasks such as application performance monitoring very difficult to achieve. There is also a lack of capacity management tooling, making it hard to plan and manage these environments efficiently. This lack of visibility can, and will, lead to performance issues as Docker starts to be used more heavily within an organisation. However, the good news is that the Docker performance monitoring situation is improving, and at Sumerian, we are planning to be one of the first to offer full capacity management coverage.
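The built-in tooling illustrates the gap described above: Docker ships a point-in-time resource view, but no historical record on which to base capacity planning.

```shell
# Point-in-time CPU, memory, network and disk I/O per container;
# useful for a quick health check, but it keeps no history
docker stats --no-stream
```

Anything beyond this snapshot – trending, forecasting, right-sizing – currently requires third-party monitoring or capacity management tooling.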
Peter Duffy, CTO at Sumerian