
Moving to containers and serverless computing – are you keeping your application data secure?

(Image credit: Shutterstock/Carlos Amarillo)

Today’s leading enterprises are constantly striving for better performance. Their IT teams are implementing highly scalable and always-on services, and more of these teams are turning to new software architectures and tools like containers and serverless computing to achieve these goals. In short, it’s a great time to be a developer.

However, while this is leading to more agile and reliable software, the ephemeral nature of these modern applications means traditional monitoring, troubleshooting and security management solutions can often fall short. In turn, this can potentially leave sensitive data vulnerable.

The rise and rise of containers

First, it’s important to look at why containers are being adopted. The push for better business agility has given rise to new computing approaches and cloud technologies in recent years. Rather than being built as one large block, modern applications are increasingly based on microservices: multiple small components linked through application programming interfaces, or APIs.

Microservices can scale more rapidly than traditional applications because resources can be provided to each component individually, as needed. Containers support this design approach very well. In computing terms, a container is a standard unit of software that packages up code and all its dependencies into a single portable unit, so the application runs reliably regardless of the environment used to host the container.

Container images are stored in repositories called registries. These images can be public, created for companies or individuals to pull and use as they wish, or private, created for internal use only. At runtime, each image is instantiated as a running container, which continues until it is stopped.

Containers are a natural fit for applications built on a microservices architecture of multiple individual components. Because each container can be created and run on demand, extra compute power is available simply by adding container instances to the application cluster to meet demand. When the application service is no longer needed, it can be scaled back or stopped, freeing up resources for other applications.
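As an illustrative sketch only - no real orchestrator is this simple, and the capacity figures are invented - the scale-out and scale-back decision described above can be reduced to a small function:

```python
import math

def desired_replicas(current_load: float, capacity_per_container: float,
                     min_replicas: int = 1, max_replicas: int = 20) -> int:
    """Return how many container replicas are needed for the current load.

    Mirrors the idea above: add instances to meet demand, release them
    (down to a floor) when demand falls. All numbers are illustrative.
    """
    needed = math.ceil(current_load / capacity_per_container)
    return max(min_replicas, min(max_replicas, needed))

# Demand rises: more replicas are added to the cluster.
print(desired_replicas(current_load=950, capacity_per_container=100))  # 10
# Demand falls: the service scales back, freeing resources.
print(desired_replicas(current_load=40, capacity_per_container=100))   # 1
```

In practice this decision is delegated to an orchestrator, which also enforces the minimum and maximum bounds so a traffic spike cannot exhaust the cluster.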

Container deployments bring numerous advantages such as:

  • Scalability - Containers can be added or removed very quickly from an environment
  • Portability - Container-hosted environments remain consistent, regardless of the operating system or cloud service used to host them
  • Speed - Containers can start and stop much faster than virtual machines can
  • Agility - Containers can be used to break up large monolithic environments into modular microservices, making ongoing development and testing much faster

Containers are proving popular today, with Docker the dominant container technology. In fact, a recent study we conducted found that 28 per cent of enterprises using Amazon Web Services (AWS) cloud infrastructure now use Docker as a critical foundation layer for their applications, up from 24 per cent in 2017 and 18 per cent in 2016. This rise in popularity is largely attributed to Docker’s open-source nature and fortunate launch timing, which coincided with both the DevOps revolution and the decline of virtualisation as the ‘go-to’ technology amongst many enterprises.

These container-based environments do require close management if their full potential is to be realised, which is why there has also been significant growth of orchestration technologies such as Kubernetes and Amazon’s Elastic Container Service (ECS) in recent years.

Orchestration technologies automate the deployment and scaling of containers to ensure the reliability of applications and workloads running within them. Our study found one in three AWS customers is now using orchestration for container management, with a growing number also considering it as a way to deploy and manage multi-cloud applications.

Serverless computing set to be the ‘next big thing’

If containers are the star technology of today, serverless computing looks set to be the star of tomorrow. Serverless computing allows users to build and run applications without thinking about infrastructure at all. No provisioning, scaling or management is required, meaning developers can simply focus on what their software does rather than how to host or run those applications.
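To make that concrete: in AWS Lambda’s Python runtime, the developer supplies only a handler function that the platform invokes; provisioning and scaling happen underneath it. The event field used below is purely illustrative - real event shapes depend on what triggers the function:

```python
import json

def lambda_handler(event, context):
    """Entry point invoked by the serverless platform (AWS Lambda style).

    The developer writes only this function; the provider supplies and
    scales the compute that runs it.
    """
    name = event.get("name", "world")  # illustrative event field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Locally, an invocation can be simulated with no infrastructure at all:
print(lambda_handler({"name": "developer"}, None))
```

The same function deployed to the platform would be triggered by real events (an HTTP request, a queue message, a file upload) and billed only for the milliseconds it runs.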

Despite still being in its relative infancy, adoption of serverless technology such as AWS Lambda is rising dramatically. Our study found Lambda production usage in AWS currently stands at 29 per cent, up from 24 per cent in 2017 and just 12 per cent in 2016. This impressive growth was driven largely by early use cases in DevOps deployment and automation processes.

In practice, serverless deployments help developers deliver software faster without needing to know how the underlying cloud services scale. This emphasis on results can help development teams meet business goals faster; however, these deployments do need monitoring for cost and consumption management, which is essential if you want to avoid higher-than-expected bills from your cloud provider.

Cloud-based applications require a new approach to security and monitoring

Containers and serverless can provide developers with faster and more efficient ways to get their software up and running. However, because these systems are new and different from traditional IT, they need different methods for monitoring and management. Tracking application performance has to take into account how individual containers are running, how each application component is performing, and how all the underlying components come together to provide the service the business needs.

With containers being created, run and removed in response to demand, getting accurate performance data means bringing together logs, infrastructure metrics and application metrics. This data has to be sorted, analysed and correlated so that developers can see issues and performance in context, which is particularly important for meeting customers’ operational and security expectations.
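As a minimal sketch of that correlation step - the record format and container names are invented for illustration, and a real deployment would ship this data to a log-analytics service rather than process it in-line:

```python
from collections import defaultdict

# Hypothetical structured records emitted by short-lived containers.
records = [
    {"container": "api-7f2", "type": "metric", "latency_ms": 120},
    {"container": "api-9c1", "type": "log", "level": "ERROR", "msg": "timeout"},
    {"container": "api-7f2", "type": "metric", "latency_ms": 340},
    {"container": "api-9c1", "type": "metric", "latency_ms": 95},
]

def summarise(records):
    """Group logs and metrics by container so issues appear in context."""
    grouped = defaultdict(lambda: {"errors": 0, "latencies": []})
    for r in records:
        entry = grouped[r["container"]]
        if r["type"] == "log" and r.get("level") == "ERROR":
            entry["errors"] += 1
        elif r["type"] == "metric":
            entry["latencies"].append(r["latency_ms"])
    return {c: {"errors": v["errors"],
                "avg_latency_ms": sum(v["latencies"]) / len(v["latencies"])}
            for c, v in grouped.items() if v["latencies"]}

print(summarise(records))
```

Even this toy version shows the point of correlation: the error count and latency for each container sit side by side, so a slow or failing instance stands out immediately.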

Security for container-based applications and serverless deployments involves getting deeper, real-time insight into what is taking place within an application that may be made up of tens or hundreds of individual instances. With so many moving parts involved, good data is essential to help security teams understand what is going on. Containers were not originally architected with security in mind, so they present distinctive risks such as privilege escalation, where a threat in one container image can quickly compromise data used by other containers.

Similarly, vulnerabilities can be introduced through insecure or unvalidated container images. Traditional security tools struggle to mitigate these risks, whereas the fast, real-time visibility that cloud-based security provides allows much quicker identification and resolution of such issues. Furthermore, the day-one integration of security features that cloud-based solutions offer prevents many such breaches from occurring in the first place.

Embedding new monitoring and measurement tools into your containers, or into your overall serverless infrastructure, is essential to get this information on how the application is performing. Without these tools, it is hard to gain insight into what is taking place and how containers are being created, used and closed when they are no longer needed.

By using your logging and metric data more efficiently, you can get better insight into how your services are performing. This data is also essential for maintaining compliance, and it is necessary for threat investigation: with every application container creating its own log data, sorting and searching through the huge volume of security logs and metrics to investigate incidents can be difficult if you are not prepared for it.
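A toy example of that kind of search, using invented log lines and a hard-coded threat indicator - a real investigation would run such queries in a log-analytics platform over far larger volumes:

```python
import re

# Hypothetical log lines collected from many application containers.
logs = [
    "2024-05-01T10:00:01 api-7f2 INFO request served in 120ms",
    "2024-05-01T10:00:02 auth-1b3 WARN repeated login failure for user admin",
    "2024-05-01T10:00:03 auth-1b3 WARN repeated login failure for user admin",
    "2024-05-01T10:00:04 api-9c1 INFO request served in 95ms",
]

def find_suspicious(logs, pattern=r"login failure"):
    """Return (container, line) pairs whose message matches a threat indicator."""
    hits = []
    for line in logs:
        if re.search(pattern, line):
            _, container, _ = line.split(" ", 2)  # timestamp, container, message
            hits.append((container, line))
    return hits

hits = find_suspicious(logs)
print(f"{len(hits)} matching events from {len({c for c, _ in hits})} container(s)")
```

The value of preparation is visible even here: because each line carries a container identifier, the matches can be traced back to a single instance rather than a sea of undifferentiated output.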

As the popularity of cloud-based applications continues to grow, adoption of cloud-based security approaches needs to follow suit. Traditional security solutions simply can’t cope with the new and unique demands that cloud-based applications put on them. As you decide to move more applications over to containers, tracking the security and performance of these applications relies on your ability to get good and accurate data.

Colin Fernandes, EMEA Product Marketing Director, Sumo Logic

Colin Fernandes
Colin Fernandes is director of product marketing EMEA at Sumo Logic. He leads the company’s education and marketing campaigns in Europe, helping companies understand the challenges and opportunities around modern application design and implementation at scale. Prior to Sumo Logic, Colin worked at VMware where he was responsible for the company’s operations around the telecoms and cloud management sectors.