
Security in the Modern Data Center

27 Feb 2018 8:55am, by Nitzan Niv
Nitzan Niv is a 20-year veteran of the software industry. He is currently the system architect at Alcide, as well as leading its security research. Previously he worked for eight years at Imperva creating the internal web-application security research platform and contributing to various Web Application Firewall research projects and publications. He holds an M.Sc. in Computer Science, M.E. in Systems Engineering and B.Sc. in Computer Engineering from The Technion – Israel Institute of Technology.

In recent years there have been drastic changes in the way enterprise data centers are designed and implemented. Driven by business demands, data centers are adapting to prevailing software architectures and DevOps methodologies, and are being built with new technologies. The use of public and private clouds is on the rise, and virtualization technologies now play a major role in every organization’s infrastructure. These trends promise benefits such as infrastructure scale and elasticity, a better match between applications’ software structure (distributed microservices) and deployment (containers and virtual machines), operational cost savings, and fast development and release cycles.

At the same time, securing the data center has never been more important. Highly publicized data breaches and other successful cyberattacks over the last year impacted thousands of organizations, affected hundreds of millions of people and cost billions of dollars. As the data center becomes more complex and dynamic, growing in scale to match business requirements and relying heavily on diverse and relatively new cloud and virtualization technologies, the task of securing it will only become more difficult.

Virtual Environments and Application Security

The National Institute of Standards and Technology (NIST), a division of the U.S. Department of Commerce, is well known in the security community for standards and recommendations that guide many organizations toward a secure culture, policies and technological infrastructure. Its recently published guidance, the Application Container Security Guide, analyzes the unique risks posed by containerized applications and advises organizations on how to secure them. The first recommendation, “Tailor the organization’s operational culture and technical processes to support the new way of developing, running, and supporting applications made possible by containers,” sets the tone for the analysis, implying that modern data centers require a major shift in enterprise strategy, and in the means of securing them, in order to keep pace with the new methodologies of developing and running applications.

The document goes on to emphasize that securing the data center requires tools that were designed from the ground up for this purpose. The authors explain that existing security tools are simply not up to the task of securing the virtualization-based infrastructure, as they were designed before such an environment was envisioned.

The NIST document identifies several areas in which the use of containers introduces new elements into the data center, each with its own security risks:

  1. Images: The static archives that include all the components used to run an application.
  2. Registry: The service that allows developers to store images as they are created, and to tag and catalog images for identification, version control, discovery and reuse.
  3. Orchestrator: The service that enables DevOps personas or automation tools working on their behalf to pull images from registries, deploy those images into containers and manage the running containers. This deployment process is what results in a usable version of the app, running and ready to respond to requests.
  4. Container Runtime: The virtualization layer that isolates different application components from each other and from the host’s operating system, while enabling them to utilize the local resources like CPU, memory, file system and network interfaces.
  5. Host Operating System: The operating system of the “actual” computer on which the container runtime is executing.
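To make the relationships among these elements concrete, the lifecycle they describe (build an image, push it to a registry, let an orchestrator pull it and schedule it onto a host's container runtime) can be sketched in a few lines of Python. Everything below is a hypothetical toy model for illustration only, not the interface of any real container platform.

```python
from dataclasses import dataclass, field

# Hypothetical model of the five elements NIST identifies.
# Names and structure are invented for illustration.

@dataclass
class Image:
    name: str
    tag: str

@dataclass
class Registry:
    images: dict = field(default_factory=dict)

    def push(self, image: Image) -> None:
        # Catalog the image by name:tag for identification,
        # version control, discovery and reuse.
        self.images[f"{image.name}:{image.tag}"] = image

    def pull(self, ref: str) -> Image:
        return self.images[ref]

@dataclass
class Host:
    os: str                      # the "actual" host operating system
    running: list = field(default_factory=list)

    def run(self, image: Image) -> str:
        # Stand-in for the container runtime: isolates the component
        # while sharing the host's CPU, memory, file system and network.
        container_id = f"{image.name}-{len(self.running)}"
        self.running.append(container_id)
        return container_id

@dataclass
class Orchestrator:
    registry: Registry
    hosts: list

    def deploy(self, ref: str) -> str:
        # Pull the image from the registry and place it on a host,
        # yielding a running, usable version of the application.
        image = self.registry.pull(ref)
        return self.hosts[0].run(image)
```

A deployment then reads as `Orchestrator(registry, hosts).deploy("web:1.0")`, which mirrors the flow the guide analyzes: each stage of that pipeline is a distinct attack surface with its own risks.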

As if the security risks related to the new elements of the container-based data center were not enough, virtualization technologies and cloud deployments also impose new risks on the traditional elements of the data center. Unless existing security and operations methodologies and tools are adapted to the new reality, these security gaps leave the data center vulnerable to attack.

Critical Applications in Your Data Center

Because the architecture of the data center, and the applications running in it, are inherently distributed, the network that enables all their components to interact is crucial to the applications’ operation. At first glance, network-related security risks are well known, and good traditional practices and tools exist to mitigate them. In a world of virtual deployments, however, those risks are exacerbated.

In the modern data center, different applications share the same virtual network. At the same time, in most container and virtualization runtimes, individual application components can by default access each other and the host OS over the network. For example, a public-facing web server and an internal database containing sensitive information may use the same virtual network. Sensitive internal applications are therefore exposed to greater risk from network attacks: if the web server is compromised, the attack can be extended over the internal network of the cluster to infect the database and access the data in it. It is not enough to secure the perimeter of the data center, as malicious traffic may be directed at critical applications from other applications and containerized components inside the data center.

A well-established security practice for addressing this issue is to create and enforce network segmentation through policies that prohibit connectivity between different sets of applications. In a modern data center with a large and continuously changing collection of applications and microservices, however, this becomes a difficult task. It is hard to identify and keep track of thousands of application services that are added, removed or modified at the fast rate expected of DevOps teams. Rogue (unplanned or unsanctioned) containers and virtual machines may become a common occurrence, especially in development environments, where developers may launch containers as a means of testing their code. If these hosted components are not scanned for vulnerabilities and checked for proper configuration, they may be susceptible to exploits. They may persist in the environment without the awareness of development teams and security administrators, lost in the sea of other containers and virtual machines, and become a foothold for attackers inside the organization.
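At its core, such segmentation is a deny-by-default allow list: a flow between two services is permitted only if policy explicitly names it. A minimal sketch, with hypothetical service names and an invented policy structure:

```python
# Deny-by-default network segmentation: connectivity is permitted
# only for (source, destination) pairs explicitly allowed by policy.
# The service names and the policy itself are hypothetical examples.

ALLOWED_FLOWS = {
    ("web-frontend", "api-gateway"),
    ("api-gateway", "orders-db"),
}

def is_allowed(src: str, dst: str) -> bool:
    """Return True only if the flow appears in the allow list."""
    return (src, dst) in ALLOWED_FLOWS
```

Under this policy, `is_allowed("web-frontend", "orders-db")` is False: a compromised web server cannot open a direct connection to the database, which is exactly the lateral movement described above. The operational difficulty lies in keeping the allow list accurate as thousands of services churn.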

Egress network access is also more complex to manage in a virtualized environment than in a traditional data center, because so many of the connections between application components are virtualized. Traffic from one container to another may appear simply as encapsulated packets on the network, without directly indicating the ultimate source, destination or payload. Tools and operational processes that are not aware of the virtualization context cannot inspect this traffic or determine whether it represents a threat.
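To recover that virtualization context, a monitoring tool must first decode the overlay encapsulation. As an illustration, the sketch below parses the 8-byte VXLAN header defined by RFC 7348 (a flags byte, three reserved bytes, a 3-byte VXLAN Network Identifier, and a final reserved byte) to determine which overlay segment a packet belongs to. The sample header bytes are synthetic, not captured traffic.

```python
def parse_vxlan_vni(payload: bytes) -> int:
    """Extract the 24-bit VXLAN Network Identifier (VNI) from a
    VXLAN header laid out per RFC 7348: flags byte, 3 reserved
    bytes, 3-byte VNI, 1 reserved byte."""
    if len(payload) < 8:
        raise ValueError("truncated VXLAN header")
    if not payload[0] & 0x08:  # I flag: VNI field is valid
        raise ValueError("VNI not present")
    return int.from_bytes(payload[4:7], "big")

# A synthetic header carrying VNI 42 (example bytes, not real traffic):
header = bytes([0x08, 0, 0, 0]) + (42).to_bytes(3, "big") + bytes([0])
```

Only after mapping the VNI back to a tenant or application segment can a tool reason about whether the inner flow is legitimate; tools without this awareness see only opaque UDP packets between hosts.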

Principles for an Updated Security Methodology

The NIST recommendations for application container security emphasize that containers represent a transformational change in the way applications are built and run. They do not necessitate dramatically new security best practices; rather, well-established techniques and principles should be updated and expanded to take into account the risks particular to container technologies. These necessary changes apply equally well to other forms of virtualization and to cloud-based infrastructure.

Security processes and tools must be able to work effectively at the scale of the virtualized data center. Security teams faced with a high rate of change in the environment must be assisted with tools that have full visibility into the detailed data center structure and the network interactions between its components while simplifying the human experts’ task of creating, maintaining and enforcing security policies. Some of the burdens of creating such policies may shift to the application developers, who have the best understanding of the necessary communication patterns of specific applications, while operations and security teams must still ensure conformance to global policies. This implies that a shared view of the data center must be presented to, and understood by, all stakeholders.
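One way to picture that shared responsibility: developers declare the flows their services need, and a conformance check validates each declaration against organization-wide rules before it becomes enforceable policy. The sketch below is a hypothetical illustration; the service names and policy structure are invented.

```python
# Hypothetical conformance check: developers declare required flows,
# while a global policy (owned by security/operations) constrains
# which declarations are acceptable.

GLOBAL_POLICY = {
    # For each protected destination, the only clients allowed
    # by organization-wide rules.
    "orders-db": {"api-gateway"},
}

def conformance_violations(declared_flows):
    """Return the declared (src, dst) flows that violate the global
    policy; an empty list means the declaration conforms."""
    violations = []
    for src, dst in declared_flows:
        allowed_clients = GLOBAL_POLICY.get(dst)
        if allowed_clients is not None and src not in allowed_clients:
            violations.append((src, dst))
    return violations
```

A declaration such as `[("web-frontend", "orders-db")]` would be flagged before deployment, giving developers fast feedback while security teams retain control of the global rules. Both sides reason over the same shared view of the data center.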

Security must be as portable as the containers and virtual machines themselves, so organizations should adopt techniques and tools that are open and work across platforms and environments (for example, with different cloud providers and virtualization technologies).

The security challenges in the new data center are an opportunity to improve its security.

Alcide sponsored this post.

