Confidential Computing Makes Inroads into the Cloud
Concerns about the security of data in the cloud have prevented many companies from moving sensitive workloads there. But cloud providers and chip makers are hoping to address that concern with "confidential computing."
The concept revolves around protecting data in every state: while it is moving between systems, at rest in storage, and while it is being processed on servers or in the cloud.
Cloud providers and chip makers are adding more security mechanisms at each of these stages to protect data from being stolen. The companies are layering on protected enclaves and security blocks that attest and verify that data and workloads are accessed only by authorized users.
All the major chip vendors, including Intel, Advanced Micro Devices, ARM and Nvidia, are adding new features to their hardware and software portfolios to secure data within computing systems. The chip makers' confidential computing offerings will mostly be available through virtual machine instances from cloud providers such as Google Cloud, Microsoft Azure and Amazon Web Services.
The top cloud providers also have their own confidential computing offerings tuned to cloud native software stacks. AWS this year added new security features to its Nitro V5 chipset for virtual machine offerings based on its Graviton chipsets. Microsoft and Google offer Pluton and Titan, respectively, to authenticate users accessing cloud services.
In October, Google, Nvidia, AMD and Microsoft teamed up to create a confidential computing specification called Caliptra, with the goal of building security assurances directly into the silicon. Development of the specification will be managed by the Open Compute Project, and the hope is that the features will be implemented by server and cloud providers to prevent data from being stolen.
The specification was partly motivated by the Spectre and Meltdown vulnerabilities discovered in 2018, which left data exposed to theft while it was in transit on chips. Caliptra involves the creation of security assurances through special silicon security blocks, ensuring trusted boot code, and preventing electromagnetic attacks. Another key feature of Caliptra is authenticating code and verifying users trying to access applications in a secure enclave, a process called attestation.
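The attestation idea can be sketched in a few lines. This is a toy model only, under the assumption of a symmetric device key: real schemes such as Caliptra use asymmetric signatures and certificate chains, and every name below (`DEVICE_KEY`, `measure`, `attest`, `verify`) is hypothetical, not drawn from any actual specification.

```python
import hashlib
import hmac
import secrets

# Hypothetical device-unique key, standing in for a secret
# provisioned into silicon at manufacture time.
DEVICE_KEY = secrets.token_bytes(32)

def measure(firmware: bytes) -> bytes:
    """Hash the boot code to produce a 'measurement' of what is running."""
    return hashlib.sha256(firmware).digest()

def attest(firmware: bytes, nonce: bytes) -> bytes:
    """Device side: bind the measurement to a verifier-supplied nonce."""
    return hmac.new(DEVICE_KEY, measure(firmware) + nonce, hashlib.sha256).digest()

def verify(report: bytes, expected_fw: bytes, nonce: bytes) -> bool:
    """Verifier side: accept only if the device is running the expected code."""
    expected = hmac.new(DEVICE_KEY, measure(expected_fw) + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(report, expected)

nonce = secrets.token_bytes(16)     # fresh nonce defeats replayed reports
good_fw = b"trusted boot code v1"
report = attest(good_fw, nonce)

assert verify(report, good_fw, nonce)                     # genuine firmware passes
assert not verify(attest(b"tampered code", nonce), good_fw, nonce)  # modified code fails
```

The point of the sketch is the shape of the exchange: the verifier supplies a fresh nonce, the device returns a keyed report over its measured code, and any change to the boot code changes the measurement and fails verification.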
Trusted Execution Environments
Microsoft’s Mark Russinovich, chief technology officer of the Azure cloud service, has been a big champion of confidential computing at all phases of data movement. At the Microsoft Ignite conference last year, he called confidential computing the “ultimate in data protection.”
The pronouncement was followed by the introduction of new Azure virtual machine offerings for cloud native workloads, which have non-exportable encryption keys in secure pop-up enclaves where data is transient and not retained. These enclaves, called Trusted Execution Environments (TEEs), act like black boxes that hold encrypted data: the data is locked in a protected space and accessible only to authorized users, who can unlock it with the right credentials.
The secure enclaves provide multiple benefits for applications like AI, where multiple data sets can strengthen learning models. For example, banks will be able to bring in third-party data sets to strengthen their proprietary models that are critical to their business and services. The third-party data sets can be removed from the enclave once the data is incorporated into the learning model.
The Azure confidential computing offerings revolve around on-chip features provided by AMD in its Epyc processors. The SEV-SNP (Secure Encrypted Virtualization-Secure Nested Paging) feature lets users place a code inside an enclave that serves as a lock; entering that code unlocks the enclave and grants access to the data. The AMD feature also encrypts the virtual machine itself, so the data in it cannot be accessed by the hypervisor.
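The hypervisor-exclusion property can be illustrated with a toy model. This is only a sketch of the concept: SEV-SNP performs AES encryption in the memory controller hardware, whereas here a one-time XOR pad stands in for the cipher so the example stays dependency-free, and the `EncryptedVM` class and its methods are invented for illustration.

```python
import secrets

class EncryptedVM:
    """Toy model: guest reads/writes go through a per-VM key the
    hypervisor never holds, so the hypervisor sees only ciphertext."""

    def __init__(self, size: int):
        self._key = secrets.token_bytes(size)   # per-VM key, held by "hardware"
        self._memory = bytearray(size)          # DRAM as the hypervisor sees it

    def guest_write(self, offset: int, data: bytes) -> None:
        # Data is encrypted on its way into memory.
        for i, b in enumerate(data):
            self._memory[offset + i] = b ^ self._key[offset + i]

    def guest_read(self, offset: int, length: int) -> bytes:
        # The guest's view is transparently decrypted.
        return bytes(self._memory[offset + i] ^ self._key[offset + i]
                     for i in range(length))

    def hypervisor_read(self, offset: int, length: int) -> bytes:
        # The hypervisor reads raw memory and gets ciphertext.
        return bytes(self._memory[offset:offset + length])

vm = EncryptedVM(64)
secret = b"customer record"
vm.guest_write(0, secret)
assert vm.guest_read(0, len(secret)) == secret          # guest sees plaintext
assert vm.hypervisor_read(0, len(secret)) != secret     # hypervisor does not
```

The design choice being modeled is that the decryption key lives in hardware and is bound to the VM, so even a fully privileged hypervisor reading guest memory recovers nothing useful.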
Confidential computing went mainstream in the cloud last year, and traditional infrastructure vendors will follow suit in 2023, said James Sanders, who is the principal analyst for cloud, infrastructure, and quantum at CCS Insights.
“Confidential computing technologies such as AMD’s SEV can greatly ease the security story in that transformation,” Sanders said.
Intel's move into confidential computing is a cloud-based offering called Project Amber, an attestation service that verifies hardware and software assets in a computing environment, creating a safe zone in which companies can run software or AI models. Project Amber verifies the integrity of code, data and computing endpoints on a network. It will work independently of cloud providers but relies on features Intel provides on its chips, including SGX (Software Guard Extensions), a secure zone on the processor, and Linux middleware called Gramine, which allows unmodified Linux applications to run inside SGX enclaves.
Intel soon will also release its new server chips, called Sapphire Rapids, which include confidential computing features. The chips add new instructions called TDX (Trust Domain Extensions), which secure virtual machines as trusted enclaves. The idea is to use encryption to prevent hypervisors, which manage virtual machines, from reading data inside those virtual machines.
When combined with TDX and SGX, Project Amber will allow only verified data to enter a secure enclave, and Amber will be able to identify whether data was compromised in transit to the enclave. Attestation provides for software the kind of security that silicon provides for hardware processors, said Steve Leibson, principal analyst at Tirias Research.
Nvidia has developed an AI-based software stack called Morpheus that analyzes user behavior to secure networks. The overarching model aggregates user behavior on a network and, over time, establishes a pattern of behavior for each user. If the AI spots unusual activity that deviates from that pattern, such as an atypical login, it alerts the security operations team, which can then triage the problem.
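The deviation-from-pattern idea can be sketched with a simple statistical rule. This is illustrative only and is not Morpheus itself (which is a GPU-accelerated framework); the function name, threshold, and sample data below are all assumptions made for the example.

```python
from statistics import mean, stdev

def is_anomalous(history: list[float], observed: float, threshold: float = 3.0) -> bool:
    """Flag a login metric (e.g., login hour) that deviates more than
    `threshold` standard deviations from the user's established pattern."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return observed != mu
    return abs(observed - mu) / sigma > threshold

# A user who normally logs in around 9 a.m.
usual_hours = [8.8, 9.1, 9.0, 9.3, 8.9, 9.2, 9.0]
assert not is_anomalous(usual_hours, 9.1)   # typical login: no alert
assert is_anomalous(usual_hours, 3.0)       # 3 a.m. login: alert the SOC
```

A production system would model many signals jointly (login source, data volume, command sequences) and learn the baseline continuously, but the triage workflow the article describes follows this same flag-then-investigate shape.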
“Morpheus gives new tools to do more, like log parsing, and to provide better visibility to the security analysts as another layer of security,” said Justin Boitano, vice president and general manager of Nvidia’s enterprise and edge computing operations.
ARM, HPE and Dell
ARM has major plans for confidential computing this year. ARM has created a dedicated confidential computing architecture in its chip designs with a feature called dynamic "realms." The technology isolates specific programs and data in separate environments with "secure wells" to prevent hackers from accessing data. Over the next two years, ARM will release hardware and software specifications so chip makers can build the hardware and software makers can write code for it.
The delivery of confidential computing will largely be through cloud services, but traditional infrastructure providers will adopt confidential computing in hybrid cloud models for customers who prefer control over their own IT infrastructure.
HPE, with its GreenLake multicloud offering, and Dell, with APEX, could extend their as-a-service models intended for single tenants to include confidential computing features, but doing so could require a lot of work.
“Datacenter operators and MSPs stand to benefit from the same technology and financial model offered to enterprise customers, though making this multitenant to serve these new constituencies will require a significant software development effort to ensure security continuity,” CCS Insights’ Sanders said.