How to Secure Cloud Deployments by Keeping the Keys and Data in Different Places
Red Hat sponsored this post.
Distributed cloud environments represent both the current state and the future of enterprise cloud deployments. At least, it certainly appears that way: the introduction of Google’s Anthos and Amazon Web Services’ (AWS) Outposts indicates that, among the leading cloud providers, the hybrid cloud is here to stay. There also appears to be ample room for the cloud giants to expand their on-premises cloud platforms.
Yet, offloading applications and data to multiple public cloud providers can be fraught with risk.
First, enterprises are at the mercy of the security policies of their cloud partners. Organizations may feel that their data is automatically protected by their cloud vendors, but that’s not always the case. This came to light a few months ago during a major financial institution’s data breach, where a cloud vendor made it clear that it was not responsible for the bank’s data loss.
Second, it can be exceedingly difficult to gauge risk when working in a hybrid cloud environment. Once data leaves the secure premises of an on-site data center, it is out of the control of the Chief Information Security Officer (CISO), and visibility into that data decreases. Data in flight can be particularly unnerving: an organization can lose track of what is happening to its data as it is transferred between on- and off-premises environments.
Fortunately, there’s a positive flip-side to the distributed cloud data security coin.
Even as multicloud environments can increase the complexity of data security, they can also significantly bolster organizations’ efforts to protect their data. Here is how: Enterprises can use distributed cloud architectures to store encrypted data and corresponding data encryption keys (DEKs) in separate locations. This keeps both the keys and data protected by eliminating a single point of compromise.
Let’s look at why this is important and how it can be put into practice.
Separating the Keys from the Ignition
Standards like the Advanced Encryption Standard with 256-bit keys (AES-256), often described as practically uncrackable because its key space is far too large to brute-force, have made it increasingly difficult for attackers to access highly sensitive data.
But encryption like AES-256, RSA and others can also make it more challenging for the people who need the data to access it easily, unless they hold the appropriate keys. AES is a symmetric block cipher: the same secret key, the DEK, both encrypts and decrypts, so whoever holds it controls the data. RSA is a public-key cryptosystem: a public key, which can be shared freely so that authorized people can send information, encrypts (but does not decrypt) messages, while the corresponding private key, meant for the owner alone, decrypts them. In practice the two are often combined, with an asymmetric key pair protecting the symmetric DEK. Either way, without the proper key, decrypting the information is computationally infeasible.
Regulatory standards like the Health Insurance Portability and Accountability Act (HIPAA), Sarbanes-Oxley (SOX) and the Payment Card Industry Data Security Standard (PCI-DSS) require that DEKs remain separate from the encrypted data they’re meant to unlock, for obvious reasons. Just like it’s never a good idea to keep a car’s keys tucked away in the visor, it’s important to keep data keys away from the information they’re meant to unlock.
This requirement poses its own challenge, however. How do you store DEKs and data separately? What’s the best and most efficient approach?
This is where the beauty of distributed cloud environments shines. Organizations can keep their DEKs in-house (or with another cloud provider) while storing their encrypted data offsite through Amazon Web Services, Microsoft Azure or a similar service. If that encrypted data were somehow accessed by an unauthorized third party, that party would not have the key needed to decrypt it. They would gain access to nothing more than gibberish.
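The principle is easy to see in miniature. The toy sketch below uses a one-time-pad-style XOR cipher purely for illustration (a real deployment would use a vetted cipher such as AES-256-GCM from a maintained cryptography library); the point is that the party holding only the ciphertext, with the DEK stored elsewhere, holds nothing usable.

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Toy XOR cipher, illustrative only. Real deployments should
    use a vetted authenticated cipher such as AES-256-GCM."""
    assert len(key) == len(plaintext)
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

# The DEK stays "on premises"; only the ciphertext goes to the cloud.
message = b"account: 12345678, balance: $9,000"
dek = secrets.token_bytes(len(message))   # key, stored separately
ciphertext = encrypt(dek, message)        # what the cloud provider holds

# Without the DEK, the ciphertext is just random-looking bytes:
# an attacker who breaches the storage provider gains gibberish.
assert ciphertext != message
assert decrypt(dek, ciphertext) == message
```

Swapping the toy cipher for a real one changes nothing about the architecture: the ciphertext and the key still live in different trust domains, so compromising one location yields nothing.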
Keeping Up with the Keys
The downside is that enterprises using multiple cloud providers, with many different data sets, also have a lot of keys to manage. Such enterprises are likely running multiple key management systems (KMS) from different vendors to protect and access various data sets, because they use multiple clouds and their native offerings, such as Amazon Web Services or Google Cloud. Each organizational silo’s choice demonstrates the virtue of actually using a KMS rather than baling wire and duct tape, but it falls short of cross-departmental standardization.
Unfortunately, this fragmentation can create portability obstacles between cloud providers, as one system may not be able to interface with each cloud’s KMS, each with its own API, to retrieve the DEKs needed to encrypt and decrypt data. That lack of portability can undermine the reasons for using a hybrid cloud architecture in the first place.
While attempts at standardization like the Key Management Interoperability Protocol (KMIP) are good and essential efforts, more could be done. Ideally, there should be an abstraction layer for KMSs so that organizations can easily use different key management systems across multiple clouds, similar to what is already being done for object storage systems.
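What such an abstraction layer might look like can be sketched in a few lines. The interface and class names below (`KeyStore`, `InMemoryKeyStore`) are invented for illustration, not a real library; the idea is that application code depends only on the abstract interface, while each cloud's KMS is wrapped in its own backend.

```python
from abc import ABC, abstractmethod

class KeyStore(ABC):
    """Hypothetical abstraction over per-cloud key management APIs."""

    @abstractmethod
    def store_key(self, key_id: str, key: bytes) -> None: ...

    @abstractmethod
    def get_key(self, key_id: str) -> bytes: ...

class InMemoryKeyStore(KeyStore):
    """Stand-in backend; a real one would wrap a provider SDK
    (AWS KMS, Google Cloud KMS, Azure Key Vault) behind the
    same two methods."""

    def __init__(self) -> None:
        self._keys: dict[str, bytes] = {}

    def store_key(self, key_id: str, key: bytes) -> None:
        self._keys[key_id] = key

    def get_key(self, key_id: str) -> bytes:
        return self._keys[key_id]

# Application code sees only KeyStore, so moving a workload to a
# different cloud means swapping the backend, not the data path.
store: KeyStore = InMemoryKeyStore()
store.store_key("orders-db", b"\x00" * 32)
dek = store.get_key("orders-db")
```

This is the same pattern that S3-compatible APIs brought to object storage: one client interface, many interchangeable backends.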
Driving Better Data Protection
Keeping data encryption keys separate from the data they’re meant to encrypt is one of the most fundamental steps an organization can take to protect their information. Hybrid cloud environments make this easy by providing enterprises with a flexible option that can keep both their keys and data from being compromised.