Why Securing Secrets in Cloud and Container Environments Is Important – and How to Do It
Cloud Native Computing Foundation sponsored this post, in anticipation of KubeCon + CloudNativeCon North America 2020 – Virtual, Nov. 17-20.
Key-based “secrets” are required to authorize access and communications across all cloud applications and services, including login access to containerized applications. From AWS IAM access keys to Google API access tokens, Facebook access tokens, OAuth client secrets, and countless others, secrets are used to secure myriad public-facing services and internal and external REST APIs.
Given the access that secrets enable, their secure storage and management are absolutely essential to overall data and system security. Services naturally require access to sensitive data; because of this, even the most carefully configured environments will fail to fully protect sensitive data if secrets are exposed. The consequences of secrets falling into the wrong hands can be devastating: attackers can create data breaches by reading database records, and wreak havoc by deleting files or adding their own.
The High Cost of Compromised Secrets
The dangers of secrets exposure are real, and continue to make headlines. Take the example of an engineer at Uber who accidentally left a secret in a GitHub repository, allowing access to an Amazon web server used by Uber. An attacker proceeded to download files from the server, including a sensitive backup file containing the data records of 50 million Uber customers. The attacker demanded a $100,000 ransom, which Uber paid. When the story later became public, Uber also paid a $148 million settlement for the data breach and agreed to comprehensive security compliance auditing of all company operations going forward. The fallout also delayed Uber’s IPO – all stemming from a poorly managed secret.
In another example of attackers creating mischief with compromised secrets, developers at DXC made the error of hardcoding keys that allowed access to Amazon Web Services resources in a project. A team member then shared the project in an unsecured GitHub repository. Attackers used these (not so) private keys to spin up 244 AWS virtual machines over four days, costing the company $64,000.
Secrets Management Solutions
A variety of tools are available for managing secrets security, including several strong open source options.
AWSLabs’ open source git-secrets protects organizations from scenarios like those mentioned above by preventing developers from committing passwords and other sensitive information to Git repositories.
The tool detect-secrets is designed to detect and prevent secrets from entering a codebase. It also detects whether its prevention rules are being bypassed, and offers a checklist of secrets that need to be migrated to secure storage.
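At their core, the prevention rules in tools like git-secrets and detect-secrets are pattern matches against file contents. A minimal sketch of the idea in Python – the patterns and sample line below are illustrative, not the curated rule sets those tools actually ship with:

```python
import re

# Illustrative patterns, loosely modeled on common credential formats.
# Real tools maintain curated, regularly updated rule sets.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"AKIA[0-9A-Z]{16}"),
    "generic_api_key": re.compile(
        r"(?i)api[_-]?key\s*[:=]\s*['\"][A-Za-z0-9]{20,}['\"]"
    ),
}

def scan_text(text):
    """Return a list of (rule_name, match) hits found in the given text."""
    hits = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((name, match))
    return hits

# A config line that a pre-commit hook would want to block
# (AKIAIOSFODNN7EXAMPLE is AWS's documented example key ID):
print(scan_text('aws_key = "AKIAIOSFODNN7EXAMPLE"'))
```

A pre-commit hook built on this idea would run the scan over each staged file and reject the commit on any hit.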
HashiCorp’s Vault provides secure storage and access controls for secrets and other sensitive data, accessible through a UI, CLI, or HTTP API.
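As one illustration, Vault’s KV version 2 secrets engine can be read over the HTTP API with nothing beyond the standard library. The server address, mount path and token here are assumptions for a local dev-mode server, not values from any real deployment:

```python
import json
import urllib.request

def kv_url(addr, path):
    """Build the KV v2 read URL: GET <addr>/v1/secret/data/<path>."""
    return f"{addr}/v1/secret/data/{path}"

def read_kv_secret(path, token, addr="http://127.0.0.1:8200"):
    """Fetch a secret's key/value payload from a (dev-mode) Vault server."""
    req = urllib.request.Request(
        kv_url(addr, path),
        headers={"X-Vault-Token": token},  # Vault's token auth header
    )
    with urllib.request.urlopen(req) as resp:
        # KV v2 nests the payload under data.data in the response body.
        return json.load(resp)["data"]["data"]

# Usage (requires a running Vault server; the token is a placeholder):
# read_kv_secret("myapp/db", token="s.XXXXXXXX")
```

Centralizing reads behind an API like this means applications never need secrets baked into images or environment files; they fetch them at runtime with a short-lived token.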
Organizations utilizing Kubernetes with their container environments can leverage Kubernetes Secrets, the orchestrator’s built-in secrets management solution. Kubernetes Secrets facilitates the storage and management of passwords, OAuth tokens, SSH keys, and other sensitive information, enabling greater safety and flexibility than storing secrets in a Pod definition or container image.
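A Kubernetes Secret manifest carries its values base64-encoded under the `data` field. A small sketch that builds such a manifest – the resource name and keys are illustrative – whose output could be written to a file and created with `kubectl apply`:

```python
import base64
import json

def make_secret_manifest(name, string_data):
    """Build a Kubernetes Secret manifest, base64-encoding each value
    as the API expects under the `data` field."""
    return {
        "apiVersion": "v1",
        "kind": "Secret",
        "metadata": {"name": name},
        "type": "Opaque",
        "data": {
            key: base64.b64encode(value.encode()).decode()
            for key, value in string_data.items()
        },
    }

manifest = make_secret_manifest(
    "db-credentials", {"username": "app", "password": "s3cr3t"}
)
print(json.dumps(manifest, indent=2))
```

Note that base64 is an encoding, not encryption: keeping Secret values safe at rest still depends on cluster-level measures such as encrypting etcd and restricting access via RBAC.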
Red Hat OpenShift, another popular enterprise-grade container platform, features built-in secrets management functionality as well.
Secrets Auditing Solutions and Defense in Depth
In containerized environments, secrets auditing tools make it possible to recognize the presence of secrets in source code repositories, container images, CI/CD pipelines, and beyond. Deploying container services activates platform and orchestrator security measures that distribute, encrypt, and properly manage secrets. By default, secrets are secured in system containers or services, and this protection suffices for most use cases.
However, for especially sensitive workloads – Uber’s customer database backend service is a strong example, as are data encryption and standard image-scanning use cases – it’s not adequate to rely on conventional secret-store security and secret distribution alone. These use cases call for more robust defense-in-depth protections. Within container environments, defense-in-depth implementations leverage deep packet inspection (DPI) and data leakage prevention (DLP) to monitor secrets while they are in use. Any transmission of a secret in network packets can be recognized, flagged, and blocked if inappropriate. In this way, the most sensitive data can be secured throughout the full container lifecycle, and attacks that would otherwise result in breach incidents can be thwarted by this additional layer of safeguards.
Securing Cloud and Container Application Secrets Is More Vital than Ever
The COVID-19 pandemic has tremendously increased the use of remote access work applications. As a result, challenges to the security of these applications are on the rise. By implementing effective secrets management and auditing tools, as well as defense in depth to secure the most sensitive workloads, organizations can stay secure and keep their secrets to themselves.
To learn more about Kubernetes and other cloud native technologies, consider coming to KubeCon + CloudNativeCon North America 2020, Nov. 17-20, virtually.
Amazon Web Services, the Cloud Native Computing Foundation, HashiCorp and Red Hat are sponsors of The New Stack.
Feature image via Pixabay.