OAuth Security in a Cloud Native World
Most software companies now deploy to the cloud, taking advantage of modern hosting capabilities that make everyone productive, from developers to DevOps and InfoSec staff.
However, not all cloud deployments are the same, and you still need to make sound choices to meet your architectural requirements.
In this article, I will explain how my thinking has evolved after working with various cloud deployment types and integrating security into many kinds of apps.
I will start with a discussion on APIs and then highlight the key supporting security components. One of the most important of these is your identity and access management (IAM) system.
The OAuth Framework
Nowadays, most application-level components implement security using the OAuth family of specifications, which provides modern security capabilities for web apps, mobile apps and APIs. These standards give companies up-to-date options for authenticating users with one or more proofs of identity and for protecting data in APIs according to business rules.
The authorization server defined in the OAuth specification deals with authentication, token issuance and user management. It enables many security solutions, or “flows,” to be built over time, and it is the heart of any modern IAM system.
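To make the flow concept concrete, the sketch below builds the front-channel request of the most common flow, the authorization code flow with PKCE, using only the Python standard library. The endpoint URL and parameter values are hypothetical placeholders; real applications read them from the authorization server's metadata.

```python
from urllib.parse import urlencode

# Hypothetical endpoint: replace with your authorization server's metadata value.
AUTHORIZE_ENDPOINT = "https://login.example.com/oauth/v2/authorize"

def build_authorize_url(client_id: str, redirect_uri: str, scope: str,
                        state: str, code_challenge: str) -> str:
    """Build an OAuth 2.0 authorization request URL (code flow with PKCE)."""
    params = {
        "response_type": "code",           # ask for an authorization code
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,
        "state": state,                    # echoed back to protect against CSRF
        "code_challenge": code_challenge,  # S256 hash of the PKCE code verifier
        "code_challenge_method": "S256",
    }
    return f"{AUTHORIZE_ENDPOINT}?{urlencode(params)}"
```

The app redirects the browser to this URL, and the authorization server later returns an authorization code that the client exchanges for tokens on the back channel.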
Platform as a Service (PaaS)
When I first started using cloud deployment, like many people, I was attracted by the thought of not having to host any backend servers and using the cloud infrastructure as a black box instead. For a single page application, this might lead to the following backend components that use PaaS:
Technologies like serverless enable you to develop APIs that use PaaS hosting. This can be a cost-effective option for small startups or for developers hosting their own solutions. Meanwhile, developers can use the cloud provider’s built-in authorization server when getting started with OAuth integration. This is sometimes referred to as Identity as a Service (IDaaS).
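A serverless API endpoint of this kind is typically just a handler function. The sketch below uses the event shape of an AWS Lambda proxy integration, where a gateway authorizer is assumed to have already validated the access token and forwarded its claims in the request context; the claim names shown are illustrative.

```python
import json

def handler(event, context):
    """A minimal serverless API handler that authorizes using forwarded token claims."""
    # Claims placed in the request context by an assumed gateway authorizer.
    claims = (
        event.get("requestContext", {})
             .get("authorizer", {})
             .get("claims", {})
    )
    # Deny the request unless the access token was issued the required scope.
    if "read" not in claims.get("scope", "").split():
        return {"statusCode": 403,
                "body": json.dumps({"error": "insufficient_scope"})}
    return {"statusCode": 200,
            "body": json.dumps({"message": f"hello {claims.get('sub', 'anonymous')}"})}
```

The hosting platform handles scaling and TLS, so the code above is close to the entire deployable unit.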
Your APIs or microservices are your core intellectual property (IP), and most companies implement them in a mainstream programming language, such as Java or C#. In doing so, organizations will want to leverage these technologies to their full capabilities without restrictions. In addition, code should be kept portable in case you want to use multiple cloud providers in the future. This can enable you to extend your digital solutions to emerging markets, where certain cloud providers may be blocked.
One downside to using PaaS for APIs is that you may run into limitations that lead to vendor lock-in, making it expensive to migrate APIs to another host in the future. Some compute-based API hosting may also have other limitations. For example, in-memory storage may be impossible if a system must spin up a new API instance for every request. These issues can add complexity and work against your technical architecture.
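The in-memory storage caveat is easy to see in code. The module-level cache sketched below works in a long-lived container process, where many requests share the same memory, but in a PaaS runtime that may create a fresh instance per request the dictionary starts empty every time, so the cache never hits. The JWKS example here is illustrative.

```python
import time

# Survives across requests only when the hosting process is long-lived.
_jwks_cache: dict = {}

def get_signing_keys(fetch, ttl_seconds: int = 300):
    """Return cached token-signing keys, refetching only after the TTL expires."""
    entry = _jwks_cache.get("jwks")
    if entry and time.time() - entry["fetched_at"] < ttl_seconds:
        return entry["keys"]          # cache hit: no downstream call
    keys = fetch()                    # e.g. an HTTP call to the JWKS endpoint
    _jwks_cache["jwks"] = {"keys": keys, "fetched_at": time.time()}
    return keys
```

In a per-request instancing model every call pays the `fetch()` cost, which is exactly the kind of hidden constraint that can work against your architecture.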
You must also control which API endpoints are exposed to the internet and secure the perimeter in your preferred way. A zero-trust approach is recommended for connections between APIs, as it can enforce both infrastructure and user-level security. Finally, APIs connect to sensitive data sources, so they should be hosted behind a reverse proxy or API gateway as a hosting best practice. This makes it more difficult for attackers to gain access to that data.
Container as a Service (CaaS)
These requirements lead many companies to host APIs using a different cloud building block. Although virtual machines used to be more common, container orchestration platforms such as Kubernetes now provide the best API hosting features. This creates an updated deployment picture for APIs, where they are hosted “inside the cluster” while you continue to use PaaS for some other components:
Once API hosting is updated to use container-based deployment, there are no restrictions on code execution, and you have a portable backend that can be migrated between clouds. Your technical staff will also learn how to use modern patterns that deal with deployment and availability in the best ways. You then need to think more about other critical components that support your APIs.
Cloud Native Authorization Server
As you integrate OAuth into your applications and APIs, you will realize that the authorization server you have chosen is a critical part of your architecture that enables solutions for your security use cases. Using up-to-date security standards will keep your applications aligned with security best practices. Many of these standards map to company use cases, some of which are essential in certain industry sectors.
APIs must validate JWT access tokens on every request and authorize requests based on the token’s scopes and claims. This mechanism scales to arbitrarily complex business rules and spans multiple APIs in your cluster. Similarly, you must be able to implement best practices for web and mobile apps and use multiple authentication factors.
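The per-request validation steps can be sketched with only the standard library. Note the simplifying assumption: this example verifies an HS256 shared-secret signature to keep the code self-contained, whereas production APIs usually verify RS256/ES256 signatures against the authorization server's published JWKS via a vetted JWT library.

```python
import base64, hashlib, hmac, json, time

def _b64url_decode(part: str) -> bytes:
    """Decode base64url, restoring the padding JWTs strip off."""
    return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))

def validate_access_token(token: str, secret: bytes,
                          required_scope: str, audience: str) -> dict:
    """Validate an HS256 JWT, then enforce expiry, audience and a required scope."""
    header_b64, payload_b64, signature_b64 = token.split(".")
    header = json.loads(_b64url_decode(header_b64))
    if header.get("alg") != "HS256":
        raise ValueError("unexpected algorithm")
    expected = hmac.new(secret, f"{header_b64}.{payload_b64}".encode(),
                        hashlib.sha256).digest()
    if not hmac.compare_digest(expected, _b64url_decode(signature_b64)):
        raise ValueError("invalid signature")
    claims = json.loads(_b64url_decode(payload_b64))
    if claims.get("exp", 0) < time.time():
        raise ValueError("token expired")
    if claims.get("aud") != audience:
        raise ValueError("invalid audience")
    if required_scope not in claims.get("scope", "").split():
        raise ValueError("insufficient scope")
    return claims  # the API then authorizes business logic using these claims
```

Each API in the cluster repeats this check cheaply on every request, then applies its own claim-based business rules.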
The OAuth framework provides you with building blocks rather than an out-of-the-box solution. Extensibility is thus essential for your APIs to deal with identity data correctly. One critical area is the ability to add custom claims from your business data to access tokens. Another is the ability to link accounts reliably so that your APIs never duplicate users if they authenticate in a new way, such as when using a WebAuthn key.
All of this leads to the preferred option of using a specialist cloud native authorization server. This is more efficient because the authorization server is hosted right next to your APIs. It also gives you the best control over security, limiting which authorization server endpoints are exposed to the internet.
As well as a hosting entry point, the API gateway (or reverse proxy) is a crucial architectural component. The API gateway can perform advanced routing and security-related tasks like token translation before your APIs receive requests. By externalizing security plumbing, your API code is simpler and more business-focused.
It is recommended to use the Phantom Token pattern so that internet clients receive only opaque access tokens. Unlike JSON Web Tokens (JWTs), which are easily readable, Phantom Tokens cannot reveal any private details that might disclose personally identifiable information (PII). When a client calls an API, the gateway can then perform introspection to translate from opaque access tokens to JWT access tokens. This flow is illustrated below.
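A gateway plugin's side of this exchange can be sketched as follows. The introspection request follows RFC 7662; the endpoint URL is a placeholder, the `jwt` response field name is an assumption about the deployment's configuration, and real gateways would also authenticate to the introspection endpoint, which is omitted here for brevity.

```python
import json
from urllib import parse, request

# Hypothetical endpoint: taken from gateway configuration in a real deployment.
INTROSPECTION_ENDPOINT = "https://login.example.com/oauth/v2/introspect"

def exchange_opaque_token(opaque_token: str, opener=request.urlopen):
    """Introspect an opaque token and return the JWT to forward upstream, or None."""
    body = parse.urlencode({"token": opaque_token}).encode()
    req = request.Request(
        INTROSPECTION_ENDPOINT,
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    with opener(req) as response:
        result = json.loads(response.read())
    if not result.get("active"):
        return None           # revoked or expired: the gateway returns 401
    return result.get("jwt")  # assumed field carrying the inner JWT
```

The gateway caches the result for the token's lifetime in practice, so introspection does not run on every request, and upstream APIs only ever see JWTs.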
There are many other gateway use cases, but a critical capability is running plugins that can perform both HTTP translation and routing as a single unit of work. There should be no limitations on the code you can write in the plugin. This is another area where cloud native solutions may provide better capabilities than the cloud provider’s generalist solution.
Best-of-Breed Components
The authorization server and API gateway are key security components, and some companies also use an entitlement management system for their business authorization. Meanwhile, additional specialized components are required to support your APIs. These also must be chosen wisely, based on the provider’s capabilities and your requirements.
Each company must decide which third-party components they need. For example, it is common to host individual components for monitoring, log management and event-based data flows alongside your APIs. A possible setup is shown below:
PaaS is still an excellent choice for some component roles, though, and these days I follow a “mix and match” approach. Components that are a vital part of your API architecture should be hosted inside the cluster. I often prefer a serverless approach for other components if it is easier to manage.
The classic example where PaaS works better than CaaS is when delivering static web content to browsers. A content delivery network (CDN) can push the content to many locations at a low cost to enable globally equal web performance. This is more efficient than hosting CaaS clusters in all of those locations. See the Token Handler pattern for further details on using this approach, while also following current browser security best practices.
Deployment and Operation
When companies are new to OAuth, there is often a fear that the authorization server could become unavailable, leading to downtime for user-facing applications. This concern remains valid, but when using cloud native APIs, you are already assuming this risk, and you should be able to follow identical patterns for third-party components. When using a cloud native authorization server, check that its deployment and availability behavior provides what you need.
Also, consider the people-level requirements. An InfoSec stakeholder will want a system with good auditing of identity events. These days DevOps staff should be able to perform zero-downtime upgrades of the authorization server or use canary deployment, where both old and new versions run simultaneously. The system should also have modern logging and monitoring capabilities so that technical support staff can troubleshoot effectively when there are configuration or connection problems.
Local Computer Setups
Companies need to push their software down a pipeline, and discovering issues early saves considerable cost. The benefits of a productive developer setup are often overlooked, but this is an area where cloud native provides some compelling advantages.
A developer, architect or DevOps person can run most cloud native components on a local computer. This can be a great way to first test the cloud native authorization server and API gateway and to design end-to-end application flows.
Operational behavior such as upgrades can then be verified early, using a local cluster. Once the system is working with the desired behavior, you can simply update your Docker-based deployment, and the rest of the pipeline will also work in the same way.
Cloud native architecture provides the most portable and capable platform for hosting and managing your APIs, but keep an eye on the important security requirements. This will lead you to choose best-of-breed supporting components and host all of them inside your cluster. Choose an authorization server based on the security features you need and review it from an operational viewpoint.
At Curity, we provide a powerful identity and access management system designed to be cloud native from the ground up. It also integrates with modern cloud native platforms. As well as having rich support for standards, the system is based on a separation-of-concerns philosophy and is extensible to provide customers with the behaviors they need. There is also a free Community Edition, and it is trivial to spin up an initial system using a Docker container.
As a final note, the security components in your cloud native cluster will enable many powerful design patterns. Still, good architecture guidance is also a key ingredient when building cloud native security solutions. Our resource articles, guides, and code examples provide many end-to-end cloud native flows to help you along the way.