
The Central Tenets of Serverless Infrastructure

Regardless of cloud platform, IT professionals should adopt the following tenets to realize the full benefits of serverless.
Apr 22nd, 2021 1:00pm by James Kobielus
James is Principal Analyst at Franconia Research.
Feature image via Pixabay.

Many enterprises are building their IT modernization, digital transformation and microservices initiatives on serverless platforms.

Serverless platforms are well-suited to workloads that are on-demand, event-driven, asynchronous, stateless and scalable. They are also optimal for cloud workloads that experience infrequent, unpredictable surges in demand. Common examples of workloads suited to serverless include low-latency processing of real-time data streams, batch processing of incoming image files, application of incoming changes to a database, and execution of business process workflows.

Serverless computing — sometimes referred to as “functions as a service” — lets developers build and run applications without having to manage servers. Serverless code often runs in highly parallelized runtime containers for very short durations. It is event-triggered, so it runs automatically when needed. And it is fully managed by a cloud provider, so users pay only for the resources they actually consume rather than, as with traditional cloud computing, for always-on apps and servers.
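For instance, the image-processing workload mentioned above maps naturally onto an event-triggered function. The following is a minimal sketch in Python, written in the style of an AWS Lambda handler responding to an object-storage event; the event shape follows S3 notifications, and the thumbnail step is an illustrative assumption rather than a prescribed implementation.

```python
import json

def handler(event, context):
    # The platform invokes this function only when an event arrives, so no
    # server sits idle waiting for work and billing stops when it returns.
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print(f"Processing new object s3://{bucket}/{key}")
        # ... e.g., resize the image and write a thumbnail back to storage ...
    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}
```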

Developers call serverless apps through APIs provided by the cloud platform. When code is deployed to a serverless platform, developers are spared from writing the custom logic that manages containers, virtual machines and other backend runtime engines to which execution of microservices is dynamically allocated. It’s even possible to build hybridized cloud apps that orchestrate across serverless functions and other code running in containers and other microservices environments. This approach, sometimes called “backend-as-a-service,” typically gives developers API-based access to a variety of third-party services and apps.
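As a rough illustration of that API-based invocation model, the snippet below posts a JSON payload to the HTTPS endpoint a platform might expose for a deployed function. The endpoint URL, payload fields and absence of authentication are all hypothetical; real platforms route such calls through API gateways or function URLs with their own auth schemes.

```python
import json
import urllib.request

# Hypothetical endpoint for a deployed serverless function.
FUNCTION_URL = "https://example-function-endpoint.example.com/process-order"

def invoke_function(payload: dict) -> dict:
    # POST the payload to the function's HTTPS endpoint and parse the JSON reply.
    req = urllib.request.Request(
        FUNCTION_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    print(invoke_function({"orderId": "12345", "action": "validate"}))
```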

Leveraging Serverless in the Public Cloud

Public cloud providers play a pivotal role in the deployment of serverless applications for enterprise customers. Increasingly, enterprises are using serverless offerings such as AWS Lambda, Azure Functions, Google Cloud Functions and IBM Cloud Functions, along with those from other public cloud providers. There is also a growing range of commercial and open source serverless platforms suited to on-premises, hybrid cloud and multicloud deployments.

When adopted through fully managed public cloud offerings, serverless lets enterprise IT groups shift strategically toward an entirely outsourced and automated capability that some refer to as “NoOps.” Delivering on the serverless promise requires that cloud service providers manage their operations to strict service levels. These include the need to support 24×7 lights-out operation, deliver acceptable service levels around the clock, ensure elastic growth in line with enterprise requirements, process very high traffic volumes with minimal delay, and keep the cost of provisioning on-demand resources to a bare minimum.

Delivering on the Promise of Serverless

Whether they go with a fully managed public cloud offering or choose to implement these capabilities entirely within their enterprise data centers, IT professionals should adopt these tenets to realize the full benefits of serverless.

Utility: API-based access to core serverless infrastructure functions is paramount. Cloud service providers should provision serverless database management and transaction processing resources that are location independent. These resources should be decoupled from any physical infrastructure. They should also lack synchronous dependencies on other systems and be available on demand, through abstract service contracts exposed via a standard API. The provider should support a serverless programmability framework that enables development of orchestrated serverless functions, alongside other microservices, through an abstraction layer. The framework should allow developers to coordinate resource-intensive serverless functions via workflows that are complex, hierarchical, latency-sensitive, multi-step and long-running.

Scalability: Cloud service providers should design their serverless environments to scale instantaneously for any volume, variety and velocity of workloads. They should offer the ability to dynamically manage the allocation of data and other back-end machine resources for executing serverless application business logic. These tools should enable scaling of the density and efficiency of compute, storage, data processing and other resources. IT personnel can define triggers that initiate orchestration and autoscaling of serverless functions (see the sketch after this list). Enterprise IT staff, or the cloud provider managing serverless infrastructure on users’ behalf, should be able to automatically shut down serverless functions when workload processing is complete.

Granularity: Granularity in cloud resource provisioning supports consumption-based pricing of serverless offerings. Providers should build serverless platforms that deliver incremental compute, storage, processing and networking capacity billed on a pay-as-you-go basis. The provider’s programming framework should encourage the design of modular serverless functions. Providers should also meter serverless application resource consumption within event-driven execution models. This ensures that a serverless function sitting idle can be offered to users at zero incremental cost.

Abstraction: Cloud service providers should provide a serverless abstraction framework that decouples front-end development from back-end provisioning of hardware and software resources. This framework should support programmatic access to the continuous integration/continuous deployment workflow within which serverless applications are developed, tested and deployed into production environments. The framework should also support “NoOps” automation of back-end serverless infrastructure and application monitoring and management.

Speed: Cloud service providers should optimize their serverless platforms to enable delivery of real-time, low-latency, streaming and continuous processing applications. To ensure continued high performance, providers must dynamically allocate access to cloud compute, memory, storage and other resources. They should also provide enterprise IT professionals with dashboards for monitoring and optimizing the execution of serverless functions on their platforms.

Robustness: Cloud service providers should ensure that their environments can guarantee the secure, reliable, stateful and transactionally consistent functionality that serverless applications require. They should manage all security patches, load balancing, capacity management, scaling, logging and monitoring.
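As one concrete, hedged illustration of the Scalability tenet, the sketch below uses the boto3 SDK against AWS Lambda to wire a message queue up as an event trigger and to cap concurrent executions. The function name, queue ARN and limits are placeholders; other platforms expose equivalent controls through their own APIs.

```python
import boto3

lambda_client = boto3.client("lambda")

# Trigger: invoke the function automatically as messages arrive on a queue,
# so scaling follows demand rather than a pre-provisioned server count.
lambda_client.create_event_source_mapping(
    EventSourceArn="arn:aws:sqs:us-east-1:123456789012:orders-queue",  # placeholder ARN
    FunctionName="process-orders",                                     # placeholder function
    BatchSize=10,
    Enabled=True,
)

# Guardrail: reserve (and thereby cap) concurrency so a surge in this
# workload cannot starve other functions sharing the account's pool.
lambda_client.put_function_concurrency(
    FunctionName="process-orders",
    ReservedConcurrentExecutions=100,
)
```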

Conclusion

Enterprises should evaluate public cloud providers based on their compliance with these tenets. Taken together and incorporated into the providers’ operating procedures, these principles ensure that customers can achieve the full outsourcing and automation benefits associated with NoOps for cloud computing.

Complex multicloud serverless environments are coming fast to enterprises everywhere, so there’s a growing need for a common framework such as this for serverless cloud administration. According to Evan Weaver, CTO at Fauna:

“Serverless computing means you don’t have to think about where servers are located or how they’re operated behind the scenes. Serverless means the physicality of your infrastructure is completely abstracted away. It reduces integration friction and vendor lock-in across the board. Pay-as-you-go means you don’t have to spend time capacity planning to minimize your costs; you can focus exclusively on building a well-architected application. And with proper infrastructure behind the scenes, serverless is cheaper both for the vendor and the customer, resulting in real and substantial cost savings and productivity benefits at the same time.”

CIOs and other enterprise IT professionals should make sure that disparate cloud administrators implement these serverless tenets consistently. The tenets should guide the planning and management of complex cloud environments in order to prevent dilution of the serverless application service levels experienced by end users.
