AWS Lambda was announced last week at the Amazon Web Services (AWS) re:Invent 2014 conference, and the service drew excitement and interesting opinions from all around the Web. Lambda promises developers the ability to focus on their projects rather than on the infrastructure that has historically been a constant consideration. Among the many highlights of the conference, the statement from AWS Senior Vice President Andy Jassy that “cloud is the new normal” stood out as a testament to the state of cloud adoption.
Developing software as an aggregation of events that respond to changes in data or state is not a new trend. It originates in traditional UI development models (think Swing or the Visual Basic component model) and in the much-debated event-driven SOA.
Lately, event-driven programming and the architecture around it have been positioned as the next evolution of SOA. Traditionally, SOA has embraced a request-response model and is largely synchronous in its design. The event-based model introduced asynchronous, push-and-pull, fire-and-forget methodologies. Most developers today already do some form of event-driven programming, such as setting onclick events in jQuery or writing AJAX callbacks, so this is not an entirely new idea to bend our heads around.
However, functional reactive programming takes this to the next level. By modeling continuous event streams and building subscribers to those streams, it simplifies the idea of event-driven systems. For developers, it is about minimizing the moving parts of building large-scale event-driven systems.
So Where Does Lambda Fit In, and Why Is This Exciting?
The advent of AWS in 2006, and of cloud-native services later, democratized distributed programming for the masses. AWS flagship customers such as Netflix, Animoto and Pinterest educated a new era of developers, architects and system administrators on what is possible when building on the cloud. Thousands of developers evolved from using EC2 and S3 as drop-in replacements for virtual servers and file storage to leveraging stateless infrastructure that advocates replacement over repair. This evolution in thinking led to mass adoption of ideas such as immutable infrastructure and automated provisioning.
A modern online application is now defined as a collection of web services that persist their state outside themselves, run independently on independent infrastructure, can be scaled horizontally, and can be upgraded with minimal or no downtime for the end user. Expecting ridiculously high durability, like the eleven nines of AWS S3, became common parlance among developers who had otherwise never been introduced to the concept unless they worked at the likes of Google, Facebook, Yahoo, Amazon or other flagship large-scale web platforms. Large-scale distributed applications became the new norm.
I sense Lambda could bring the same advancement to the general community. Making reactive programming, event-driven systems and functional thinking the new norm would mean greater adoption of these ideas in the mainstream, for enterprises and startups alike.
Introducing the Lambda Service Model from AWS
The Lambda service could not have arrived any earlier, as it needed the foundation of other scalable, highly available and durable services. For IoT enthusiasts and practitioners, the prospect of having your code run on legions of AWS infrastructure and spring into action within milliseconds of an event occurring makes for an attractive developer environment.
However, this open invitation also demands strong discipline in how code runs and reacts to events. It enforces the Lambda function as a stateless entity: the function has to assume that local file system contents, a particular child process and the like may not exist beyond the realm of the current request.
The execution limit on a Lambda function is currently set to 60 seconds. That is both a constraint and an opportunity to build services with the right granularity in what action they perform on an event. It will also help developers put restrictions on the composition of each Lambda function built to react to an event. Architects will need to look at decomposing a business problem into event emitters, event consumers (or sinks) and event channels.
The Lambda billing model is also an indicator of how AWS wants developers to think about services. Billing based on the duration of execution and the size of the memory allocated is a new model for rethinking function and service composition.
As a developer, I would want to optimize how long a Lambda function takes to execute, along with the memory I provision for it. A small microservice should complete its execution in milliseconds with minimal memory at its disposal. Consider this another methodology helping architects rethink their microservices architecture composition.
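To make that optimization concrete, here is a back-of-the-envelope cost sketch. The per-GB-second price and the 100 ms rounding are assumptions based on the pricing announced at launch, so treat the numbers as illustrative and check the current Lambda pricing page:

```javascript
// Rough Lambda cost model: duration rounded up to the nearest 100 ms,
// cost scaling with provisioned memory, expressed in GB-seconds.
var PRICE_PER_GB_SECOND = 0.00001667; // assumed launch price, in USD

function invocationCost(durationMs, memoryMb) {
  var billedMs = Math.ceil(durationMs / 100) * 100; // round up to 100 ms
  var gbSeconds = (memoryMb / 1024) * (billedMs / 1000);
  return gbSeconds * PRICE_PER_GB_SECOND;
}

// A lean 128 MB function finishing in 80 ms is billed for 100 ms:
console.log(invocationCost(80, 128).toFixed(10));
// The same work in a 1 GB function taking 300 ms costs far more:
console.log(invocationCost(300, 1024).toFixed(10));
```

The model makes the trade-off explicit: trimming either execution time or provisioned memory translates directly into a lower bill, which is exactly the pressure toward small, fast functions described above.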
Issues and Opportunities
The rise of Lambda adoption will require a broader range of supporting infrastructure, especially for managing the lifecycle of such functions. The dashboard offered by AWS at launch provides only typical listing, deletion, updating and basic monitoring of these functions. Creating a function is currently done through an in-line, console-based editing tool or by uploading a compressed Node.js function along with its dependencies. However, I foresee immediate integration with current editors like Eclipse, IntelliJ, Sublime and TextMate. Build and deployment pipelines, such as the newly announced CodeDeploy and pre-existing toolsets like Jenkins, will have to be integrated with the deployment model of Lambda functions. The AWS Lambda API offers an upload-function operation for uploading ZIP files of the function code.
The myriad options a developer has with this new offering make for some interesting choices. Building a truly elastic, highly available and scalable application will mean deciding what remains in traditional EC2-style compute instances and what goes into Lambda functions.
This is also an invitation to a new era of architectures and patterns that will be infused into the community, along the lines of Netflix OSS.
AWS, along with other players like Google and Microsoft, has strived to make the cloud the new normal. With Lambda, AWS has taken a step toward making reactive, event-driven systems and a functional approach to software development the new normal as well.
For those lucky few who got early access to Lambda, it's an invitation to experience the future. For the rest, until you get an invite, the AWS Blog post by Jeff Barr is the best thing to get you excited and started.
Feature image via Flickr Creative Commons.
Vivek Juneja is an engineer, based out of Seoul, and focused on cloud services and mobility. He currently works as a solutions architect at Symphony Teleca, and is a co-founder of the Amazon Web Services User Group in Bangalore. He started working with cloud platforms in 2008, and was an early adopter of AWS and Eucalyptus. He is also a technology evangelist and speaks at various technology conferences in India. He writes @ www.cloudgeek.in and www.vivekjuneja.in and loves grooming technology communities. You can also reach him by email: email@example.com
The New Stack is a wholly owned subsidiary of Insight Partners. TNS owner Insight Partners is an investor in the following companies: Docker.