
Netlify’s Approach to the CDN, Microservices, and Breaking Down Your Monolith

May 1st, 2018 9:44am

On today’s episode of The New Stack Makers podcast, TNS founder Alex Williams explores how Netlify created its content delivery network, the rise of microservices and AWS Lambda, and how working at scale has impacted today’s developers and their workflows. Williams was joined for this interview, recorded during SXSW, by Matt Biilmann, Netlify CEO and co-founder, and Chris Bach, Netlify president and co-founder.

When highlighting his background, Biilmann explained that the features that were once commonly requested by developers are now things of the past, as newer and more efficient technologies take their place.

“The way front-end developers work has fundamentally changed. When I started Webpop, the number one feature request, no comparison, was for FTP access. Now it’s like, what? Why would you even want an FTP? Git became this massive phenomenon not only for version control, but for developers to interact. The browser became so much more powerful than it had before. Services you could call directly from the browser, and front-end build tools completely changed how front-end developers work.”

Bach went on to note how Git-centered workflows and microservices have “changed the name of the game.”

The team at Netlify had to think carefully about how it put together its build tools. Biilmann highlighted that when creating its CDN, Netlify wanted to serve the actual HTML files while supporting automatic deploys with instant cache invalidation. By taking snapshots of the site in a consistent state, Netlify keeps a stable reference point for the cache and can switch the next version live in a single step.

“We are very opinionated about the architecture. The architecture is part of this cache guarantee, you’re never going to have content that’s rendered on the fly by a server talking to a database. You’re going to have a build process that says, ‘Now the new version is ready,’ and you’re going to do a release of that. Once you buy into that architecture, then we can give you a workflow around it that’s completely automated, where you don’t have to think about caching and caching automation, and we can automate that for you,” said Biilmann.
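The workflow Biilmann describes can be sketched in a few lines of code. This is an illustrative model only, not Netlify's actual implementation: deploys are immutable snapshots of the built site, and a single "current" pointer is flipped atomically at release time, so every cached path updates at once and rollback is just repointing. The `Site` class and its method names are hypothetical.

```python
# Illustrative sketch (not Netlify's implementation): atomic deploys
# modeled as immutable content snapshots. All serving goes through one
# "current" pointer, so publishing a new build switches every path at
# once -- no per-file cache purging, and rollback is instant.
import hashlib


class Site:
    def __init__(self):
        self.snapshots = {}   # deploy id -> {path: content}
        self.current = None   # pointer to the live deploy

    def deploy(self, files):
        """Store a finished build as an immutable snapshot; return its id."""
        digest = hashlib.sha256(
            repr(sorted(files.items())).encode()
        ).hexdigest()[:8]
        self.snapshots[digest] = dict(files)
        return digest

    def publish(self, deploy_id):
        """Atomically flip the live pointer; old snapshots stay for rollback."""
        self.current = deploy_id

    def serve(self, path):
        """Serve pre-built content from the live snapshot (no runtime render)."""
        return self.snapshots[self.current].get(path, "404")


site = Site()
v1 = site.deploy({"/index.html": "<h1>v1</h1>"})
site.publish(v1)
print(site.serve("/index.html"))  # <h1>v1</h1>

v2 = site.deploy({"/index.html": "<h1>v2</h1>"})
site.publish(v2)                  # every cached path updates in one step
print(site.serve("/index.html"))  # <h1>v2</h1>

site.publish(v1)                  # rollback is just repointing
```

Because content is never rendered on the fly by a server talking to a database, "the new version is ready" becomes a single pointer flip, which is what makes the caching automation Biilmann describes possible.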

In this Edition:

4:05: The issues with monolithic apps and scalability
6:23: Separating the presentation layer from the back end, and how incremental updates happen
9:05: Cache and resource-driven methodology
13:06: Netlify’s creation to aid in developer workflows
13:45: Exploring Netlify’s underlying architecture
15:28: How Netlify is using AWS Lambda
