
Data Streaming, for When Micro-batching Just Isn’t Fast Enough

Micro-batching offers many, but not all, of the advantages of real-time data streaming. But in order to succeed, streaming data needs the same tooling that everyone expects from batch data.
Mar 7th, 2023 3:00am

Micro-batching is as close to real-time as batch can get, achieving latencies of mere minutes in some cases while running on the same infrastructure as higher-latency batch systems. But it can only do so much.

When is the best time to move over to data streaming? It's easier to look back and say, "streaming was a great choice," than it is to actually make the investment in newer technologies.

Back in October, I had the pleasure of attending the debate “If Streaming is the Answer, Why Are We Still Doing Batch?” at Confluent‘s Current 2022 conference.

Moderated by Kris Jenkins, head of developer relations at Confluent, the panel discussion included Amy Chen, partner engineering manager at dbt Labs; Adi Polak, vice president of developer experience at Treeverse; Tyler Akidau, principal software engineer at Snowflake; and Eric Sammer, CEO of Decodable.

The talk, now available for viewing here, covered several talking points rich with debate and conflicting viewpoints.

Micro-batching: It’s Not a Dirty Word

Micro-batching isn't cheating, no matter what anyone says, but it does have severe limitations, and it isn't a real-time streaming system. Micro-batching is a low-latency batch system that can deliver batch data in mere minutes in some cases.
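The mechanics are simple: instead of a nightly job over a huge dataset, the same batch logic runs on a short timer over whatever records have accumulated. A minimal sketch in Python (the function names `micro_batch` and `process_batch` are illustrative, not from any panelist's stack):

```python
import time
from collections import deque

def micro_batch(source, process_batch, interval_s=60):
    """Drain `source` and hand accumulated records to `process_batch`
    every `interval_s` seconds -- batch semantics on a short timer."""
    buffer = deque()
    deadline = time.monotonic() + interval_s
    for record in source:
        buffer.append(record)
        if time.monotonic() >= deadline:
            process_batch(list(buffer))   # one small batch job
            buffer.clear()
            deadline = time.monotonic() + interval_s
    if buffer:                            # flush whatever is left at the end
        process_batch(list(buffer))
```

The latency floor is the interval itself, plus the batch job's runtime: shrink the interval too far and the scheduling overhead dominates, which is exactly where a true streaming system pulls ahead.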

Chen explained, "folks are setting up micro-batches to [simulate] a very similar streaming effect." She went into a little detail about a customer in the live-auction space: "The reason why they decided to go micro-batch is because it's easier. They can do it on the same infrastructure as their batch processes."

But she also admitted that, for this specific customer, micro-batching might not be the best fit: "they're just going to run it, but in those cases, lower latency is needed. It's not [that] you should have to wait around once you're trying to get [a service-level agreement of under] two minutes."

That particular use case captures both sides of micro-batching. The pros: the same infrastructure and familiar frameworks and workflows, meaning companies don't have to provision new vendors; nothing really changes other than the latencies. The cons: micro-batching can only go so fast, and it doesn't scale well.

There is a point where micro-batching outlives its useful lifespan. Of this, Chen said, "at the end of the day [it] becomes a conversation of labor-intensive versus resource-intensive. We've noticed, as folks start to do more micro-batching, that as the data scales up, it just becomes too expensive."

Use Cases — Past, Present, Future

It’s certainly easier to look back and say, “oh yeah that’s a perfect use case for streaming,” than it is to push a market forward or create a new marketplace altogether. Take the example of high-frequency trading. “At some point, Bloomberg terminals got faster and faster and faster,” Sammer explained, “and then eventually somebody went like, ‘why don’t we make this a software application?’ and high-frequency trading was born.” Those first conversations probably weren’t so easy.

Chen brought this conversation into the more recent past with a real-time dashboard created for non-technical stakeholders to make real-time decisions. Her original sentiment was, "you don't need a real-time dashboard." But on a holiday weekend, when the dashboard was active and the technical staff was not, an erroneous $1,000-off coupon hit the coupon site PayPal Honey.

For three hours, clients placed orders that were later canceled, but the damage was minimized by the stakeholders' ability to make real-time decisions. For the stakeholders, it was likely as good an advertisement for real-time dashboards as there ever was.

So What Is It: Streaming or Batch?

Well, it's both. Many clients want real-time, but batch isn't going away anytime soon, if ever.

Akidau often meets clients who want real-time but realize they don't need it once he explains the costs and complexity of real-time systems versus batch systems. Clients' wants versus their actual needs show that "there's not actually that many use cases that need sub-minute latency," he said of streaming use cases, though he agrees that "it's nice to have."

So how can someone determine whether real-time is a want or a need? Akidau said it depends on the answer to one question: "how quickly do you need to take action?" He provided a template for both scenarios: "if it's interactive or if this needs to happen transactionally, that's the case where real-time streaming makes sense. Anything that's not that, and you're probably not needing that kind of latency."
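Akidau's rule of thumb can be written down as a tiny decision helper. This is a sketch of the heuristic as stated on the panel, not anyone's production logic; the function name, the parameters, and the 60-second cutoff for "sub-minute" are all illustrative assumptions:

```python
def needs_streaming(action_latency_s: float,
                    interactive: bool = False,
                    transactional: bool = False) -> bool:
    """Akidau's template: real-time streaming makes sense when the response
    must be interactive or transactional, or action is needed within about
    a minute; otherwise sub-minute latency is a nice-to-have, not a need."""
    return interactive or transactional or action_latency_s < 60

# A nightly report can wait; blocking a fraudulent transaction cannot.
needs_streaming(3600)                    # False -> batch or micro-batch is fine
needs_streaming(5, transactional=True)   # True  -> real-time streaming
```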

Technology Gravity

Polak cites technology gravity as the reason streaming is still an uphill climb. Data gravity, the more widely understood concept, is the tendency of a body of data to attract other data: if customers are happy with Snowflake, then it's likely more customers will use Snowflake to manage their data.

Following that logic, technology gravity is similar but with respect to the guardrails and processes that enable and enforce best practices for healthier data. Polak says, "The question is, what tools are going to enable us to still work with relatively similar aspects within the different teams."

The concept of technology gravity adheres to the idea that existing skills will push a dev team forward more than the whole team learning completely new skills.

Sammer's comment about streaming being "weird" follows the same logic of technology gravity. Sammer, who firmly believes streaming should take center stage, said the solution is to create a shared "conceptual model, schema, and time." He recommended unifying all the concepts that are not specific to either streaming or batch.

Sammer said, “normalization of concepts, strict API definitions, and tooling that works that come with data quality should not be different for streaming and batch.”
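One way to read that unification argument: the business logic and schema live in a single place, and only the execution mode differs. A minimal sketch of the idea (the `normalize`, `run_batch`, and `run_streaming` names are hypothetical, invented here to illustrate the point, not from any panelist's product):

```python
def normalize(record: dict) -> dict:
    """Shared record-level transform -- one schema, one set of semantics,
    regardless of whether it runs in batch or streaming mode."""
    return {"user": record["user"].lower(),
            "amount": round(record["amount"], 2)}

def run_batch(records):
    # Batch: apply the shared transform to a bounded collection at once.
    return [normalize(r) for r in records]

def run_streaming(stream):
    # Streaming: apply the same transform one record at a time, unbounded.
    for record in stream:
        yield normalize(record)
```

Because both paths call the same `normalize`, data-quality checks, API contracts, and tests written against it hold in either mode, which is the "should not be different for streaming and batch" property Sammer describes.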
