Boost Customer Engagement with Real-Time Data

A look at the hallmarks of a streaming data platform, the key to making it happen.
Jan 26th, 2023 8:42am

The real-time economy is coming — and Silicon Valley technology disruptors are sizing up your sector.

Gone are the days when “digital first” was sufficient for businesses: the post-pandemic online pool is crowded, with years’ worth of digital migration crammed into just a few months. More than half — 53% — of organizations now have an enterprisewide digital transformation strategy.

The differentiator is real time. Those who succeed will be what analyst firm IDC has called “digital-first aficionados” — those who innovate in customer engagement. This may be through their use of real-time data-management and -processing technologies to make tailored offers while customers shop or bank online, or their ability to identify and tackle fraudulent activity during a transaction or replace a component part before the machine breaks.

This kind of interaction is something at which Silicon Valley giants and, increasingly, digital pure-play startups excel. Amazon and Netflix, for example, redefined the customer experience in their respective fields of shopping and films by providing personalized services and tailored offerings in that precious first window of customer engagement.

Amazon itself has explained that it’s not just about the challenge, but about meeting its demands: “One of the problems with Amazon’s vast selection is it’s enormous. It becomes critical for Amazon to understand the customer deeply in order to make it easy for the customer to find what they want.”

In short: Understand the customer, deeply.

Achieving this means building a 360-degree view of the customer that is “fresh” with up-to-date insights based on real-time behavior and needs. Establishing that fresh view means combining streaming event data — from actions such as website clicks, machine communications and device transactions, generated in milliseconds and describing what is happening in the moment — with historical data held in static systems, which supplies context and helps determine the value of those events. The result must be delivered at the precise moment it’s needed to be actionable, without waiting for a database to write the data.
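For illustration, here is a minimal Kafka Streams sketch of that combination, joining a live click stream against a table of stored customer profiles as each event arrives. The topic names (“clicks”, “customer-profiles”, “enriched-clicks”), the broker address and the plain-string payloads are all hypothetical stand-ins, not a prescribed setup.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;

import java.util.Properties;

public class FreshViewJoin {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Live events: what the customer is doing right now.
        KStream<String, String> clicks = builder.stream("clicks");

        // Historical context: a changelog-backed table of customer
        // profiles, kept current as profile records change.
        KTable<String, String> profiles = builder.table("customer-profiles");

        // Enrich each in-flight event with stored context, with no
        // waiting for a batch ETL job to land the data in a database.
        clicks.join(profiles, (click, profile) -> profile + " | " + click)
              .to("enriched-clicks");

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "fresh-view");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG,
                Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG,
                Serdes.String().getClass());
        new KafkaStreams(builder.build(), props).start();
    }
}
```

The join happens in the stream itself, so enrichment keeps pace with the events rather than with a batch schedule.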

Achieving this demands an IT infrastructure capable of real-time stream processing. However, while many may believe they are on the road to real time, they are mistaken.

De Facto but No Database

Apache Kafka has become a de facto standard for streaming event data in real time. This open source event-streaming technology has proved popular for its speed and ease of implementation among a particularly important category of IT adopters: developers. Streaming analytics firm Swim’s “State of Streaming Data” report found that nearly half of organizations are generating insight in streams, with Kafka the leading choice of enabling technology.
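Part of that popularity is how little code a developer needs to start consuming events. The sketch below tails a topic with the standard Java client; the broker address, consumer group and topic name (“transactions”) are hypothetical.

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class TransactionTail {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "fraud-watch");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("transactions"));
            while (true) {
                // Events arrive within milliseconds of being produced.
                ConsumerRecords<String, String> records =
                        consumer.poll(Duration.ofMillis(100));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("event %s -> %s%n",
                            record.key(), record.value());
                }
            }
        }
    }
}
```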

Significantly, Swim discounts from the streaming technology stack a key enterprise data staple: the traditional database. Yet building that 360-degree view still requires a source of context-rich data, namely a fast data store.

As Swim CEO Ramana Jonnala notes, persisting data to “latency-prone data stores” would limit organizations’ “ability to promptly respond to and act on business-critical events.”

Extract, transform, load (ETL) is the hurdle: writing large volumes of streamed data, running aggregations, and processing and presenting results were historically done in end-of-day batches and are now done as microbatches. Each step in this sequence injects delay, causing the data to lag further behind the event it describes and limiting its value.

Building real-time applications means rethinking your architecture around a real-time stream-processing platform, something capable of ingesting and processing data at speed from different sources, be they streaming, such as website transactions, or static, such as systems of record like CRM or other databases.

What are the hallmarks of that streaming data platform?

  • Streaming flow: This is the “plumbing” that moves data from sources to sinks: software capabilities that connect to the various data sources emitting information — referred to as events — and deliver it onward.
  • Streaming engine: This is key to ingesting, transforming, distributing and synchronizing your data. It should process data continuously, as it’s generated, and feed the results to your analytics, with capabilities such as windowing to examine data within a given period and watermarking to handle events that arrive out of sequence (a windowed-count sketch follows this list). It should also be able to restart jobs from snapshots of state alone, for resilience, consistency and availability.
  • Data processing: Enrichment and processing of data at real-time speed demands fast and consistent distributed computation. This is a challenge when data must be processed across large environments where processing and network availability cannot be guaranteed.

The answer is to use the resources at your disposal: harness pools of memory in local servers and clusters for processing, so that data does not need to cross the network to a data center and local processing does not need beefing up with additional hardware. The streaming engine should be integrated with this computational layer for performance at scale. An in-memory architecture delivers the sub-millisecond responses required for real-time analytics, performing millions of complex transactions each second to join stored data with streaming data.

  • Machine learning interface: The last mile in this map is machine intelligence. ML provides the potential automation to engage intelligently with customers and to conduct transactions at scale, yet around half of ML projects fail to make it from pilot to production. Closing the gap means making the machine model operational. Achieving that takes an interface to a memory grid capable both of automatic parallel processing across clusters and of sharing the model across different pipelines. This delivers scale, performance and reliability without additional coding or hardware (a second sketch after this list shows the shape).
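To make the streaming engine’s windowing and watermarking concrete, here is a small sketch in the same hypothetical Kafka Streams setup as above: it counts clicks per customer in one-minute windows, with a grace period playing the watermark’s role of bounding how late an out-of-sequence event may arrive and still be counted.

```java
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.TimeWindows;

import java.time.Duration;

public class WindowedClickCounts {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        builder.<String, String>stream("clicks")
               .groupByKey()
               // Tumbling one-minute windows. The grace period acts like a
               // watermark: events arriving up to 10 seconds out of
               // sequence are still folded into the window they belong to.
               .windowedBy(TimeWindows.ofSizeAndGrace(
                       Duration.ofMinutes(1), Duration.ofSeconds(10)))
               .count()
               .toStream()
               // Flatten the windowed key back to a plain customer key.
               .map((windowedKey, count) ->
                       KeyValue.pair(windowedKey.key(), count.toString()))
               .to("click-counts-per-minute");

        // Configuration and startup are the same boilerplate as in the
        // join sketch above and are omitted here.
    }
}
```

In Kafka Streams specifically, the snapshot-style resilience described above comes from changelog topics backing the window state, so it needs no extra topology code; other engines expose equivalent checkpointing.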
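And a minimal sketch of that last mile: scoring events inline as they flow, rather than in a separate batch system. FraudModel, its score() method and the model-loading step are hypothetical stand-ins for whatever inference interface your engine or memory grid exposes; only the shape is the point.

```java
import org.apache.kafka.streams.StreamsBuilder;

public class InlineScoring {
    // Hypothetical inference interface: one trained model, one score call.
    interface FraudModel {
        double score(String transaction);
        static FraudModel load(String name) {
            return tx -> 0.0; // placeholder: load real weights here
        }
    }

    // Loaded once per processing node and shared across the pipeline.
    static final FraudModel MODEL = FraudModel.load("fraud-model-v3");

    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        builder.<String, String>stream("transactions")
               // Score in-flight, so a fraud decision lands during the
               // transaction rather than after a nightly batch run.
               .mapValues(tx -> tx + " score=" + MODEL.score(tx))
               .to("scored-transactions");

        // Configuration and startup omitted, as in the earlier sketches.
    }
}
```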

Conclusion

Success in the real-time economy demands a new information architecture, one with real-time stream processing that powers turbocharged analytics, customer promotions, monitoring and more. It must do that during an incredibly tight window of time as inputs and outputs continuously change.

TNS owner Insight Partners is an investor in: Pragma.