
Vendor Checklist for Real-Time Data Meshes

2 Jun 2022 6:20am, by Tim Wagner

In Part 1 of this series, we highlighted the challenges of real-time data sharing. In Part 2, we defined the real-time data mesh and discussed the key tenets for incorporating one into modern IT stacks. This post focuses on what’s needed to effectively evaluate real-time data-sharing solutions.

Tim Wagner
Tim is the inventor of AWS Lambda and a former general manager of AWS Lambda and Amazon API Gateway services. He has also served as vice president of engineering at Coinbase, where he managed design, security and product management teams. Tim co-founded Vendia to help organizations of all sizes share data more effectively across clouds and companies, and he serves as its CEO.

Given the importance, broad scope and platform-centric nature of real-time data sharing in a modern IT environment, it’s especially important to evaluate and select vendors capable of delivering the full set of capabilities organizations require. This section can be used as a vendor checklist to ensure that solutions — whether developed in-house, outsourced to a consulting company or purchased from a vendor — provide the range of capabilities real-world challenges demand:

Multicloud and Software-as-a-Service (SaaS) integrations as fundamental features — The single most fundamental feature of a real-time data mesh solution is that it can seamlessly span the most important divides in the modern IT landscape: multiple clouds and different SaaS applications, such as Salesforce and Microsoft Dynamics. Without this feature, much of the undifferentiated heavy lifting of data sharing will remain in the hands of the IT organization, radically blunting the effectiveness of real-time data sharing. Best-of-breed vendors will support data sharing across the major public clouds and application platforms with a connector strategy that makes ingress from and egress to other services and platforms fast to develop and easy to operate and maintain.

SaaS delivery model with a “zero infrastructure” footprint — IT organizations already struggle under the weight of managing too much hardware and software infrastructure, including their large compliance and security surface area. Data mesh solutions, because they can carry data with varying levels of exposure risk, have to be designed to handle the worst-case scenario, making securing their end-to-end infrastructure even more complex. Acquiring these capabilities through a SaaS solution that doesn’t expand a company’s infrastructure footprint is critical to avoiding direct and indirect cost and staffing bloat. Cloud-based SaaS platforms for data sharing also confer the economic benefits of shared development and deployment costs, further improving ROI versus in-house and outsourced manual development approaches.

Cloud native design with automatic scaling and fault tolerance — Of all the elements in an IT stack, the data-sharing layer is one that most benefits from an innovation-forward design. Solving for fault tolerance across multiple regions, clouds and departments/partners is challenging enough, and when combined with the need for dynamic scaling that is simultaneously highly available and cost-effective, most homegrown or consulting-based solutions peter out quickly. Cloud native designs incorporating the latest advances in serverless and container-based technologies offer the best options for creating a solution that offers maximum uptime without sacrificing price/performance ratios.

Cost-effective — Unlike blockchains and ERP systems that typically need to be “scaled to peak” and then kept there 24/7, real-time data mesh providers will employ scale-by-request, providing tightly enveloped costs that vary with actual usage rather than scaling with peak infrastructure capacity requirements.
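The gap between “scale to peak” and “scale by request” can be made concrete with back-of-the-envelope arithmetic. In the sketch below, all prices, rates and traffic figures are hypothetical and purely illustrative — they are not real vendor or cloud prices:

```python
# Illustrative cost comparison: peak-provisioned vs. usage-based billing.
# All numbers here are hypothetical, chosen only to show the shape of the math.

HOURS_PER_MONTH = 730  # average hours in a month

def peak_provisioned_cost(peak_rps: float, cost_per_rps_hour: float) -> float:
    """'Scale to peak': capacity is sized for the busiest moment and billed 24/7."""
    return peak_rps * cost_per_rps_hour * HOURS_PER_MONTH

def per_request_cost(total_requests: int, cost_per_million: float) -> float:
    """'Scale by request': cost tracks actual usage, not provisioned capacity."""
    return total_requests / 1_000_000 * cost_per_million

# A spiky workload: 1,000 req/s at peak, but only ~50M requests all month.
peak = peak_provisioned_cost(peak_rps=1_000, cost_per_rps_hour=0.01)
usage = per_request_cost(total_requests=50_000_000, cost_per_million=1.00)

print(f"scale-to-peak: ${peak:,.0f}/mo  vs  scale-by-request: ${usage:,.0f}/mo")
```

For a bursty workload like this one, capacity billed around the clock costs orders of magnitude more than billing that follows actual requests — which is why usage-varying costs are worth a line item on the checklist.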

Green tech with high utilization — As carbon footprint reduction becomes more critical to investors and public company reporting and transparency requirements start to expand, choosing “green tech” becomes ever more important. Fundamental to lowering carbon emissions is reaching high levels of data and compute utilization; otherwise, most infrastructure capacity sits idle, producing emissions without delivering value — the worst possible outcome. Data mesh solutions based on modern, serverless technologies offer highly efficient, “100% utilization” solutions based on massively multitenanted cloud-based strategies. These benefits pass through to the companies that deploy them, resulting in significant carbon savings.

Compliant and secure by design — Data-sharing solutions are, because of their nature, the target of many regulatory, security and information-handling policies and controls. Building out not just the solution but the necessary monitoring, reporting and management capabilities needed to ensure constant compliance across assurance programs, such as SOC2, PCI, GDPR, CCPA, HIPAA, FedRAMP and more, is time-consuming and costly and typifies the sort of undifferentiated heavy lifting that can be transferred to a platform vendor whose sole job is to deliver on these outcomes 24/7.

Capable of permanent, durable storage — A viable real-time data mesh needs to be capable of more than just transiting data from point to point; it needs to be capable of storing an unlimited amount of information for an unlimited amount of time. Kafka and other streaming data solutions are fantastic building blocks for connecting systems together, but they lack permanent storage, cross-company (and cross-cloud) support and other facilities needed to span the necessary canyons.
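The retention gap is visible in Kafka’s own broker configuration: out of the box, data ages out rather than being kept forever. The values below are Kafka’s documented defaults (production clusters often override them), shown only to illustrate that retention is a tunable window, not a permanence guarantee:

```properties
# Kafka broker defaults (server.properties)
log.retention.hours=168    # segments older than ~7 days become eligible for deletion
log.retention.bytes=-1     # no size-based cap by default, but time-based deletion still applies
```

Retention can be lengthened per topic, but a durable system of record also needs cross-company identity, access control and cross-cloud replication — the facilities a data mesh layer is expected to supply on top of streams.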

Going Further

To go deeper, especially on the topic of “data mesh integrates the two planes of operational and analytical data and applications,” see the book “Data Mesh: Delivering Data-Driven Value at Scale” by Zhamak Dehghani, director of emerging technologies in North America at ThoughtWorks.

A Key Role in IT Stacks

Real-time data meshes play a key role in IT stacks … whether they appear implicitly and accidentally or explicitly and with thoughtful consideration. Platforms such as Vendia offer a leap forward from historical approaches to EAI/EiPaaS, ERP and first-generation blockchains by automatically creating a single source of truth from a standards-based data model and then managing it through a zero-footprint SaaS deployment.

Companies can realize these benefits quickly by applying such a platform to application-to-application sharing challenges or by selectively migrating from existing file-based sharing approaches. Teams focused on innovation or broader operational data-sharing solutions can incorporate fine-grained data sharing for industry-spanning, best-of-breed practices when building supply chains, financial settlement systems or other use cases that benefit from cross-company “single source of truth” outcomes. Selecting a vendor to deliver on these outcomes requires understanding the breadth of challenges such a platform needs to support — and benefits from the checklist provided above.

The New Stack is a wholly owned subsidiary of Insight Partners, an investor in the following companies mentioned in this article: Real.

Featured image via Pixabay.