Redis Labs: Why Latency Shouldn’t Be an Issue in the Age of AI
During its annual RedisConf 2021 user conference, Redis Labs laid out its roadmap to extend its namesake in-memory database system to more easily accommodate microservices and artificial intelligence-based workloads.
Its new Redis 7.0 features and other releases — most of which will become generally available (GA) during the first half of the year — should continue to build on the core NoSQL database functionality. Improving database consistency and scaling capacity while maintaining ultra-low latency across multiple nodes in multicloud and on-premises environments also continues to be its main mission. AI will also be an increasingly important capability under the Redis umbrella.
Torsten Volk, an analyst for Enterprise Management Associates (EMA), said “coming from its original ‘real-time’ angle, Redis has the performance chops to address the performance challenges we experience when running microservices apps in hybrid environments, where enterprises often struggle to ensure consistent performance.”
“Running these distributed apps on top of Kubernetes clusters in public clouds, data centers or edge locations can significantly increase operational risk, depending on the robustness of the app code against sudden latency and also influenced by the ability of corporate IT to provide intelligent deployment and configuration policies,” Volk told The New Stack. “Redis could simplify this whole mess simply by providing a federated data platform that brings consistent performance across app architectures and deployment models. A very interesting play indeed.”
During his talk “Redis 7.0 and Beyond,” Yossi Gottlieb, chief architect at Redis Labs, said that while much of Redis Labs’ mission centers on the “development of its core project” and on “empowering the community and getting all individual contributors to be active,” Redis “is really more than just the core project.”
“I believe that part of the mission of scaling Redis is also scaling the ecosystem and getting different parts of the ecosystem to better complement and to be compatible with each other and basically be able to achieve more,” Gottlieb said.
The AI Equation
Redis has ramped up its AI support for feature stores — repositories of prepackaged, reusable data transformations for machine learning — through RedisAI. It is available for on-premises deployments now and will be available for Redis Enterprise Cloud in the second half of 2021.
As a case example, Redis 7.0 can serve as an online feature store, RedisAI can apply machine learning models for inference against those features and RedisGears can sync with an offline feature store — all while maintaining low latency.
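This pattern can be sketched with RedisAI commands against a running Redis instance that has the RedisAI module loaded. The key names, tensor shape and model name below are purely illustrative, and the model itself would first need to be loaded with a command such as AI.MODELSET:

```shell
# Store a user's online feature vector as a tensor (hypothetical key and values)
AI.TENSORSET user:42:features FLOAT 1 3 VALUES 0.5 1.2 0.3

# Run a previously loaded model against the stored features,
# writing the result to an output tensor key
AI.MODELRUN churn_model INPUTS user:42:features OUTPUTS user:42:score

# Read the inference result back
AI.TENSORGET user:42:score VALUES
```

Because both the features and the model execution live inside Redis, the inference path avoids a network round trip to a separate model server — the low-latency serving Rashid describes below.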
“With RedisAI, you can bring inferencing much closer to the feature store, and effectively get lower latency feature serving and monitoring,” Taimur Rashid, chief business development officer at Redis Labs, said during a RedisConf keynote. “And then finally, if you choose to, you can sync with an offline store by using RedisGears.”
The Redis and JSON Marriage
Earlier this year, Redis Labs released version 2.0 of its RediSearch in-memory index, which the company claims offers over twice the data throughput of the previous iteration. It also adds ways to make it easier for developers to create and use indexes. Among its capabilities, RediSearch serves as a secondary index on top of Redis, removing the need for developers to maintain their own internal index structures. This helps to increase responsiveness and makes it easier for developers to perform advanced tasks such as multifield queries, aggregation and full-text search capabilities like exact phrase matching and numeric filtering for text queries.
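A multifield query with a numeric filter might look like the following RediSearch sketch (the index, prefix and field names are hypothetical, and the commands assume a Redis instance with the RediSearch module loaded):

```shell
# Define a secondary index over hashes whose keys start with "book:"
FT.CREATE books ON HASH PREFIX 1 book: SCHEMA title TEXT author TEXT price NUMERIC

# Add a document; RediSearch indexes it automatically
HSET book:1 title "Redis in Action" author "Josiah Carlson" price 35

# Combine a full-text match on one field with a numeric range on another
FT.SEARCH books "@title:redis @price:[20 50]"
```

The index is maintained by the module as hashes are written, so the application issues ordinary HSET commands and queries the index separately.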
“You have the aggregation pipeline that lets you transform the data that you have in your JSON document,” Keller said.
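Keller’s point about the aggregation pipeline can be sketched with RedisJSON and RediSearch together (a hypothetical session; the index, prefix, JSON paths and field names are illustrative, and indexing JSON documents this way requires a Redis deployment with both modules loaded):

```shell
# Index JSON documents under the "product:" prefix, mapping JSON paths to fields
FT.CREATE products ON JSON PREFIX 1 product: SCHEMA $.category AS category TAG $.price AS price NUMERIC

# Store JSON documents; they are indexed as they are written
JSON.SET product:1 $ '{"category":"book","price":35}'
JSON.SET product:2 $ '{"category":"book","price":20}'

# Aggregation pipeline: group by category and compute an average price
FT.AGGREGATE products "*" GROUPBY 1 @category REDUCE AVG 1 @price AS avg_price
```

The GROUPBY/REDUCE stages transform the indexed JSON data inside Redis rather than in application code, which is the pipeline behavior Keller describes.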