Redis’ RediSearch Secondary Index Responds Faster, Streamlines Indexing

With the launch of version 2.0, the RediSearch in-memory index for Redis is more than twice as fast as its previous iteration and offers additional ways to make it easier for developers to create and exploit indexes.

Built by Redis Labs, RediSearch is a secondary index on top of Redis that eliminates internal data structures to increase responsiveness and to make it easier for developers to perform advanced tasks such as multifield queries, aggregation and full-text search capabilities like exact phrase matching and numeric filtering for text queries.
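
To give a sense of what such a query looks like, here is a minimal sketch using the Python redis client and raw RediSearch commands. It assumes a local Redis server with the RediSearch module loaded and an index named "idx" already defined over documents with "title" and "price" fields; the index and field names are purely illustrative.

    import redis

    r = redis.Redis()  # assumes a local Redis server with RediSearch loaded

    # A multifield query: an exact-phrase match on a text field combined with
    # a numeric range filter, expressed in RediSearch's query syntax.
    results = r.execute_command(
        "FT.SEARCH", "idx",
        '@title:"electric scooter" @price:[100 300]',
    )
    print(results)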

In addition to a purported maximum 2.4-times increase in speed over RediSearch 1.6, RediSearch 2.0 supports Redis Labs’ Active-Active geo-distribution technology, scales without downtime and includes support for running on flash storage via Redis on Flash (currently in private preview).

“The addition of RediSearch to your technology stack simplifies the data infrastructure, extends applications with rich search experiences, and unlocks the power of analytics within Redis,” Pieter Cailliau, director of product, Redis Labs, told InApps.

Redis Labs changed RediSearch’s architecture for RediSearch 2.0 to achieve two main goals, Cailliau said: to improve the developer experience and to enable RediSearch to support almost all Redis Enterprise capabilities.

“Creating an index is now easier than before — you just define the schema of the index on top of your existing data and you can start searching,” Cailliau said. “No need to move data around, no need to restart Redis, and no need to update your code.”
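
A minimal sketch of that workflow, assuming a local Redis server running RediSearch 2.0 and application data stored as ordinary hashes under a hypothetical "product:" key prefix (the index and field names are made up for illustration):

    import redis

    r = redis.Redis()  # assumes a local Redis server with RediSearch 2.0 loaded

    # Existing application data: plain Redis hashes, written as usual.
    r.hset("product:1", mapping={"title": "mechanical keyboard", "price": 89})

    # Define an index over that existing data. RediSearch 2.0 picks up hashes
    # matching the prefix and indexes them in the background -- no data
    # movement, no restart, no changes to the code that writes the hashes.
    r.execute_command(
        "FT.CREATE", "product-idx", "ON", "HASH", "PREFIX", "1", "product:",
        "SCHEMA", "title", "TEXT", "price", "NUMERIC",
    )

    # Hashes written after the index exists are indexed automatically as well.
    r.hset("product:2", mapping={"title": "ergonomic keyboard", "price": 129})

    # Searching works right away.
    print(r.execute_command("FT.SEARCH", "product-idx", "keyboard"))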


In furtherance of its goal to support as many Redis Enterprise capabilities as possible, RediSearch 2.0 now supports, for example, Redis on Flash, which Cailliau said “is a more cost-effective way for running Redis.”

Always On

RediSearch can also be deployed in a “globally distributed manner by leveraging Redis Enterprise’s Active-Active technology to deliver five-nines (99.999%) availability across multiple geo-distributed replicas,” Cailliau said. This capability lets read operations, such as querying and aggregation, and write operations, such as indexing, execute at the speed of local RediSearch deployments without worrying about conflict resolution, he said.

RediSearch also brings more sophisticated data modeling to Redis by providing several indexing strategies for the value part of the key, including full-text, geo-location, numbers and tags, Cailliau said.
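
By way of illustration, a single schema can mix those field types. The sketch below is hypothetical: the index name, the "store:" key prefix and the field names are invented for the example, and a local Redis server with RediSearch is assumed.

    import redis

    r = redis.Redis()

    # One schema mixing the four field types mentioned above.
    r.execute_command(
        "FT.CREATE", "store-idx", "ON", "HASH", "PREFIX", "1", "store:",
        "SCHEMA",
        "name", "TEXT",         # full-text
        "location", "GEO",      # geo-location ("lon,lat")
        "rating", "NUMERIC",    # numbers
        "category", "TAG",      # tags, comma-separated by default
    )

    r.hset("store:1", mapping={
        "name": "Midtown Coffee",
        "location": "-73.9857,40.7484",
        "rating": 4,
        "category": "coffee,bakery",
    })

    # A query combining a tag filter with a 5 km geo radius.
    print(r.execute_command(
        "FT.SEARCH", "store-idx",
        "@category:{coffee} @location:[-73.9857 40.7484 5 km]",
    ))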

“Without indexes, Redis must perform a SCAN operation for every query, which can be extremely slow and inefficient,” Cailliau said. “And creating and maintaining these indexes manually is complex and error-prone. RediSearch maintains these indexes for the user and allows you to query across data structures in a clustered database.”
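
For contrast, a rough sketch of what a query looks like without that index: a full keyspace SCAN plus client-side filtering, the slow, manual approach the quote describes. The "product:" prefix and "title" field are the same illustrative names used in the earlier sketches.

    import redis

    r = redis.Redis()

    # Without a secondary index: walk the keyspace and filter on the client,
    # doing O(N) work for every query.
    matches = []
    for key in r.scan_iter(match="product:*"):
        doc = r.hgetall(key)
        if b"keyboard" in doc.get(b"title", b""):
            matches.append(key)
    print(matches)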

Cailliau described RediSearch as an in-memory index built on top of an in-memory database. “Together, they not only provide low-latency response times for both writing and reading, but also synchronous operations,” Cailliau said. “These two features are mandatory for real-time applications, where you have the need to be able to update your data frequently and instantaneously.”

The Benchmarks

To support its claims of up to a 2.4x improvement in indexing speeds and ingestion performance for RediSearch 2.0, Cailliau described in a blog post how Redis Labs used its full-text search benchmark (FTSB) to test publicly available data about New York City taxis. This dataset included text, tag, geographic and numeric data, as well as “a large number of documents.”

Among the results, RediSearch 2.0 was able to index 132K documents per second at an overall p50 ingestion latency of 0.4ms, compared to RediSearch 1.6’s indexing rate of 96K documents per second.


“Not only will you benefit from the boost in the throughput, but each ingestion also becomes faster,” Cailliau said. “Apart from the overall ingestion improvement due to the changes in architecture, you can now also rely on the open source Redis Cluster API capabilities to linearly scale the ingestion of your search database.”

Feature image via Pixabay.



Source: InApps.net
