The relational database is still the world’s most used category of database, but its popularity trend line has been flat for a number of years now. Meanwhile, other, more cloud-native types of database systems are trending upwards. The shift is particularly noticeable with NoSQL database systems of varying flavors, such as document stores (MongoDB), key-value (Redis), and wide-column (Cassandra).

A popularity trend graph from DB-Engines shows this shift clearly.

Part of the reason NoSQL database systems are growing so fast is their fit with so-called “big data” workloads, including Artificial Intelligence (AI) and Machine Learning (ML). Because NoSQL databases are less structured than relational databases, they are more flexible and can scale better, two important attributes for the mass of data that AI and ML systems need to process, often in real time.

To get a better understanding of why NoSQL is so popular now and how these systems deal with AI and ML, I spoke to a founder of one of the most popular NoSQL database companies, Redis Labs. Yiftach Shoolman has been the company’s CTO since 2011 and recently oversaw the launch of a new Redis Labs module, RedisAI.

“I think today, people are more open to look at new data models,” Shoolman told me. He says there are two main motivations for businesses to look at alternatives to relational databases. The first is scalability; the second is when the data models required by an application are not natively supported by relational databases. An example of the latter is real-time recommendations on e-commerce sites, which require a database that can quickly connect a lot of disparate data (one option for this is a Graph DBMS product, such as Neo4j).

Redis is a key-value database system, meaning it stores pairs of keys and values, but like Neo4j it too is optimized for the real-time web. In fact, it was invented for the real-time web. When Salvatore Sanfilippo began developing Redis back in 2009, it was because he was working on a real-time analytics engine at the time and was struggling to make it scale with a relational database. So he did what any clever developer would do: he built his own solution. That turned into the open source Redis, now the eighth most popular database in the world (and number one in the key-value category).
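For readers who haven’t used a key-value store, here is a minimal sketch of that model using the redis-py client. It assumes a Redis server running locally; the key name is purely illustrative.

```python
# Minimal key-value sketch with redis-py (assumes Redis on localhost:6379).
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

r.set("page:home:hits", 0)      # a key holding a simple value
r.incr("page:home:hits")        # atomic increment, a common real-time pattern
print(r.get("page:home:hits"))  # -> "1"
```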

Redis is an in-memory data structure store, meaning it keeps data in random access memory (RAM) rather than on disk. This makes it much faster than traditional databases and ideal for use as a caching system. That speed is what differentiates Redis from both relational databases and its NoSQL competitors (such as MongoDB and Cassandra). According to Shoolman, because of its in-memory design Redis is “very, very optimized to real-time applications [and has] sub-millisecond latency.”
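In practice, that caching role often takes the form of the cache-aside pattern sketched below, again assuming a local Redis server and redis-py; fetch_user_from_sql() is a hypothetical stand-in for a slower relational-database lookup.

```python
# Cache-aside sketch: try RAM first, fall back to the slow store on a miss.
import json
import redis

r = redis.Redis(host="localhost", port=6379)

def fetch_user_from_sql(user_id):
    # Hypothetical placeholder for a slower relational-database query.
    return {"id": user_id, "name": "example"}

def get_user(user_id):
    key = f"user:{user_id}"
    cached = r.get(key)                   # in-memory lookup, typically sub-millisecond
    if cached is not None:
        return json.loads(cached)
    user = fetch_user_from_sql(user_id)   # slow path on a cache miss
    r.set(key, json.dumps(user), ex=300)  # cache the result for five minutes
    return user
```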

One of the downsides to an in-memory approach, particularly for AI, is that it simply doesn’t have the same capacity as other database systems. Given that AI thrives on as much data as you can feed it, this could be a big issue. It’s also not as easy to query the data, which is a strength of relational databases like Oracle and MySQL (and up-and-coming cloud-native SQL databases, like CockroachDB).

While Shoolman acknowledged that Redis “is still in many cases used as a caching system,” he said the goal of Redis Labs (the company) is to help enterprises “use it as more than a cache.” The company is primarily doing this via Redis Enterprise, its commercial platform built on top of the open source Redis database.

Redis Labs is extending the core functionality of Redis into a full “database platform” via modules, of which RedisAI is one. Developed in tandem with the AI software specialists at Tensorwerk, RedisAI enables you to run an AI inference engine across your Redis database. It’s too complicated to get into in depth here, but Shoolman wrote an informative blog post about it if you want the nitty-gritty.

A key point Shoolman made on our call is that the RedisAI module does not train your models; for that you need a tool like TensorFlow or PyTorch (two open source platforms for machine learning). Where RedisAI comes in is at the application layer, when it’s time to apply the trained model to the data (inferencing) and then serve the result to the user.
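To make the hand-off concrete, here is a rough sketch of storing an already-trained model inside Redis with the RedisAI module, using redis-py. The key name, file name, and tensor names are assumptions, and the exact AI.MODELSET argument layout has varied between RedisAI releases, so treat this as an illustration rather than a recipe.

```python
# Sketch: load a pre-trained (frozen) TensorFlow graph into RedisAI for serving.
# Assumes the RedisAI module is loaded and "model.pb" was exported elsewhere.
import redis

r = redis.Redis(host="localhost", port=6379)

with open("model.pb", "rb") as f:
    model_blob = f.read()

# Store the trained model under a key so Redis can serve inferences from it.
r.execute_command(
    "AI.MODELSET", "fraud:model", "TF", "CPU",
    "INPUTS", "features", "OUTPUTS", "score",
    "BLOB", model_blob,
)
```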

“We believe that you need to train your [AI] model somewhere in the cloud,” Shoolman explained. “Once you want to do the serving or the inferencing, Redis is the right database to do that.”

Shoolman gave the example of scoring a credit card transaction. Ultimately the system must decide whether the transaction is a pass or a fail. “In order to do that,” Shoolman said, “you need to add the user profile, the merchant profile, education profile, the active transaction data itself, [then] vectorize everything and send it to the AI.”
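In code, that flow might look something like the sketch below, which reuses the hypothetical fraud:model key from the earlier snippet and RedisAI’s tensor commands (AI.TENSORSET, AI.MODELRUN, AI.TENSORGET). The feature vector, key names, and pass/fail threshold are all illustrative assumptions.

```python
# Sketch: vectorize the profiles plus the live transaction, run the model
# stored in Redis, and read back a score to decide pass or fail.
import redis

r = redis.Redis(host="localhost", port=6379)

# Features from the user profile, merchant profile, and the transaction itself,
# already flattened into a numeric vector (values are made up).
features = [0.42, 0.07, 1.0, 0.33, 0.91, 0.05]

r.execute_command("AI.TENSORSET", "txn:in", "FLOAT", 1, len(features),
                  "VALUES", *features)
r.execute_command("AI.MODELRUN", "fraud:model",
                  "INPUTS", "txn:in", "OUTPUTS", "txn:out")

reply = r.execute_command("AI.TENSORGET", "txn:out", "VALUES")
# Some RedisAI builds prepend dtype/shape metadata; the score is the last value.
score = float(reply[-1][-1] if isinstance(reply[-1], list) else reply[-1])

print("pass" if score < 0.5 else "fail")  # threshold chosen only for illustration
```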

RedisAI is kind of a conduit between the AI software and the data. And speed is the name of the game, at least where Redis is concerned. The accuracy part is critical too, of course, but presumably that’s the responsibility of the AI training platform.

According to Shoolman, one of the main decisions enterprises face going forward is whether to stick with the traditional siloed approach of relational databases or to adopt a multi-model approach that supports multiple data models against a single backend. Redis Enterprise is an example of the latter, with each of its modules enabling a different data model.

Shoolman says it depends on what you want to achieve, but there are some things relational databases simply can’t do that the multi-model approach can.

“If you think about transactions that need to combine multiple data models,” Shoolman said, “and still need to do the sub-millisecond latency, there is no way to do it in the siloed approach.”
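As a hedged illustration of what “multiple data models against a single backend” can look like in Redis terms, the sketch below assumes a deployment with the RedisJSON and RedisGraph modules loaded; the keys, document, and graph query are illustrative.

```python
# Sketch: three data models served from one Redis deployment via modules.
import redis

r = redis.Redis(host="localhost", port=6379)

# Document model (RedisJSON): store a user profile as JSON.
r.execute_command("JSON.SET", "user:42", ".", '{"name": "Ada", "tier": "gold"}')

# Graph model (RedisGraph): record a purchase for real-time recommendations.
r.execute_command(
    "GRAPH.QUERY", "shop",
    "CREATE (:User {id: 42})-[:BOUGHT]->(:Product {sku: 'abc'})",
)

# Core key-value model: keep a hot counter alongside the richer models.
r.incr("user:42:visits")
```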

In upcoming columns, I will look at how other database companies are adapting to the age of AI — including relational database vendors.

Redis Labs, MongoDB and Cockroach Labs are sponsors of InApps.

Feature image via Pixabay.