
Redis

Redis is a fast, open-source, in-memory data store. As part of Redis Stack, RediSearch is the module that enables vector similarity semantic search, as well as many other types of search.

Compatibility

Only available on Node.js.

LangChain.js accepts node-redis as the client for the Redis vector store.

Setup

  1. Run Redis on your machine with Docker, following the documentation.
  2. Install the node-redis JS client:

npm install -S redis
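For step 1, the official `redis/redis-stack` Docker image bundles Redis with the RediSearch module. A minimal sketch of the run command (the image tag, container name, and port mappings here are the commonly documented defaults; adjust as needed):

```shell
# Start Redis Stack (Redis + RediSearch) in the background.
# 6379 is the Redis port; 8001 serves the RedisInsight UI.
docker run -d --name redis-stack -p 6379:6379 -p 8001:8001 redis/redis-stack:latest
```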

Index docs

import { createClient } from "redis";
import { Document } from "langchain/document";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { RedisVectorStore } from "langchain/vectorstores/redis";

const client = createClient({
  url: process.env.REDIS_URL ?? "redis://localhost:6379",
});
await client.connect();

const docs = [
  new Document({
    metadata: { foo: "bar" },
    pageContent: "redis is fast",
  }),
  new Document({
    metadata: { foo: "bar" },
    pageContent: "the quick brown fox jumped over the lazy dog",
  }),
  new Document({
    metadata: { baz: "qux" },
    pageContent: "lorem ipsum dolor sit amet",
  }),
  new Document({
    metadata: { baz: "qux" },
    pageContent: "consectetur adipiscing elit",
  }),
];

const vectorStore = await RedisVectorStore.fromDocuments(
  docs,
  new OpenAIEmbeddings(),
  {
    redisClient: client,
    indexName: "docs",
  }
);

await client.disconnect();
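After indexing, you can verify that the index exists directly from the command line with the RediSearch `FT.INFO` command (assuming the `docs` index name used above and a local server reachable by `redis-cli`):

```shell
# Inspect the index created by RedisVectorStore
redis-cli FT.INFO docs
# List all RediSearch indexes on the server
redis-cli FT._LIST
```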

Query docs

import { createClient } from "redis";
import { OpenAI } from "langchain/llms/openai";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { RetrievalQAChain } from "langchain/chains";
import { RedisVectorStore } from "langchain/vectorstores/redis";

const client = createClient({
  url: process.env.REDIS_URL ?? "redis://localhost:6379",
});
await client.connect();

const vectorStore = new RedisVectorStore(new OpenAIEmbeddings(), {
  redisClient: client,
  indexName: "docs",
});

/* Simple standalone search in the vector DB */
const simpleRes = await vectorStore.similaritySearch("redis", 1);
console.log(simpleRes);
/*
[
  Document {
    pageContent: "redis is fast",
    metadata: { foo: "bar" }
  }
]
*/

/* Search in the vector DB using filters */
const filterRes = await vectorStore.similaritySearch("redis", 3, ["qux"]);
console.log(filterRes);
/*
[
  Document {
    pageContent: "consectetur adipiscing elit",
    metadata: { baz: "qux" },
  },
  Document {
    pageContent: "lorem ipsum dolor sit amet",
    metadata: { baz: "qux" },
  }
]
*/

/* Usage as part of a chain */
const model = new OpenAI();
const chain = RetrievalQAChain.fromLLM(model, vectorStore.asRetriever(1), {
  returnSourceDocuments: true,
});
const chainRes = await chain.call({ query: "What did the fox do?" });
console.log(chainRes);
/*
{
  text: " The fox jumped over the lazy dog.",
  sourceDocuments: [
    Document {
      pageContent: "the quick brown fox jumped over the lazy dog",
      metadata: [Object]
    }
  ]
}
*/

await client.disconnect();
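When you no longer need an index, the vector store also exposes cleanup helpers. A minimal sketch, assuming the `delete({ deleteAll: true })` and `dropIndex()` methods of the LangChain.js `RedisVectorStore` (verify these names against your installed version):

```typescript
import { createClient } from "redis";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { RedisVectorStore } from "langchain/vectorstores/redis";

const client = createClient({
  url: process.env.REDIS_URL ?? "redis://localhost:6379",
});
await client.connect();

const vectorStore = new RedisVectorStore(new OpenAIEmbeddings(), {
  redisClient: client,
  indexName: "docs",
});

// Remove the stored document keys, then drop the index itself.
await vectorStore.delete({ deleteAll: true });
await vectorStore.dropIndex();

await client.disconnect();
```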