BufferMemory issue with ConversationalRetrievalQAChain in JavaScript/Node.js


I am using LangChain in Node.js and following the official documentation to save the conversation context using ConversationalRetrievalQAChain and BufferMemory, but I am not able to pass the memory object to ConversationalRetrievalQAChain.fromLLM as per the documentation here: https://js.langchain.com/docs/modules/chains/popular/chat_vector_db/.

    import { OpenAI } from "langchain/llms/openai";
    import { OpenAIEmbeddings } from "langchain/embeddings/openai";
    import { PineconeStore } from "langchain/vectorstores/pinecone";
    import { ConversationalRetrievalQAChain } from "langchain/chains";
    import { BufferMemory } from "langchain/memory";
    import { Document } from "langchain/document";

    // `client`, `indexName` and `question` come from the surrounding handler.
    const encoder = new TextEncoder();
    const stream = new TransformStream();
    const writer = stream.writable.getWriter();
    console.log("Querying Pinecone vector store...");
    const index = client.Index(indexName);
    const queryEmbedding = await new OpenAIEmbeddings().embedQuery(question);

    let queryResponse = await index.query({
      queryRequest: {
        topK: 10,
        vector: queryEmbedding,
        includeMetadata: true,
        includeValues: true,
      },
    });

    if (queryResponse.matches.length) {
      const model = new OpenAI({
        modelName: "gpt-3.5-turbo",
        streaming: true,
        temperature: 0,
        openAIApiKey: process.env.OPENAI_API_KEY || "",
        callbacks: [
          {
            async handleLLMNewToken(token) {
              await writer.ready;
              await writer.write(encoder.encode(`${token}`));
            },
            async handleLLMEnd() {
              await writer.ready;
              await writer.close();
            },
          },
        ],
      });
      const concatenatedPageContent = queryResponse.matches
        .map((match) => match.metadata.pageContent)
        .join(" ");
      const vectorStore = await PineconeStore.fromDocuments(
        [new Document({ pageContent: concatenatedPageContent })],
        new OpenAIEmbeddings(),
        {
          pineconeIndex: index,
        }
      );
      const chain = ConversationalRetrievalQAChain.fromLLM(
        model,
        vectorStore.asRetriever(),
        {
          memory: new BufferMemory({
            memoryKey: "chat_history", // Must be set to "chat_history"
          }),
        }
      );

      /* Ask it a question (note: this shadows the outer `question` used for the embedding above) */
      const question = "What did the president say about Justice Breyer?";
      const res = await chain.call({ question });
      console.log(res);
      /* Ask it a follow up question */
      const followUpRes = await chain.call({
        question: "Was that nice?",
      });
      console.log(followUpRes);
    }

Error: Argument of type '{ memory: BufferMemory; }' is not assignable to parameter of type '{ outputKey?: string | undefined; returnSourceDocuments?: boolean | undefined; questionGeneratorTemplate?: string | undefined; qaTemplate?: string | undefined; } & Omit<ConversationalRetrievalQAChainInput, "retriever" | ... 1 more ... | "questionGeneratorChain">'. Object literal may only specify known properties, and 'memory' does not exist in type '{ outputKey?: string | undefined; returnSourceDocuments?: boolean | undefined; questionGeneratorTemplate?: string | undefined; qaTemplate?: string | undefined; } & Omit<ConversationalRetrievalQAChainInput, "retriever" | ... 1 more ... | "questionGeneratorChain">'.ts(2345)

Goal: to preserve the conversation context across follow-up questions about the PDF document.

I am open to other suggestions for my use case. There is very little material on using LangChain with Node.js.


There are 2 answers below.

Answer 1

I had "langchain": "^0.0.84", and I was getting the same error

update the package to the latest version, currently "langchain": "^0.0.127" and it works
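
After the upgrade, the options object with memory type-checks. A minimal sketch, assuming langchain ^0.0.127 or newer, where model and vectorStore are the ones already built in the question's code:

    import { ConversationalRetrievalQAChain } from "langchain/chains";
    import { BufferMemory } from "langchain/memory";

    // In newer versions the options parameter of fromLLM accepts `memory`,
    // so the ts(2345) error from the question no longer appears.
    const chain = ConversationalRetrievalQAChain.fromLLM(
      model,
      vectorStore.asRetriever(),
      {
        memory: new BufferMemory({
          memoryKey: "chat_history", // key the chain reads history from
        }),
      }
    );

    const res = await chain.call({
      question: "What did the president say about Justice Breyer?",
    });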

Answer 2

I had the same, or at least a very similar, problem even with a newer langchain version.

The error was:

/langchain/dist/chains/conversational_retrieval_chain.js:116
            throw new Error(`Question key ${this.inputKey} not found.`);

My problem was solved by passing the question explicitly under the literal question key when calling the chain.

I suggest changing the line from

const res = await chain.call({ question });

to

const res = await chain.call({ question: question });

Hope it helps.
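
If your input variable is not literally named question, another option is to pin the memory's input key so it matches whatever key you pass to chain.call(). A hedged sketch, assuming your langchain version's BufferMemory supports the inputKey option (chain and userQuestion are placeholders for your own objects):

    import { BufferMemory } from "langchain/memory";

    // Tell the memory explicitly which call() input holds the user's question,
    // so `Question key ... not found` is not thrown for a differently named key.
    const memory = new BufferMemory({
      memoryKey: "chat_history",
      inputKey: "question", // must match the key used in chain.call({ question: ... })
    });

    const res = await chain.call({ question: userQuestion });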