Langchain - Chat History with Embedded Data not working


I'm trying to embed some data for gpt-4-1106-preview and chat over it — think of something like ChatPDF. My problem is that ChatGPT answers each question but doesn't remember our chat history.

Here is the code; I'm probably making a mistake somewhere, but I couldn't find it.

Console command

!pip install langchain openai cohere tiktoken kaleido python-multipart fastapi uvicorn chromadb

The code

import os
import sys
from langchain.document_loaders import TextLoader
from langchain.indexes import VectorstoreIndexCreator
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory

os.environ["OPENAI_API_KEY"] = "sk-XYZ"

loader = TextLoader("./data.txt")
index = VectorstoreIndexCreator().from_loaders([loader])
docs = loader.load()

print(docs)

memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
query = "My name is Fobus"
print(index.query(query, llm=ChatOpenAI(model="gpt-4-1106-preview", temperature=0.7), verbose=True, memory=memory))

query = "What is my name?"
print(index.query(query, llm=ChatOpenAI(model="gpt-4-1106-preview", temperature=0.7), verbose=True, memory=memory))

print(memory)

and the output

[Document(page_content='data.txt content comes here', metadata={'source': './data.txt'})]


> Entering new RetrievalQA chain...

> Finished chain.
Hello Fobus, how can I help you today? If you have any questions or need assistance with something, feel free to ask.


> Entering new RetrievalQA chain...

> Finished chain.
I don't know your name. My capabilities don't include access to personal data unless it's shared with me in the course of our conversation, and even then, I don't retain that information. If you'd like to tell me your name, I can address you by it for the duration of our interaction.
chat_memory=ChatMessageHistory(messages=[HumanMessage(content='My name is Fobus'), AIMessage(content='Hello Fobus, how can I help you today? If you have any questions or need assistance with something, feel free to ask.'), HumanMessage(content='What is my name?'), AIMessage(content="I don't know your name. My capabilities don't include access to personal data unless it's shared with me in the course of our conversation, and even then, I don't retain that information. If you'd like to tell me your name, I can address you by it for the duration of our interaction.")]) return_messages=True memory_key='chat_history'

As you can see, in the first response it addresses me by name, but by the second message it has forgotten it.

We want more honest artificial intelligence lol...

There is 1 answer below.

It seems you have created two separate conversations.

Each call to index.query(query, llm=ChatOpenAI(model="gpt-4-1106-preview", temperature=0.7), verbose=True, memory=memory) instantiates a brand-new RetrievalQA chain. Your memory object does record the messages (as your print(memory) output shows), but the chain's prompt never feeds that history back to the model — so you effectively asked two independent bots.

Try this code:

from langchain.llms import OpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

llm = OpenAI(temperature=0)

# One chain, one memory object, reused across every call
conversation = ConversationChain(
    llm=llm,
    verbose=True,
    memory=ConversationBufferMemory()
)

print(conversation.predict(input="My name is Fobus"))
print(conversation.predict(input="What is my name?"))  # history now includes the first exchange