Langchain is not storing the first question that I ask


I don't know why, but LangChain's ConversationalRetrievalChain is not remembering the first question that I ask.

from langchain.chains import ConversationalRetrievalChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts import SystemMessagePromptTemplate, HumanMessagePromptTemplate, ChatPromptTemplate

# Define the system message template
system_template = """You are an AI assistant named Aidan. Your job is to recommend the best clinicians based
on the symptoms and locations. Ask the user about their location preference, then recommend clinicians based
on their symptoms and preferences.
----------------
Context:
{context}

Chat History:
{chat_history}

QUESTION:
{question}

ANSWER:
"""

# Create the chat prompt templates
messages = [
    SystemMessagePromptTemplate.from_template(system_template),
    HumanMessagePromptTemplate.from_template("{question}"),
]
qa_prompt = ChatPromptTemplate.from_messages(messages)
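As a quick sanity check (plain Python, no LangChain required), you can list the placeholders the system template expects; the chain must be able to supply each of them, including chat_history:

```python
import string

# The same template text as above, abbreviated to its placeholders.
system_template = """Context:
{context}

Chat History:
{chat_history}

QUESTION:
{question}

ANSWER:
"""

# Collect the named fields that str.format (and LangChain's f-string
# templates) would need to fill in.
fields = {name for _, name, _, _ in string.Formatter().parse(system_template) if name}
print(sorted(fields))  # ['chat_history', 'context', 'question']
```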



chain = ConversationalRetrievalChain.from_llm(
    llm=ChatOpenAI(temperature=1, model_name="gpt-3.5-turbo"),
    combine_docs_chain_kwargs={"prompt": qa_prompt},
    return_source_documents=True,
    retriever=vectorstore.as_retriever(),
)


chat_history = []
result = chain({"question": "I have concussion.", "chat_history": chat_history})
print(result)


chat_history = [("I have concussion", result["answer"])]
result = chain({"question": "Cordova Bay.", "chat_history": chat_history})
print(result)


When I execute the chain with "I have concussion", the answer I get is good. But when I enter Cordova Bay (the location), it forgets that I have already entered the symptoms.
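Under the hood, the chain does not hand your qa_prompt the raw turns first; it flattens the (human, ai) tuples and asks the LLM to condense them with the follow-up into one standalone question, and it is that question the retriever sees. A simplified pure-Python sketch of that step (the function name and prompt wording are approximations of LangChain's internals, not its public API):

```python
# Approximation of how ConversationalRetrievalChain serializes the chat
# history before condensing it with the new question (not a public API).
def format_chat_history(chat_history):
    lines = []
    for human, ai in chat_history:
        lines.append(f"Human: {human}")
        lines.append(f"Assistant: {ai}")
    return "\n".join(lines)

chat_history = [("I have concussion", "Sorry to hear that! Which area do you prefer?")]
condense_input = (
    "Given the following conversation and a follow up question, rephrase the "
    "follow up question to be a standalone question.\n\n"
    f"Chat History:\n{format_chat_history(chat_history)}\n"
    "Follow Up Input: cordova bay.\n"
    "Standalone question:"
)
print(condense_input)
```

If the LLM's rephrasing of "cordova bay." drops the concussion detail, everything downstream only sees the location.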

1 Answer
The class ConversationalRetrievalChain uses the chat history and the new question to create a "standalone question", and that rephrased question is what actually reaches the retriever and the answer prompt; if the rephrasing drops your earlier details, the chain appears to forget them. You can pass a custom condense_question_prompt to from_llm to control that step. If you just want to chat with GPT and have the history tracked for you, you can use ConversationBufferMemory.

Just like below:

from langchain.memory import ConversationBufferMemory

# Note: for the chain to manage history automatically, this memory object
# must be passed to from_llm as memory=memory (output_key="answer" is then
# required, because return_source_documents=True adds extra output keys).
# The example below instead passes chat_history in manually on each call.
memory = ConversationBufferMemory(
    memory_key="chat_history", return_messages=True, output_key="answer"
)
chain = ConversationalRetrievalChain.from_llm(
    llm=ChatOpenAI(temperature=1, model_name="gpt-3.5-turbo"),
    combine_docs_chain_kwargs={"prompt": qa_prompt},
    return_source_documents=True,
    retriever=vectorstore.as_retriever(),
)

chat_history = []
result = chain(
    {
        "question": """ I am sunny.
""",
        "chat_history": chat_history,
    }
)
print(result)

# Output
# {'question': ' I am sunny.\n', 'chat_history': [], 'answer': 'Hello 
# Sunny! How can I assist you today?', 'source_documents': 
# [Document(page_content='')]}

chat_history = [("I am sunny", result["answer"])]


result = chain(
    {
        "question": """ who am I.
""",
        "chat_history": chat_history,
    }
)
print(result)


# Output
# {'question': ' who am I.\n', 'chat_history': [('I am sunny', 'Hello 
# Sunny! How can I assist you today?')], 'answer': 'You are Sunny.', 
# 'source_documents': [Document(page_content='')]}
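For completeness, here is a toy sketch (not LangChain's implementation) of what ConversationBufferMemory does for you: it records each turn and replays the accumulated history on the next call, so you stop rebuilding chat_history by hand:

```python
# Toy illustration of the ConversationBufferMemory idea (a sketch, not
# LangChain's real class).
class ToyBufferMemory:
    def __init__(self):
        self.turns = []

    def save_context(self, question, answer):
        # Record one (human, ai) turn, as the chain would after each run.
        self.turns.append((question, answer))

    def load_history(self):
        # Replay everything recorded so far for the next call.
        return list(self.turns)

memory = ToyBufferMemory()
memory.save_context("I am sunny", "Hello Sunny! How can I assist you today?")
print(memory.load_history())
# [('I am sunny', 'Hello Sunny! How can I assist you today?')]
```

With the real class, passing memory=memory to from_llm makes the chain do the equivalent save/load around every run automatically.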