I don't know why, but LangChain's ConversationalRetrievalChain is not remembering the first question that I ask.
from langchain.prompts import SystemMessagePromptTemplate, HumanMessagePromptTemplate, ChatPromptTemplate
from langchain.chains import ConversationalRetrievalChain
from langchain.chat_models import ChatOpenAI
# Define the system message template
system_template = """You are an AI assistant named Aidan. Your job is to recommend the best clinicians based
on the user's symptoms and location. Ask the user about their location preference, then recommend clinicians based on their symptoms and preferences.
----------------
Context
{context}
Chat History
{chat_history}
QUESTION:
{question}
ANSWER:
"""
# Create the chat prompt templates
messages = [
    SystemMessagePromptTemplate.from_template(system_template),
    HumanMessagePromptTemplate.from_template("{question}"),
]
qa_prompt = ChatPromptTemplate.from_messages(messages)

chain = ConversationalRetrievalChain.from_llm(
    llm=ChatOpenAI(temperature=1, model_name="gpt-3.5-turbo"),
    retriever=vectorstore.as_retriever(),
    combine_docs_chain_kwargs={"prompt": qa_prompt},
    return_source_documents=True,
)
chat_history = []
result = chain({"question": "I have concussion.", "chat_history": chat_history})
print(result)

chat_history = [("I have concussion", result["answer"])]
result = chain({"question": "cordova bay.", "chat_history": chat_history})
print(result)
When I execute the chain with "I have concussion", the answer I get is good. But when I enter "cordova bay" (the location), it forgets that I have already entered the symptoms.
The class ConversationalRetrievalChain uses the chat history and the new question to create a "standalone question". If you want to chat with GPT, you can use ConversationBufferMemory, just like below: