How to send metadata to the LLM together with docs and context?


How can I include a document's metadata from the vector DB (Qdrant in my case) and send it together with the context to the LLM (OpenAI in my case)? For example, I store anonymized resumes in my vector DB and each one has metadata with a source (the name of the file). How can I include it and ask OpenAI to answer me with this metadata? Example payload (a small part of the page content is preserved):

{"metadata":{"page":0,"source":"C:\Users\username\Desktop\resumes_dataset\50328713.pdf"},"page_content":"Predicted Performance of Multilayered \nMetal-Mesh Screens. SPE Drilling & Completion. SPE-178955-PA. https://blablabla.org/10.2118/178955-PA."}

Example call to the LLM:

    # imports assume the classic LangChain package layout
    from langchain.chat_models import ChatOpenAI
    from langchain.chains import RetrievalQA

    def run_llm(query: str):
        chat = ChatOpenAI(verbose=True, temperature=0)
        # qdrant is the already-initialized Qdrant vector store
        qa = RetrievalQA.from_chain_type(llm=chat, chain_type="stuff", retriever=qdrant.as_retriever(),
                                         return_source_documents=True)
        return qa({"query": query})


    print(run_llm(query="Which unique resume has the longest working experience as accountant?"))
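
Since return_source_documents=True is set, the call returns a dict carrying both the answer and the retrieved documents, so the file paths can at least be read back in Python after the call. A small sketch (result and source_documents are LangChain's default output keys):

    result = run_llm(query="Which unique resume has the longest working experience as accountant?")
    print(result["result"])  # the model's answer
    for doc in result["source_documents"]:
        # each retrieved Document keeps the metadata stored in Qdrant
        print(doc.metadata.get("source"), doc.metadata.get("page"))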

It does a great job finding the correct answer, but I can't make it answer with the document name; only by including return_source_documents=True do I see where it found the answer.
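
What I have in mind is something like the sketch below: pass a custom document_prompt through chain_type_kwargs so each retrieved chunk is rendered with its source metadata before being stuffed into the prompt, letting the model cite the file name itself. This assumes chain_type_kwargs is forwarded to the underlying stuff chain and that the prompt variables match the metadata keys; I haven't verified it against my setup:

    from langchain.prompts import PromptTemplate

    # assumption: the stuff chain accepts a document_prompt that controls how
    # each retrieved document is rendered into the context
    document_prompt = PromptTemplate(
        input_variables=["page_content", "source"],
        template="Source: {source}\nContent: {page_content}",
    )

    def run_llm_with_sources(query: str):
        chat = ChatOpenAI(verbose=True, temperature=0)
        qa = RetrievalQA.from_chain_type(
            llm=chat,
            chain_type="stuff",
            retriever=qdrant.as_retriever(),
            return_source_documents=True,
            chain_type_kwargs={"document_prompt": document_prompt},
        )
        return qa({"query": query})

The idea is that the answer to the question above could then include the resume's file name, since the path would be part of the context the model reads.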
