How to support OpenAI's Chat Completions API format in LlamaIndex?


I'm currently using LlamaIndex for a project, and I'm trying to find a way to support the message-list prompt format used by OpenAI's Chat Completions API within LlamaIndex's chat engine.

The OpenAI API uses a list of messages for its prompts, where each message has a role ('system', 'user', or 'assistant') and content (the text of the message). Here is an example:

{
  "model": "gpt-3.5-turbo",
  "messages": [{"role": "user", "content": "Hello!"}]
}
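
For context, here is the full call I'm trying to mirror, written with the pre-1.0 openai Python package (the system message here is just a placeholder):

import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        # Roles let you separate instructions from user input.
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
print(response["choices"][0]["message"]["content"])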

However, when I use the CondenseQuestionChatEngine.from_defaults function in LlamaIndex (as per the documentation here: https://gpt-index.readthedocs.io/en/latest/how_to/chat_engine/usage_pattern.html), the custom_prompt passed as condense_question_prompt has to be a single template string, and there doesn't seem to be any way to supply a list of role-tagged messages:

chat_engine = CondenseQuestionChatEngine.from_defaults(
    query_engine=query_engine, 
    condense_question_prompt=custom_prompt,
    chat_history=custom_chat_history,
    verbose=True
)
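
For reference, this is roughly how custom_prompt and custom_chat_history are defined on that documentation page (the llama_index.prompts import path may differ across versions). The template is one flat string with {chat_history} and {question} placeholders, with no slot for role-tagged messages:

from llama_index.prompts import Prompt

# The condense-question template is a single string; roles such as
# "system" have nowhere to go here.
custom_prompt = Prompt("""\
Given a conversation (between Human and Assistant) and a follow up message from Human, \
rewrite the message to be a standalone question that captures all relevant context \
from the conversation.

<Chat History>
{chat_history}

<Follow Up Message>
{question}

<Standalone question>
""")

# In this version of the docs, chat history is a list of
# (human_message, ai_message) tuples rather than role-tagged messages.
custom_chat_history = [
    (
        "Hello assistant, we are having an insightful discussion about Paul Graham today.",
        "Okay, sounds good.",
    )
]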

This limitation is affecting my ability to have more complex interactions with the model, especially for conversational AI applications.

Does anyone have experience with this issue, or can anyone provide some guidance on how to support OpenAI's Chat Completions API format in LlamaIndex?

Any help would be greatly appreciated.

1 Answer


Until this feature is integrated into the chat engines, you can work around it at the query-engine level: wrap your prompt text in a Prompt class and pass it in through the text_qa_template parameter, and supply the chat history separately via chat_history.
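
Here is a minimal sketch of that workaround. It assumes the legacy Prompt class and the text_qa_template keyword on as_query_engine, both present in the 0.6/0.7-era releases the linked docs describe; the data loading is just an example setup:

from llama_index import SimpleDirectoryReader, VectorStoreIndex
from llama_index.chat_engine import CondenseQuestionChatEngine
from llama_index.prompts import Prompt

# Build an index over local documents (example setup).
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Fold what would be the "system" message into the QA template itself,
# since text_qa_template takes a single string, not a message list.
# {context_str} and {query_str} are the variables the template must expose.
qa_template = Prompt(
    "You are a helpful assistant.\n"
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the question: {query_str}\n"
)

query_engine = index.as_query_engine(text_qa_template=qa_template)

# Prior turns still go in through chat_history, here as
# (user_message, assistant_message) tuples.
chat_engine = CondenseQuestionChatEngine.from_defaults(
    query_engine=query_engine,
    chat_history=[("Hello!", "Hi! How can I help today?")],
    verbose=True,
)

print(chat_engine.chat("What do the documents say about pricing?"))

This doesn't give you true role-tagged messages end to end, but folding the instructions into the template approximates a system prompt until the chat engines accept the Chat Completions message format directly.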