How to configure `gr.ChatInterface` to return multiple outputs (response & source documents)?


I have this `gr.ChatInterface` that I want to adjust so that it also shows the user the source documents that were used during retrieval (in other words, adding another output):

import gradio as gr

def generate_response(message, history):   
    print(f"\n\n[message] {message}")
    # call LLM & generate response
    return response.answer


demo = gr.ChatInterface(
    fn=generate_response,
    title="RAG app for Q&A",
    description="Ask any question about Stuff",
).queue(default_concurrency_limit=2, max_size=10)

demo.launch(share=True)
    

I already tried passing `outputs`, but it is not supported by `gr.ChatInterface`:

Traceback (most recent call last):
  File "/workspaces/aider_repos/app.py", line 20, in <module>
    demo = gr.ChatInterface(
           ^^^^^^^^^^^^^^^^^
TypeError: ChatInterface.__init__() got an unexpected keyword argument 'outputs'


There is 1 answer below.

Answered by Umut:

`gr.ChatInterface` in Gradio does not support multiple outputs out of the box. However, we can work around this by combining the response and the source documents into a single output string.

Here's how we can modify the code to return both the response and the source documents:

import gradio as gr

def generate_response(message, history):
    print(f"\n\n[message] {message}")
    # Call LLM & generate response (placeholder values shown here)
    response_text = "This is the generated response."
    source_documents = ["Source document 1", "Source document 2"]
    return response_text, source_documents

def chatbot(input_message, history):
    response, sources = generate_response(input_message, history)
    # Combine the answer and its sources into the single string ChatInterface expects
    combined_output = f"Response: {response}\n\nSource documents:\n\n" + "\n".join(sources)
    return combined_output

demo = gr.ChatInterface(
    fn=chatbot,
    title="RAG app for Q&A",
    description="Ask any question about Stuff",
).queue(default_concurrency_limit=2, max_size=10)

demo.launch(share=True)
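
If you would rather keep the sources in a separate component instead of folding them into the chat message, a common alternative is to drop down to `gr.Blocks`, where a single event handler can update several outputs. The sketch below is only an illustration under that assumption: it reuses the placeholder `generate_response` from above and shows the sources in a `gr.Textbox`; swap in your actual RAG pipeline.

import gradio as gr

def generate_response(message, history):
    # Placeholder answer and sources; replace with your actual LLM / retrieval call
    return "This is the generated response.", ["Source document 1", "Source document 2"]

def respond(message, chat_history):
    answer, sources = generate_response(message, chat_history)
    chat_history = chat_history + [(message, answer)]
    # Clear the input box, update the chat, and show the sources separately
    return "", chat_history, "\n".join(sources)

with gr.Blocks(title="RAG app for Q&A") as demo:
    chatbot = gr.Chatbot()
    sources_box = gr.Textbox(label="Source documents", interactive=False)
    msg = gr.Textbox(label="Ask any question about Stuff")
    msg.submit(respond, inputs=[msg, chatbot], outputs=[msg, chatbot, sources_box])

demo.queue(default_concurrency_limit=2, max_size=10)
demo.launch(share=True)

This keeps the chat history and the source documents as genuinely separate outputs, at the cost of wiring the submit event yourself instead of relying on `gr.ChatInterface`.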