How to submit user prompt manually using Vercel AI SDK


I want to submit a JSON string of a transcript stored in localStorage to ChatGPT for it to process.

However, I do not know how to use the Vercel AI SDK to submit a prompt that is not from a form.

app/grading/page.tsx

"use client"
import React, { useEffect, useState } from 'react'
import { useRouter } from 'next/navigation'
import toast, { Toaster } from 'react-hot-toast'


export default function Grading() {

  const router = useRouter();
  const rawData = localStorage.getItem('transcript')
  const [responseData, setResponseData] = useState<string | undefined>(undefined);
  
  useEffect(() => {
    
    if (!rawData) {
      router.push('/error/no-data-available')
      return
    }

    const fetchData = async () => {
      try {
        const response = await fetch('/api/grade', {
          method: 'POST',
          headers: {
            'Content-Type': 'application/json',
          },
          body: JSON.stringify({ messages: JSON.parse(rawData) }),
        });

        if (response.ok) {
          console.log(response)
          const data = await response.json();
          setResponseData(data);
        } else {
          console.log(response)
        }

      } catch (error) {
        console.error('An error occurred:', error)
      }
    };

    fetchData();
    
  }, [])


  return (
    <>
      <Toaster/>
      <h1>Grading your conversation...</h1>
      
      {responseData ? (
        <div>
          {responseData}

        </div>
      ): (
        <div>Loading...</div>
      )}
      
    
    </>
  )
}


api/grade/route.ts

// ./app/api/grade/route.ts
import OpenAI from 'openai';
import { OpenAIStream, StreamingTextResponse } from 'ai';
import { prompts } from '../../../static/prompts'
const { grading_prompt } = prompts

// Create an OpenAI API client (that's edge friendly!)
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY || '',
});

// IMPORTANT! Set the runtime to edge
export const runtime = 'edge';

export async function POST(req: Request) {
  
  // Extract the `prompt` from the body of the request
  const { messages } = await req.json();
  
  const messageWithSystem = [
    {role: 'system', content: grading_prompt},
    ...messages // Add user and assistant messages after the system message
  ]

  // Ask OpenAI for a streaming chat completion given the prompt
  const response = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages: messageWithSystem,
  });

  // Convert the response into a friendly text-stream
  const stream = OpenAIStream(response);
  // Respond with the stream
  return new StreamingTextResponse(stream);
}


I have tried using the useChat() hook, but could not find a way to call handleSubmit() manually, since it expects a form event.

I have also tried making a POST request directly, as in the page.tsx code above.

Of course I could fall back to the plain OpenAI API, but I was wondering if this is possible with the Vercel AI SDK.

Any help will be appreciated, thanks.

There are 2 solutions below.

Accepted answer:

You can use the useCompletion hook to handle submitting the user prompt manually for text completion:

"use client"
import React, { useEffect, useState } from 'react'
import { useRouter } from 'next/navigation'
import toast, { Toaster } from 'react-hot-toast'
import { useCompletion } from "ai/react";

export default function Grading() {
  const router = useRouter();
  const rawData = localStorage.getItem('transcript')

  const { complete, completion, isLoading } = useCompletion({
    api: "/api/grade",
    onResponse: (res) => {
      // trigger something when the response starts streaming in
      // e.g. if the user is rate limited, you can show a toast
      if (res.status === 429) {
        toast.error("You are being rate limited. Please try again later.");
      }
    },
    onFinish: () => {
      // do something with the completion result
      toast.success("Successfully generated completion!");
    },
  });
  
  useEffect(() => {
    if (!rawData) {
      router.push('/error/no-data-available')
      return
    }

    complete(rawData);
  }, [])


  return (
    <>
      <Toaster />
      <h1>Grading your conversation...</h1>
      <p>Current state: {isLoading ? "Generating..." : "Idle"}</p>
      {completion ? <div>{completion}</div> : <div>Loading...</div>}
    </>
  );
}

You still use a useEffect to trigger the completion, but you call the complete(rawData) helper provided by the useCompletion hook instead of writing your own fetch logic. The streamed result is exposed as the hook's completion value.
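One detail worth noting: by default, useCompletion POSTs a JSON body of the shape { prompt: string }, while the route in the question destructures messages from the body. Assuming the transcript in localStorage is a JSON-encoded array of chat messages, the route could recover them with a small adapter like the sketch below (promptToMessages is an illustrative name, not part of the SDK):

```typescript
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Hypothetical adapter: useCompletion sends { prompt: string } by default,
// so a route written for { messages } needs to parse the transcript back
// out of `prompt` before forwarding it to OpenAI.
function promptToMessages(body: { prompt: string }): ChatMessage[] {
  return JSON.parse(body.prompt) as ChatMessage[];
}

// Example: the transcript JSON the client loaded from localStorage
const messages = promptToMessages({
  prompt: '[{"role":"user","content":"Hello"}]',
});
console.log(messages[0].content); // "Hello"
```

In the route handler this would replace `const { messages } = await req.json()` with something like `const messages = promptToMessages(await req.json())`.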

Second answer:

It may be a little late, but the append() helper returned by the useChat hook allows what you describe.

See the ChatHelpers section of the Vercel AI SDK documentation.

It takes a Message | CreateMessage as the first argument and, optionally, ChatRequestOptions as the second.

You can build a simple message like so:

append({
  role: "user",
  content: "Message you want to send",
});
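To adapt this to the original question, the transcript loaded from localStorage can be wrapped into a message object before being passed to append(). A minimal sketch, assuming useChat is pointed at the /api/grade route (transcriptToMessage is an illustrative helper name, not part of the SDK):

```typescript
// Hypothetical helper: wrap a raw transcript string from localStorage into
// the { role, content } shape that useChat's append() accepts.
function transcriptToMessage(rawTranscript: string) {
  return {
    role: "user" as const,
    content: `Please grade this transcript:\n${rawTranscript}`,
  };
}

// Usage inside a client component (sketch):
//   const { append, messages } = useChat({ api: "/api/grade" });
//   useEffect(() => {
//     const rawData = localStorage.getItem("transcript");
//     if (rawData) append(transcriptToMessage(rawData));
//   }, []);
```

The streamed reply then shows up in the hook's messages array as an assistant message, so no manual fetch or state handling is needed.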