Vercel AI SDK: API resolved without sending a response for /api/chat, this may result in stalled requests


After updating OpenAI to 4.0+, I am unable to get any responses with the Vercel AI SDK.

I suspect it has something to do with the runtime being nodejs, but when I change it to edge, I get this odd error:

error - Error: The edge runtime does not support Node.js 'buffer' module.

which seems to come from OpenAIStream. For other reasons I cannot use the edge runtime anyway, so that error matters less.

Here is a very basic example of what does not work (nodejs runtime):


import { OpenAIStream, StreamingTextResponse } from 'ai'
import OpenAI from 'openai'

export const runtime = 'nodejs'

const openai = new OpenAI({
    apiKey: process.env.OPENAI_TOKEN
});

export default async function POST(req) {
    const { messages } = req.body  //Nodejs runtime

    //Add a system message to the start of the messages array
    messages.unshift({
        role: 'system',
        content: 'You are a chatbot...'
    })

    const res = await openai.chat.completions.create({
        model: 'gpt-3.5-turbo',
        messages,
        stream: true,
    })

    const stream = OpenAIStream(res);

    return new StreamingTextResponse(stream)
}

It throws no errors, but logs the following in the console:

API resolved without sending a response for /api/chat, this may result in stalled requests.

I have made sure that I am on OpenAI npm version 4 ("openai": "^4.0.0" in package.json), and I know the API key is valid.
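From what I can tell so far, the warning text points at the Pages Router contract: an API route there is invoked as handler(req, res) and is expected to respond through res, so a returned Response object such as StreamingTextResponse would simply be ignored and the request left hanging. Below is a minimal, untested sketch of the bridge I think is needed, using only Node built-ins (Readable.fromWeb is my assumption for converting the SDK's web stream; the handler shape in the comment is hypothetical):

```javascript
// Hypothetical Pages Router handler shape (untested):
//   export default async function handler(req, res) {
//     const response = await openai.chat.completions.create({ ...stream: true })
//     const stream = OpenAIStream(response)
//     Readable.fromWeb(stream).pipe(res)   // write through res, don't return
//   }
import { Readable } from 'node:stream';

// Stand-in for the web ReadableStream that OpenAIStream would return:
const webStream = new ReadableStream({
  start(controller) {
    controller.enqueue(new TextEncoder().encode('Hello, '));
    controller.enqueue(new TextEncoder().encode('world'));
    controller.close();
  },
});

// Bridge the web stream to a Node stream and collect its output,
// to show the conversion itself works:
let body = '';
for await (const chunk of Readable.fromWeb(webStream)) {
  body += chunk.toString();
}
console.log(body); // prints "Hello, world"
```

If I understand correctly, piping the bridged stream into res (instead of returning a Response) is what would make Next.js consider the request handled and silence the "API resolved without sending a response" warning.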
