I am banging my head against the wall with something that should be very simple: calling the OpenAI API and returning the stream from a Fastify server.
I have read the docs, and streaming looks super simple:
```js
fastify.get('/streams', async function (request, reply) {
  const fs = require('node:fs')
  const stream = fs.createReadStream('some-file', 'utf8')
  reply.header('Content-Type', 'application/octet-stream')
  return reply.send(stream)
})
```
However, when I try to hook this up to the OpenAI call:
```js
async function main() {
  const stream = await openai.chat.completions.create({
    model: 'gpt-4',
    messages: [{ role: 'user', content: 'Say this is a test' }],
    stream: true,
  });

  for await (const part of stream) {
    process.stdout.write(part.choices[0]?.delta?.content || '');
  }
}
```
...I just cannot get it to work. I have tried:

- passing the stream directly to `reply.send`
- writing the stream out in an async loop
- using the raw HTTP response in Fastify
- converting it to a `ReadableStream`

Nothing seems to work.
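For reference, the "converting to a `ReadableStream`" idea boiled down to the sketch below. `toTextStream`, `ChatChunk`, and the fake stream are my own illustration names, not SDK APIs; in the real route the async iterable is the SDK's completion stream and the resulting `Readable` goes to `reply.send`:

```typescript
import { Readable } from 'node:stream';

// Shape of the chunks the OpenAI SDK yields (trimmed to what we use here).
type ChatChunk = { choices: Array<{ delta?: { content?: string } }> };

// Wrap any async iterable of chat chunks in a Node Readable that emits
// just the text deltas. Fastify's reply.send can pipe a Node Readable,
// so in the route this becomes: return reply.send(toTextStream(stream)).
function toTextStream(chunks: AsyncIterable<ChatChunk>): Readable {
  async function* textDeltas() {
    for await (const part of chunks) {
      yield part.choices[0]?.delta?.content ?? '';
    }
  }
  return Readable.from(textDeltas());
}

// Quick local check with a fake stream standing in for the SDK's:
async function* fakeStream(): AsyncGenerator<ChatChunk> {
  yield { choices: [{ delta: { content: 'Say this ' } }] };
  yield { choices: [{ delta: { content: 'is a test' } }] };
}

async function demo() {
  let out = '';
  for await (const chunk of toTextStream(fakeStream())) out += String(chunk);
  console.log(out); // "Say this is a test"
}
demo();
```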
Here is my latest attempt:
```ts
app.get<{ Querystring: GetSearchParams }>('/stream', async (req, reply) => {
  try {
    const { q } = req.query;
    const stream = await openai.chat.completions.create({
      model: 'gpt-3.5-turbo',
      messages: [{ role: 'user', content: q }],
      stream: true,
    });

    reply.raw.writeHead(200, { 'Content-Type': 'text/plain' });
    for await (const part of stream) {
      console.log(part.choices[0]?.delta?.content || '');
      reply.raw.write(part.choices[0]?.delta?.content || '');
    }
    reply.raw.end();
  } catch (err) {
    // Only write the status line if nothing has been flushed yet,
    // otherwise writeHead throws ERR_HTTP_HEADERS_SENT.
    if (!reply.raw.headersSent) {
      reply.raw.writeHead(500, { 'Content-Type': 'text/plain' });
    }
    reply.raw.end('Ooops');
  }
});
```
It works: