Streaming OpenAI Chat Completions Using React and Node JS
- Published 29 May 2023
- In this tutorial we go over the code for an endpoint that allows us to stream back a response from the OpenAI chat completion API. We use a Node JS server and a React JS frontend.
Links
Blog post with code: www.typeblock.co/blog/openai-...
Twitter: / t0xsh
#chatgpt #openai #reactjs - Science & Technology
thanks, bro, I thought this was something complex and only Vercel could do this. Now I have also created my own streaming solution
Worked like magic. Thanks bro. Props to not over explaining and wasting time!
Glad you found it helpful, although the method described in this video might be a bit outdated, as it used the previous version of the OpenAI Node JS library. Let me know if you're interested in an updated video. Happy to record one.
@toshvelaga It would be good to make a video about the Assistants API with the new streaming feature
@toshvelaga Yes, please, I would appreciate an update on streaming with gpt-4-turbo-preview, integrated with Node and React!
subscribed, thank you
How do I use React to make a chat-like feature and stream in unique responses from the server, not public to anybody that fetches from the endpoint?
ty tosh
Very useful, thank you!
I just copy/pasted your code and it worked the first time, I think it's just a miracle hahaha
thanks!
subscribed. Thank you. Which package did you install to import Configuration and the OpenAI API from "openai"? I tried the same but was unable to import.
Hey when I recorded this video it was with an older version of the openai package. I've since upgraded and it makes streaming a lot easier. If I were you I would also use the latest version of the node package. You can stream on the backend with something that looks similar to this, while the frontend code remains the same:
// postData must include stream: true, otherwise create() returns a
// plain completion instead of an iterable stream
const stream = await openai.chat.completions.create({ ...postData, stream: true })
let final_response = ''
for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content || ''
  res.write(content)
  final_response += content
}
res.end()
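The comment above says the frontend code remains the same; for completeness, here is a minimal sketch of that frontend side. The `/api/chat` path and the function names are illustrative, not from the video:

```javascript
// Read a ReadableStream reader chunk-by-chunk, decoding bytes to text and
// passing each piece to a callback as it arrives.
async function consumeStream(reader, onText) {
  const decoder = new TextDecoder()
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    onText(decoder.decode(value, { stream: true }))
  }
}

// Call this from a React event handler; append each piece of text to state,
// e.g. onText = (text) => setOutput((prev) => prev + text)
async function streamChat(prompt, onText) {
  const res = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt }),
  })
  // res.body is a ReadableStream of Uint8Array chunks
  await consumeStream(res.body.getReader(), onText)
}
```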
@toshvelaga Thanks for the quick response. I tried the code you mentioned above, but it's giving the error TypeError: res.write is not a function. I'm using Next.js, creating a POST API route to chat with OpenAI.
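For what it's worth, in the Next.js App Router a route handler returns a Response object rather than writing to a Node `res`, which is why `res.write` is undefined there. A minimal sketch of one way around it: wrap the deltas in a web ReadableStream. The helper name and route path are illustrative, and the `openai` client is assumed to be configured elsewhere:

```javascript
// Turn any async iterable of text pieces into a web ReadableStream of bytes.
function textToReadableStream(iterable) {
  const encoder = new TextEncoder()
  return new ReadableStream({
    async start(controller) {
      for await (const text of iterable) {
        controller.enqueue(encoder.encode(text))
      }
      controller.close()
    },
  })
}

// Sketch of an App Router handler (export it from app/api/chat/route.js).
async function POST(req) {
  const { messages } = await req.json()
  // `openai` is assumed to be an already-configured v4 client
  const stream = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo', // model name is just an example
    messages,
    stream: true,
  })
  // Pull only the delta text out of each chunk
  async function* deltas() {
    for await (const chunk of stream) {
      yield chunk.choices[0]?.delta?.content || ''
    }
  }
  return new Response(textToReadableStream(deltas()), {
    headers: { 'Content-Type': 'text/plain' },
  })
}
```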
Thank you
My pleasure :)
thanks! will this solution work with a backend in node/express, a frontend in react? (no next.js)
Yes absolutely, you can think of Next js as being a wrapper built on top of React so it will work :)
How do you handle special tokens returned by openai for formatting in your react component?
It's Markdown; just use the react-markdown wrapper on your response. You can easily customize it to display what you need.
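A minimal sketch of that, assuming the react-markdown package (the component and prop names below are the usual react-markdown API; `response` is hypothetical state holding the streamed text):

```jsx
import ReactMarkdown from 'react-markdown'

// `response` is the streamed text accumulated in state; react-markdown
// re-renders the formatted output as more Markdown arrives.
function Answer({ response }) {
  return <ReactMarkdown>{response}</ReactMarkdown>
}
```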
I got ERROR: TypeError: Cannot read properties of undefined (reading 'on'). When I return the response I don't get a .data object, I just get this:
Stream {
iterator: [AsyncGeneratorFunction: iterator],
controller: AbortController { signal: AbortSignal { aborted: false } }
}
Hey Tosh, for some reason this code doesn't work with the Azure OpenAI API. Can you tell me why?
Where do you host the server code? I normally use Firebase functions, but it doesn't support streaming.
I host the server on Railway. It's super easy to set up. I've got a bunch of servers set up there, and it's practically free.
hey great video, can we do this directly on the frontend without using Node js?
hey, you will need a server to hit the OpenAI endpoint. That being said, you can use any language you want on the server, like Python, PHP, or Node js. The reason you need a server is that you don't want to expose your OpenAI key on the frontend, where someone could easily grab it and run up your bill.
@@toshvelaga assuming we had another way of hiding the key, how could we do this directly in React without using a backend?