This repository contains a reusable function that leverages async generators to stream real-time responses from OpenAI's ChatGPT API, word by word, in a Node.js environment. Instead of waiting for the entire response to be generated, this function enables an interactive experience where the response is streamed as it is produced.
- Node.js 18.14 or higher (the function relies on Node.js's native `fetch`)
- A valid API key for OpenAI's ChatGPT
First, make sure to set the `CHAT_GPT_API_KEY` constant in the script to your ChatGPT API key.
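For instance, the constant might sit near the top of the script; reading it from an environment variable (an assumption, not necessarily how the repo does it) keeps the secret out of version control:

```javascript
// Assumed placement: near the top of streamChatgptApi.js.
// Falls back to a placeholder so the script fails loudly at the API, not here.
const CHAT_GPT_API_KEY = process.env.CHAT_GPT_API_KEY ?? "<your-api-key>";
```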
Here's a quick example of how to use the function:
```javascript
import { streamChatgptApi } from "./streamChatgptApi.js";

for await (const responsePart of streamChatgptApi("Hello AI, I am a human.")) {
  if (responsePart.finish_reason) {
    console.log("finished execution with reason:", responsePart.finish_reason);
  } else {
    process.stdout.write(responsePart.delta.content);
  }
}
```
The function `streamChatgptApi` makes a POST request to ChatGPT with the `stream` parameter set to `true`. It then reads the incoming stream and yields the message parts as they arrive.
This makes it possible to process each chunk of the response in real-time, as illustrated in the usage example above.
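As a rough illustration of that mechanism, here is a minimal sketch of how such an async generator could look. It assumes the official Chat Completions endpoint, the `gpt-3.5-turbo` model, and a `CHAT_GPT_API_KEY` environment variable; the `parseSseLine` helper name is an invention for this sketch, not necessarily what the repo uses:

```javascript
// Assumption: key comes from the environment rather than a hard-coded constant.
const CHAT_GPT_API_KEY = process.env.CHAT_GPT_API_KEY;

// Parse one SSE line. Returns the first choice object (delta + finish_reason),
// or null for non-data lines and the final "[DONE]" terminator.
function parseSseLine(line) {
  if (!line.startsWith("data: ")) return null;
  const payload = line.slice("data: ".length);
  if (payload === "[DONE]") return null; // OpenAI's end-of-stream marker
  return JSON.parse(payload).choices[0];
}

// Sketch of the streaming generator: POST with stream: true, then yield
// message parts as each server-sent event arrives.
async function* streamChatgptApi(prompt) {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${CHAT_GPT_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });

  const decoder = new TextDecoder();
  let buffered = "";
  for await (const bytes of response.body) {
    buffered += decoder.decode(bytes, { stream: true });
    // Network chunks may split an SSE line; keep the trailing partial line.
    const lines = buffered.split("\n");
    buffered = lines.pop();
    for (const line of lines) {
      const part = parseSseLine(line);
      if (part) yield part;
    }
  }
}
```

The buffering step matters: a single network chunk is not guaranteed to end on an event boundary, so the incomplete tail is held back until the next chunk completes it.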
For a deep dive into the intricacies of this approach, including how server-sent events work and the significance of streaming via POST, check out my detailed blog post:
Stream Real-time Feedback with ChatGPT: SSE via Fetch in Node.js