Streaming example for the updated OpenAI API (#7508)

* Changes for updated OpenAI api

* Update guides/04_chatbots/01_creating-a-chatbot-fast.md

---------

Co-authored-by: Abubakar Abid <abubakar@huggingface.co>
Akshunn Trivedi 2024-02-23 00:08:29 +08:00 committed by GitHub
parent 33f68cb6c2
commit 8f050eedbc

@@ -198,10 +198,11 @@ gr.ChatInterface(predict).launch()
 Of course, we could also use the `openai` library directly. Here is a similar example, but this time with streaming results as well:
 
 ```python
-import openai
+from openai import OpenAI
 import gradio as gr
 
-openai.api_key = "sk-..." # Replace with your key
+api_key = "sk-..." # Replace with your key
+client = OpenAI(api_key=api_key)
 
 def predict(message, history):
     history_openai_format = []
@@ -210,17 +211,15 @@ def predict(message, history):
         history_openai_format.append({"role": "assistant", "content":assistant})
     history_openai_format.append({"role": "user", "content": message})
 
-    response = openai.ChatCompletion.create(
-        model='gpt-3.5-turbo',
+    response = client.chat.completions.create(model='gpt-3.5-turbo',
         messages= history_openai_format,
         temperature=1.0,
-        stream=True
-    )
+        stream=True)
 
     partial_message = ""
     for chunk in response:
-        if len(chunk['choices'][0]['delta']) != 0:
-            partial_message = partial_message + chunk['choices'][0]['delta']['content']
+        if chunk.choices[0].delta.content is not None:
+            partial_message = partial_message + chunk.choices[0].delta.content
             yield partial_message
 
 gr.ChatInterface(predict).launch()
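
For reference, below is a sketch of how the full streaming example reads once both hunks are applied, assuming the guide lines outside these hunks (the history-conversion loop and the final launch call) are unchanged from the context shown above. It targets the `openai` v1 client (`from openai import OpenAI`) together with `gr.ChatInterface`, which accepts a generator function and re-renders the reply on every `yield`. The comments are added here for explanation and are not part of the guide.

```python
from openai import OpenAI
import gradio as gr

api_key = "sk-..."  # Replace with your key
client = OpenAI(api_key=api_key)

def predict(message, history):
    # Convert gr.ChatInterface's (user, assistant) tuple history into the
    # message list expected by the chat completions endpoint.
    history_openai_format = []
    for human, assistant in history:
        history_openai_format.append({"role": "user", "content": human})
        history_openai_format.append({"role": "assistant", "content": assistant})
    history_openai_format.append({"role": "user", "content": message})

    # stream=True makes the client return an iterator of chunks instead of a
    # single completion object.
    response = client.chat.completions.create(
        model='gpt-3.5-turbo',
        messages=history_openai_format,
        temperature=1.0,
        stream=True,
    )

    # Accumulate the deltas and yield the growing partial message so the chat
    # window updates as tokens arrive.
    partial_message = ""
    for chunk in response:
        if chunk.choices[0].delta.content is not None:
            partial_message = partial_message + chunk.choices[0].delta.content
            yield partial_message

gr.ChatInterface(predict).launch()
```

Because `predict` yields the accumulated `partial_message` rather than each delta on its own, the chat bubble always shows the full reply so far, which is what produces the token-by-token streaming effect.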