Gradio Demo: llm_sambanova
In [ ]:
!pip install -q gradio "openai>=1.0.0"
In [ ]:
# This is a simple general-purpose chatbot built on top of the SambaNova API.
# Before running this, make sure you have exported your SambaNova API key as an environment variable:
# export SAMBANOVA_API_KEY="your-sambanova-api-key"

import os

import gradio as gr
from openai import OpenAI

api_key = os.getenv("SAMBANOVA_API_KEY")

# SambaNova exposes an OpenAI-compatible endpoint, so the standard OpenAI client works here.
client = OpenAI(
    base_url="https://api.sambanova.ai/v1/",
    api_key=api_key,
)


def predict(message, history):
    # Append the new user message to the OpenAI-style message history.
    history.append({"role": "user", "content": message})
    stream = client.chat.completions.create(
        messages=history,
        model="Meta-Llama-3.1-70B-Instruct-8k",
        stream=True,
    )
    # Accumulate streamed tokens and yield the partial reply so the chat UI updates live.
    chunks = []
    for chunk in stream:
        chunks.append(chunk.choices[0].delta.content or "")
        yield "".join(chunks)


# type="messages" passes history as a list of {"role": ..., "content": ...} dicts,
# which matches the format expected by the chat completions API.
demo = gr.ChatInterface(predict, type="messages")

if __name__ == "__main__":
    demo.launch()
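Because `predict` yields the accumulated reply on every chunk, `gr.ChatInterface` streams the response into the chat window as it arrives. If you want to sanity-check the SambaNova endpoint outside Gradio first, a minimal sketch (assuming the same `SAMBANOVA_API_KEY` environment variable and the model name used above) looks like this:
In [ ]:
# Minimal sketch: call the same SambaNova endpoint directly and print the stream.
# Assumes SAMBANOVA_API_KEY is exported, as in the demo above.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.sambanova.ai/v1/",
    api_key=os.getenv("SAMBANOVA_API_KEY"),
)

stream = client.chat.completions.create(
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    model="Meta-Llama-3.1-70B-Instruct-8k",
    stream=True,
)

for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="", flush=True)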