Abubakar Abid 368ba73106
Update Chat Interface examples and add more LLM libraries and API providers (#10025)

Co-authored-by: aliabd <ali.si3luwa@gmail.com>
Co-authored-by: gradio-pr-bot <gradio-pr-bot@users.noreply.github.com>
2024-11-28 14:07:39 -05:00


Gradio Demo: llm_sambanova

In [ ]:
!pip install -q gradio "openai>=1.0.0"
In [ ]:
# This is a simple general-purpose chatbot built on top of the SambaNova API.
# Before running this, make sure you have exported your SambaNova API key as an environment variable:
# export SAMBANOVA_API_KEY="your-sambanova-api-key"

import os
import gradio as gr
from openai import OpenAI

api_key = os.getenv("SAMBANOVA_API_KEY")

client = OpenAI(
    base_url="https://api.sambanova.ai/v1/",
    api_key=api_key,
)

def predict(message, history):
    # Because gr.ChatInterface is created with type="messages" below, history
    # arrives as a list of {"role": ..., "content": ...} dicts, which is the
    # format the OpenAI-compatible API expects.
    history.append({"role": "user", "content": message})
    stream = client.chat.completions.create(
        messages=history,
        model="Meta-Llama-3.1-70B-Instruct-8k",
        stream=True,
    )
    chunks = []
    for chunk in stream:
        chunks.append(chunk.choices[0].delta.content or "")
        yield "".join(chunks)  # yield the full partial response accumulated so far

demo = gr.ChatInterface(predict, type="messages")

if __name__ == "__main__":
    demo.launch()
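The streaming-accumulation pattern used by `predict` can be exercised without a SambaNova API key by stubbing the client. The sketch below is illustrative only: `fake_stream` and `predict_stub` are hypothetical helpers that mimic the shape of the OpenAI client's streaming chunks, showing why each `yield` must return the full partial response rather than just the latest delta (gr.ChatInterface replaces the displayed message with each yielded value).

```python
# Minimal sketch of the same streaming-accumulation pattern, with a stubbed
# stream standing in for the SambaNova API (no API key required).
from types import SimpleNamespace

def fake_stream(text, size=4):
    # Yield objects shaped like the OpenAI client's ChatCompletionChunk:
    # chunk.choices[0].delta.content holds the next slice of text.
    for i in range(0, len(text), size):
        delta = SimpleNamespace(content=text[i:i + size])
        yield SimpleNamespace(choices=[SimpleNamespace(delta=delta)])

def predict_stub(message, history):
    # Same accumulation logic as predict() above: each yield is the
    # entire response so far, which is what gr.ChatInterface renders.
    chunks = []
    for chunk in fake_stream(f"Echo: {message}"):
        chunks.append(chunk.choices[0].delta.content or "")
        yield "".join(chunks)

partials = list(predict_stub("hello", []))
print(partials)  # each element extends the previous one
```

Swapping `fake_stream` back for `client.chat.completions.create(...)` recovers the original demo.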