Add Progress Bar component (#2750)

* changes

* version

* changes

* fixes

* changes

* changes

* changes

* changes

* changes

* changes

* fix

* changes

* changes

* changes

* change

* changes

* changes

* changes

* changes

* changes

* changes

* changes

* changes

* changes

* changes

* changes

* changes

* version update

* Commit from GitHub Actions (Upload Python Package)

* changes

* changes

* changes

* fix

* changes

* changes

* changes

* Update CHANGELOG.md

* Update CHANGELOG.md

* Update CHANGELOG.md

* changes

* changes

* changes

* changes

* change

* changes

* Update guides/01_getting_started/02_key_features.md

Co-authored-by: Abubakar Abid <abubakar@huggingface.co>

* Update gradio/helpers.py

Co-authored-by: Abubakar Abid <abubakar@huggingface.co>

* Update gradio/routes.py

Co-authored-by: Abubakar Abid <abubakar@huggingface.co>

* Update gradio/helpers.py

Co-authored-by: Abubakar Abid <abubakar@huggingface.co>

* Update guides/01_getting_started/02_key_features.md

Co-authored-by: Abubakar Abid <abubakar@huggingface.co>

* Update guides/01_getting_started/02_key_features.md

Co-authored-by: Abubakar Abid <abubakar@huggingface.co>

* Update demo/progress_simple/run.py

Co-authored-by: Abubakar Abid <abubakar@huggingface.co>

* Update demo/progress_simple/run.py

Co-authored-by: Abubakar Abid <abubakar@huggingface.co>

* Update demo/progress_simple/run.py

Co-authored-by: Abubakar Abid <abubakar@huggingface.co>

* Update website/homepage/src/docs/template.html

Co-authored-by: Abubakar Abid <abubakar@huggingface.co>

* Update website/homepage/src/docs/template.html

Co-authored-by: Abubakar Abid <abubakar@huggingface.co>

* changes

* changes

* changes

* changes

* changes

* changes

* changes

* change

* changes

* changes

* changes

* change

Co-authored-by: Abubakar Abid <abubakar@huggingface.co>
Co-authored-by: GH ACTIONS <aliabid94@users.noreply.github.com>
aliabid94 2022-12-30 11:45:54 -08:00 committed by GitHub
parent d46f0cd1ed
commit 58b1a074ba
31 changed files with 1740 additions and 558 deletions


@ -1,6 +1,24 @@
# Upcoming Release
## New Features:
* Send custom progress updates by adding a `gr.Progress` argument after the input arguments to any function. Example:
```python
def reverse(word, progress=gr.Progress()):
progress(0, desc="Starting")
time.sleep(1)
new_string = ""
for letter in progress.tqdm(word, desc="Reversing"):
time.sleep(0.25)
new_string = letter + new_string
return new_string
demo = gr.Interface(reverse, gr.Text(), gr.Text())
```
Progress indicator bar by [@aliabid94](https://github.com/aliabid94) in [PR 2750](https://github.com/gradio-app/gradio/pull/2750).
* Added `title` argument to `TabbedInterface` by @MohamedAliRashad in [#2888](https://github.com/gradio-app/gradio/pull/2888)
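For reference, a minimal sketch of how the new `title` argument can be combined with `tab_names` (the interfaces and names below are illustrative only, not taken from the PR):
```python
import gradio as gr

hello_world = gr.Interface(lambda name: "Hello " + name, "text", "text")
bye_world = gr.Interface(lambda name: "Bye " + name, "text", "text")

# `title` adds a page-level heading above the tabs.
demo = gr.TabbedInterface(
    [hello_world, bye_world], tab_names=["Hello World", "Bye World"], title="Greeter"
)
demo.launch()
```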
## Bug Fixes:


@ -0,0 +1,2 @@
tqdm
datasets

demo/progress/run.ipynb (new file)

@ -0,0 +1 @@
{"cells": [{"cell_type": "markdown", "id": 302934307671667531413257853548643485645, "metadata": {}, "source": ["# Gradio Demo: progress"]}, {"cell_type": "code", "execution_count": null, "id": 272996653310673477252411125948039410165, "metadata": {}, "outputs": [], "source": ["!pip install -q gradio tqdm datasets"]}, {"cell_type": "code", "execution_count": null, "id": 288918539441861185822528903084949547379, "metadata": {}, "outputs": [], "source": ["import gradio as gr\n", "import random\n", "import time\n", "import tqdm\n", "from datasets import load_dataset\n", "import shutil\n", "from uuid import uuid4\n", "\n", "with gr.Blocks() as demo:\n", " with gr.Row():\n", " text = gr.Textbox()\n", " textb = gr.Textbox()\n", " with gr.Row():\n", " load_set_btn = gr.Button(\"Load Set\")\n", " load_nested_set_btn = gr.Button(\"Load Nested Set\")\n", " load_random_btn = gr.Button(\"Load Random\")\n", " clean_imgs_btn = gr.Button(\"Clean Images\")\n", " wait_btn = gr.Button(\"Wait\")\n", " do_all_btn = gr.Button(\"Do All\")\n", " track_tqdm_btn = gr.Button(\"Bind TQDM\")\n", " bind_internal_tqdm_btn = gr.Button(\"Bind Internal TQDM\")\n", "\n", " text2 = gr.Textbox()\n", "\n", " # track list\n", " def load_set(text, text2, progress=gr.Progress()):\n", " imgs = [None] * 24\n", " for img in progress.tqdm(imgs, desc=\"Loading from list\"):\n", " time.sleep(0.1)\n", " return \"done\"\n", " load_set_btn.click(load_set, [text, textb], text2)\n", "\n", " # track nested list\n", " def load_nested_set(text, text2, progress=gr.Progress()):\n", " imgs = [[None] * 8] * 3\n", " for img_set in progress.tqdm(imgs, desc=\"Nested list\"):\n", " time.sleep(2)\n", " for img in progress.tqdm(img_set, desc=\"inner list\"):\n", " time.sleep(0.1)\n", " return \"done\"\n", " load_nested_set_btn.click(load_nested_set, [text, textb], text2)\n", "\n", " # track iterable of unknown length\n", " def load_random(data, progress=gr.Progress()):\n", " def yielder():\n", " for i in range(0, random.randint(15, 20)):\n", " time.sleep(0.1)\n", " yield None\n", " for img in progress.tqdm(yielder()):\n", " pass\n", " return \"done\"\n", " load_random_btn.click(load_random, {text, textb}, text2)\n", " \n", " # manual progress\n", " def clean_imgs(text, progress=gr.Progress()):\n", " progress(0.2, desc=\"Collecting Images\")\n", " time.sleep(1)\n", " progress(0.5, desc=\"Cleaning Images\")\n", " time.sleep(1.5)\n", " progress(0.8, desc=\"Sending Images\")\n", " time.sleep(1.5)\n", " return \"done\"\n", " clean_imgs_btn.click(clean_imgs, text, text2)\n", "\n", " # no progress\n", " def wait(text):\n", " time.sleep(4)\n", " return \"done\"\n", " wait_btn.click(wait, text, text2)\n", "\n", " # multiple progressions\n", " def do_all(data, progress=gr.Progress()):\n", " load_set(data[text], data[textb], progress)\n", " load_random(data, progress)\n", " clean_imgs(data[text], progress)\n", " progress(None)\n", " wait(text)\n", " return \"done\"\n", " do_all_btn.click(do_all, {text, textb}, text2)\n", "\n", " def track_tqdm(data, progress=gr.Progress(track_tqdm=True)):\n", " for i in tqdm.tqdm(range(5), desc=\"outer\"):\n", " for j in tqdm.tqdm(range(4), desc=\"inner\"):\n", " time.sleep(1)\n", " return \"done\"\n", " track_tqdm_btn.click(track_tqdm, {text, textb}, text2)\n", "\n", " def bind_internal_tqdm(data, progress=gr.Progress(track_tqdm=True)):\n", " outdir = \"__tmp/\" + str(uuid4())\n", " dataset = load_dataset(\"beans\", split=\"train\", cache_dir=outdir)\n", " shutil.rmtree(outdir)\n", " return \"done\"\n", " 
bind_internal_tqdm_btn.click(bind_internal_tqdm, {text, textb}, text2)\n", "\n", "\n", "if __name__ == \"__main__\":\n", " demo.queue(concurrency_count=20).launch()\n"]}], "metadata": {}, "nbformat": 4, "nbformat_minor": 5}

demo/progress/run.py (new file)

@ -0,0 +1,97 @@
import gradio as gr
import random
import time
import tqdm
from datasets import load_dataset
import shutil
from uuid import uuid4
with gr.Blocks() as demo:
with gr.Row():
text = gr.Textbox()
textb = gr.Textbox()
with gr.Row():
load_set_btn = gr.Button("Load Set")
load_nested_set_btn = gr.Button("Load Nested Set")
load_random_btn = gr.Button("Load Random")
clean_imgs_btn = gr.Button("Clean Images")
wait_btn = gr.Button("Wait")
do_all_btn = gr.Button("Do All")
track_tqdm_btn = gr.Button("Bind TQDM")
bind_internal_tqdm_btn = gr.Button("Bind Internal TQDM")
text2 = gr.Textbox()
# track list
def load_set(text, text2, progress=gr.Progress()):
imgs = [None] * 24
for img in progress.tqdm(imgs, desc="Loading from list"):
time.sleep(0.1)
return "done"
load_set_btn.click(load_set, [text, textb], text2)
# track nested list
def load_nested_set(text, text2, progress=gr.Progress()):
imgs = [[None] * 8] * 3
for img_set in progress.tqdm(imgs, desc="Nested list"):
time.sleep(2)
for img in progress.tqdm(img_set, desc="inner list"):
time.sleep(0.1)
return "done"
load_nested_set_btn.click(load_nested_set, [text, textb], text2)
# track iterable of unknown length
def load_random(data, progress=gr.Progress()):
def yielder():
for i in range(0, random.randint(15, 20)):
time.sleep(0.1)
yield None
for img in progress.tqdm(yielder()):
pass
return "done"
load_random_btn.click(load_random, {text, textb}, text2)
# manual progress
def clean_imgs(text, progress=gr.Progress()):
progress(0.2, desc="Collecting Images")
time.sleep(1)
progress(0.5, desc="Cleaning Images")
time.sleep(1.5)
progress(0.8, desc="Sending Images")
time.sleep(1.5)
return "done"
clean_imgs_btn.click(clean_imgs, text, text2)
# no progress
def wait(text):
time.sleep(4)
return "done"
wait_btn.click(wait, text, text2)
# multiple progressions
def do_all(data, progress=gr.Progress()):
load_set(data[text], data[textb], progress)
load_random(data, progress)
clean_imgs(data[text], progress)
progress(None)
wait(text)
return "done"
do_all_btn.click(do_all, {text, textb}, text2)
def track_tqdm(data, progress=gr.Progress(track_tqdm=True)):
for i in tqdm.tqdm(range(5), desc="outer"):
for j in tqdm.tqdm(range(4), desc="inner"):
time.sleep(1)
return "done"
track_tqdm_btn.click(track_tqdm, {text, textb}, text2)
def bind_internal_tqdm(data, progress=gr.Progress(track_tqdm=True)):
outdir = "__tmp/" + str(uuid4())
dataset = load_dataset("beans", split="train", cache_dir=outdir)
shutil.rmtree(outdir)
return "done"
bind_internal_tqdm_btn.click(bind_internal_tqdm, {text, textb}, text2)
if __name__ == "__main__":
demo.queue(concurrency_count=20).launch()


@ -0,0 +1 @@
{"cells": [{"cell_type": "markdown", "id": 302934307671667531413257853548643485645, "metadata": {}, "source": ["# Gradio Demo: progress_simple"]}, {"cell_type": "code", "execution_count": null, "id": 272996653310673477252411125948039410165, "metadata": {}, "outputs": [], "source": ["!pip install -q gradio "]}, {"cell_type": "code", "execution_count": null, "id": 288918539441861185822528903084949547379, "metadata": {}, "outputs": [], "source": ["import gradio as gr\n", "import time\n", "\n", "def slowly_reverse(word, progress=gr.Progress()):\n", " progress(0, desc=\"Starting\")\n", " time.sleep(1)\n", " progress(0.05)\n", " new_string = \"\"\n", " for letter in progress.tqdm(word, desc=\"Reversing\"):\n", " time.sleep(0.25)\n", " new_string = letter + new_string\n", " return new_string\n", "\n", "demo = gr.Interface(slowly_reverse, gr.Text(), gr.Text())\n", "\n", "if __name__ == \"__main__\":\n", " demo.queue(concurrency_count=10).launch()\n"]}], "metadata": {}, "nbformat": 4, "nbformat_minor": 5}


@ -0,0 +1,17 @@
import gradio as gr
import time
def slowly_reverse(word, progress=gr.Progress()):
progress(0, desc="Starting")
time.sleep(1)
progress(0.05)
new_string = ""
for letter in progress.tqdm(word, desc="Reversing"):
time.sleep(0.25)
new_string = letter + new_string
return new_string
demo = gr.Interface(slowly_reverse, gr.Text(), gr.Text())
if __name__ == "__main__":
demo.queue(concurrency_count=10).launch()


@ -5,7 +5,7 @@ import gradio.inputs as inputs
import gradio.outputs as outputs
import gradio.processing_utils
import gradio.templates
from gradio.blocks import Blocks, skip, update
from gradio.blocks import Blocks
from gradio.components import (
HTML,
JSON,
@ -49,7 +49,6 @@ from gradio.components import (
Video,
component,
)
from gradio.examples import create_examples as Examples
from gradio.exceptions import Error
from gradio.flagging import (
CSVLogger,
@ -58,6 +57,9 @@ from gradio.flagging import (
HuggingFaceDatasetSaver,
SimpleCSVLogger,
)
from gradio.helpers import Progress
from gradio.helpers import create_examples as Examples
from gradio.helpers import make_waveform, skip, update
from gradio.interface import Interface, TabbedInterface, close_all
from gradio.ipython_ext import load_ipython_extension
from gradio.layouts import Accordion, Box, Column, Group, Row, Tab, TabItem, Tabs
@ -79,7 +81,6 @@ from gradio.templates import (
TextArea,
Webcam,
)
from gradio.utils import make_waveform
current_pkg_version = pkgutil.get_data(__name__, "version.txt").decode("ascii").strip()
__version__ = current_pkg_version


@ -9,7 +9,6 @@ import pkgutil
import random
import sys
import time
import typing
import warnings
import webbrowser
from abc import abstractmethod
@ -36,6 +35,7 @@ from gradio.context import Context
from gradio.deprecation import check_deprecated_parameters
from gradio.documentation import document, set_documentation_group
from gradio.exceptions import DuplicateBlockError, InvalidApiName
from gradio.helpers import create_tracker, skip, special_args
from gradio.tunneling import CURRENT_TUNNELS
from gradio.utils import (
TupleNoPrint,
@ -357,58 +357,6 @@ class class_or_instancemethod(classmethod):
return descr_get(instance, type_)
set_documentation_group("component-helpers")
@document()
def update(**kwargs) -> dict:
"""
Updates component properties. When a function passed into a Gradio Interface or a Blocks events returns a typical value, it updates the value of the output component. But it is also possible to update the properties of an output component (such as the number of lines of a `Textbox` or the visibility of an `Image`) by returning the component's `update()` function, which takes as parameters any of the constructor parameters for that component.
This is a shorthand for using the update method on a component.
For example, rather than using gr.Number.update(...) you can just use gr.update(...).
Note that your editor's autocompletion will suggest proper parameters
if you use the update method on the component.
Demos: blocks_essay, blocks_update, blocks_essay_update
Parameters:
kwargs: Key-word arguments used to update the component's properties.
Example:
# Blocks Example
import gradio as gr
with gr.Blocks() as demo:
radio = gr.Radio([1, 2, 4], label="Set the value of the number")
number = gr.Number(value=2, interactive=True)
radio.change(fn=lambda value: gr.update(value=value), inputs=radio, outputs=number)
demo.launch()
# Interface example
import gradio as gr
def change_textbox(choice):
if choice == "short":
return gr.Textbox.update(lines=2, visible=True)
elif choice == "long":
return gr.Textbox.update(lines=8, visible=True)
else:
return gr.Textbox.update(visible=False)
gr.Interface(
change_textbox,
gr.Radio(
["short", "long", "none"], label="What kind of essay would you like to write?"
),
gr.Textbox(lines=2),
live=True,
).launch()
"""
kwargs["__type__"] = "generic_update"
return kwargs
set_documentation_group("blocks")
def skip() -> dict:
return update()
def postprocess_update_dict(block: Block, update_dict: Dict, postprocess: bool = True):
"""
Converts a dictionary of updates into a format that can be sent to the frontend.
@ -460,25 +408,6 @@ def convert_component_dict_to_list(
return predictions
def add_request_to_inputs(
fn: Callable,
inputs: List[Any],
request: routes.Request | List[routes.Request] | None,
):
"""
Adds the FastAPI Request object to the inputs of a function if the type of the parameter is FastAPI.Request.
"""
param_names = inspect.getfullargspec(fn)[0]
try:
parameter_types = typing.get_type_hints(fn)
for idx, param_name in enumerate(param_names):
if parameter_types.get(param_name, "") == routes.Request:
inputs.insert(idx, request)
except TypeError: # A TypeError is raised if the function is a partial or other rare cases.
pass
return inputs
@document("load")
class Blocks(BlockContext):
"""
@ -850,9 +779,18 @@ class Blocks(BlockContext):
fn_index: int,
processed_input: List[Any],
iterator: Iterator[Any] | None = None,
request: routes.Request | List[routes.Request] | None = None,
requests: routes.Request | List[routes.Request] | None = None,
event_id: str | None = None,
):
"""Calls and times function with given index and preprocessed input."""
"""
Calls function with given index and preprocessed input, and measures process time.
Parameters:
fn_index: index of function to call
processed_input: preprocessed input to pass to function
iterator: iterator to use if function is a generator
requests: requests to pass to function
event_id: id of event in queue
"""
block_fn = self.fns[fn_index]
assert block_fn.fn, f"function with index {fn_index} not defined."
is_generating = False
@ -864,18 +802,36 @@ class Blocks(BlockContext):
for input_component, data in zip(block_fn.inputs, processed_input)
}
]
processed_input = add_request_to_inputs(
block_fn.fn, list(processed_input), request
if isinstance(requests, list):
request = requests[0]
else:
request = requests
processed_input, progress_index = special_args(
block_fn.fn,
processed_input,
request,
)
progress_tracker = (
processed_input[progress_index] if progress_index is not None else None
)
start = time.time()
if iterator is None: # If not a generator function that has already run
if inspect.iscoroutinefunction(block_fn.fn):
prediction = await block_fn.fn(*processed_input)
if progress_tracker is not None and progress_index is not None:
progress_tracker, fn = create_tracker(
self, event_id, block_fn.fn, progress_tracker.track_tqdm
)
processed_input[progress_index] = progress_tracker
else:
fn = block_fn.fn
if inspect.iscoroutinefunction(fn):
prediction = await fn(*processed_input)
else:
prediction = await anyio.to_thread.run_sync(
block_fn.fn, *processed_input, limiter=self.limiter
fn, *processed_input, limiter=self.limiter
)
else:
prediction = None
@ -1008,6 +964,7 @@ class Blocks(BlockContext):
state: Dict[int, Any],
request: routes.Request | List[routes.Request] | None = None,
iterators: Dict[int, Any] | None = None,
event_id: str | None = None,
) -> Dict[str, Any]:
"""
Processes API calls from the frontend. First preprocesses the data,
@ -1055,7 +1012,9 @@ class Blocks(BlockContext):
else:
inputs = self.preprocess_data(fn_index, inputs, state)
iterator = iterators.get(fn_index, None) if iterators else None
result = await self.call_function(fn_index, inputs, iterator, request)
result = await self.call_function(
fn_index, inputs, iterator, request, event_id
)
data = self.postprocess_data(fn_index, result["prediction"], state)
is_generating, iterator = result["is_generating"], result["iterator"]
@ -1354,6 +1313,11 @@ class Blocks(BlockContext):
self.height = height
self.width = width
self.favicon_path = favicon_path
self.progress_tracking = any(
block_fn.fn is not None and special_args(block_fn.fn)[1] is not None
for block_fn in self.fns
)
if enable_queue is not None:
self.enable_queue = enable_queue
warnings.warn(
@ -1369,6 +1333,9 @@ class Blocks(BlockContext):
self.queue()
self.show_api = self.api_open if self.enable_queue else show_api
if not self.enable_queue and self.progress_tracking:
raise ValueError("Progress tracking requires queuing to be enabled.")
for dep in self.dependencies:
for i in dep["cancels"]:
if not self.queue_enabled_for_fn(i):
@ -1695,8 +1662,9 @@ class Blocks(BlockContext):
def startup_events(self):
"""Events that should be run when the app containing this block starts up."""
if self.enable_queue:
utils.run_coro_in_background(self._queue.start)
utils.run_coro_in_background(self._queue.start, (self.progress_tracking,))
utils.run_coro_in_background(self.create_limiter)
def queue_enabled_for_fn(self, fn_index: int):


@ -6,6 +6,7 @@ from pydantic import BaseModel
class PredictBody(BaseModel):
session_hash: Optional[str]
event_id: Optional[str]
data: List[Any]
fn_index: Optional[int]
batched: Optional[
@ -26,3 +27,26 @@ class InterfaceTypes(Enum):
INPUT_ONLY = auto()
OUTPUT_ONLY = auto()
UNIFIED = auto()
class Estimation(BaseModel):
msg: Optional[str] = "estimation"
rank: Optional[int] = None
queue_size: int
avg_event_process_time: Optional[float]
avg_event_concurrent_process_time: Optional[float]
rank_eta: Optional[int] = None
queue_eta: int
class ProgressUnit(BaseModel):
index: Optional[int]
length: Optional[int]
unit: Optional[str]
progress: Optional[float]
desc: Optional[str]
class Progress(BaseModel):
msg: str = "progress"
progress_data: List[ProgressUnit] = []


@ -58,8 +58,8 @@ def document_fn(fn: Callable) -> Tuple[str, List[Dict], Dict, Optional[str]]:
if mode == "description":
description.append(line if line.strip() else "<br>")
continue
assert line.startswith(
" "
assert (
line.startswith(" ") or line.strip() == ""
), f"Documentation format for {fn.__name__} has format error in line: {line}"
line = line[4:]
if mode == "parameter":


@ -31,7 +31,11 @@ def set_cancel_events(
)
class Changeable(Block):
class EventListener(Block):
pass
class Changeable(EventListener):
def change(
self,
fn: Callable | None,
@ -94,7 +98,7 @@ class Changeable(Block):
return dep
class Clickable(Block):
class Clickable(EventListener):
def click(
self,
fn: Callable | None,
@ -158,7 +162,7 @@ class Clickable(Block):
return dep
class Submittable(Block):
class Submittable(EventListener):
def submit(
self,
fn: Callable | None,
@ -223,7 +227,7 @@ class Submittable(Block):
return dep
class Editable(Block):
class Editable(EventListener):
def edit(
self,
fn: Callable | None,
@ -287,7 +291,7 @@ class Editable(Block):
return dep
class Clearable(Block):
class Clearable(EventListener):
def clear(
self,
fn: Callable | None,
@ -351,7 +355,7 @@ class Clearable(Block):
return dep
class Playable(Block):
class Playable(EventListener):
def play(
self,
fn: Callable | None,
@ -539,7 +543,7 @@ class Playable(Block):
return dep
class Streamable(Block):
class Streamable(EventListener):
def stream(
self,
fn: Callable | None,
@ -605,7 +609,7 @@ class Streamable(Block):
return dep
class Blurrable(Block):
class Blurrable(EventListener):
def blur(
self,
fn: Callable | None,
@ -662,7 +666,7 @@ class Blurrable(Block):
set_cancel_events(self, "blur", cancels)
class Uploadable(Block):
class Uploadable(EventListener):
def upload(
self,
fn: Callable | None,

gradio/helpers.py (new file)

@ -0,0 +1,792 @@
"""
Defines helper methods useful for loading and caching Interface examples.
"""
from __future__ import annotations
import ast
import csv
import inspect
import os
import subprocess
import tempfile
import threading
import warnings
from pathlib import Path
from typing import TYPE_CHECKING, Any, Callable, Iterable, List, Optional, Tuple
import matplotlib
import matplotlib.pyplot as plt
import numpy as np
import PIL
from gradio import processing_utils, routes, utils
from gradio.context import Context
from gradio.documentation import document, set_documentation_group
from gradio.flagging import CSVLogger
if TYPE_CHECKING: # Only import for type checking (to avoid circular imports).
from gradio.components import IOComponent
CACHED_FOLDER = "gradio_cached_examples"
LOG_FILE = "log.csv"
set_documentation_group("helpers")
def create_examples(
examples: List[Any] | List[List[Any]] | str,
inputs: IOComponent | List[IOComponent],
outputs: IOComponent | List[IOComponent] | None = None,
fn: Callable | None = None,
cache_examples: bool = False,
examples_per_page: int = 10,
_api_mode: bool = False,
label: str | None = None,
elem_id: str | None = None,
run_on_click: bool = False,
preprocess: bool = True,
postprocess: bool = True,
batch: bool = False,
):
"""Top-level synchronous function that creates Examples. Provided for backwards compatibility, i.e. so that gr.Examples(...) can be used to create the Examples component."""
examples_obj = Examples(
examples=examples,
inputs=inputs,
outputs=outputs,
fn=fn,
cache_examples=cache_examples,
examples_per_page=examples_per_page,
_api_mode=_api_mode,
label=label,
elem_id=elem_id,
run_on_click=run_on_click,
preprocess=preprocess,
postprocess=postprocess,
batch=batch,
_initiated_directly=False,
)
utils.synchronize_async(examples_obj.create)
return examples_obj
@document()
class Examples:
"""
This class is a wrapper over the Dataset component and can be used to create Examples
for Blocks / Interfaces. Populates the Dataset component with examples and
assigns event listener so that clicking on an example populates the input/output
components. Optionally handles example caching for fast inference.
Demos: blocks_inputs, fake_gan
Guides: more_on_examples_and_flagging, using_hugging_face_integrations, image_classification_in_pytorch, image_classification_in_tensorflow, image_classification_with_vision_transformers, create_your_own_friends_with_a_gan
"""
def __init__(
self,
examples: List[Any] | List[List[Any]] | str,
inputs: IOComponent | List[IOComponent],
outputs: Optional[IOComponent | List[IOComponent]] = None,
fn: Optional[Callable] = None,
cache_examples: bool = False,
examples_per_page: int = 10,
_api_mode: bool = False,
label: str = "Examples",
elem_id: Optional[str] = None,
run_on_click: bool = False,
preprocess: bool = True,
postprocess: bool = True,
batch: bool = False,
_initiated_directly: bool = True,
):
"""
Parameters:
examples: example inputs that can be clicked to populate specific components. Should be nested list, in which the outer list consists of samples and each inner list consists of an input corresponding to each input component. A string path to a directory of examples can also be provided but it should be within the directory with the python file running the gradio app. If there are multiple input components and a directory is provided, a log.csv file must be present in the directory to link corresponding inputs.
inputs: the component or list of components corresponding to the examples
outputs: optionally, provide the component or list of components corresponding to the output of the examples. Required if `cache` is True.
fn: optionally, provide the function to run to generate the outputs corresponding to the examples. Required if `cache` is True.
cache_examples: if True, caches examples for fast runtime. If True, then `fn` and `outputs` need to be provided
examples_per_page: how many examples to show per page.
label: the label to use for the examples component (by default, "Examples")
elem_id: an optional string that is assigned as the id of this component in the HTML DOM.
run_on_click: if cache_examples is False, clicking on an example does not run the function when an example is clicked. Set this to True to run the function when an example is clicked. Has no effect if cache_examples is True.
preprocess: if True, preprocesses the example input before running the prediction function and caching the output. Only applies if cache_examples is True.
postprocess: if True, postprocesses the example output after running the prediction function and before caching. Only applies if cache_examples is True.
batch: If True, then the function should process a batch of inputs, meaning that it should accept a list of input values for each parameter. Used only if cache_examples is True.
"""
if _initiated_directly:
warnings.warn(
"Please use gr.Examples(...) instead of gr.examples.Examples(...) to create the Examples.",
)
if cache_examples and (fn is None or outputs is None):
raise ValueError("If caching examples, `fn` and `outputs` must be provided")
if not isinstance(inputs, list):
inputs = [inputs]
if not isinstance(outputs, list):
outputs = [outputs]
working_directory = Path().absolute()
if examples is None:
raise ValueError("The parameter `examples` cannot be None")
elif isinstance(examples, list) and (
len(examples) == 0 or isinstance(examples[0], list)
):
pass
elif (
isinstance(examples, list) and len(inputs) == 1
): # If there is only one input component, examples can be provided as a regular list instead of a list of lists
examples = [[e] for e in examples]
elif isinstance(examples, str):
if not os.path.exists(examples):
raise FileNotFoundError(
"Could not find examples directory: " + examples
)
working_directory = examples
if not os.path.exists(os.path.join(examples, LOG_FILE)):
if len(inputs) == 1:
examples = [[e] for e in os.listdir(examples)]
else:
raise FileNotFoundError(
"Could not find log file (required for multiple inputs): "
+ LOG_FILE
)
else:
with open(os.path.join(examples, LOG_FILE)) as logs:
examples = list(csv.reader(logs))
examples = [
examples[i][: len(inputs)] for i in range(1, len(examples))
] # remove header and unnecessary columns
else:
raise ValueError(
"The parameter `examples` must either be a string directory or a list"
"(if there is only 1 input component) or (more generally), a nested "
"list, where each sublist represents a set of inputs."
)
input_has_examples = [False] * len(inputs)
for example in examples:
for idx, example_for_input in enumerate(example):
if not (example_for_input is None):
try:
input_has_examples[idx] = True
except IndexError:
pass # If there are more example components than inputs, ignore. This can sometimes be intentional (e.g. loading from a log file where outputs and timestamps are also logged)
inputs_with_examples = [
inp for (inp, keep) in zip(inputs, input_has_examples) if keep
]
non_none_examples = [
[ex for (ex, keep) in zip(example, input_has_examples) if keep]
for example in examples
]
self.examples = examples
self.non_none_examples = non_none_examples
self.inputs = inputs
self.inputs_with_examples = inputs_with_examples
self.outputs = outputs
self.fn = fn
self.cache_examples = cache_examples
self._api_mode = _api_mode
self.preprocess = preprocess
self.postprocess = postprocess
self.batch = batch
with utils.set_directory(working_directory):
self.processed_examples = [
[
component.postprocess(sample)
for component, sample in zip(inputs, example)
]
for example in examples
]
self.non_none_processed_examples = [
[ex for (ex, keep) in zip(example, input_has_examples) if keep]
for example in self.processed_examples
]
if cache_examples:
for example in self.examples:
if len([ex for ex in example if ex is not None]) != len(self.inputs):
warnings.warn(
"Examples are being cached but not all input components have "
"example values. This may result in an exception being thrown by "
"your function. If you do get an error while caching examples, make "
"sure all of your inputs have example values for all of your examples "
"or you provide default values for those particular parameters in your function."
)
break
from gradio.components import Dataset
with utils.set_directory(working_directory):
self.dataset = Dataset(
components=inputs_with_examples,
samples=non_none_examples,
type="index",
label=label,
samples_per_page=examples_per_page,
elem_id=elem_id,
)
self.cached_folder = os.path.join(CACHED_FOLDER, str(self.dataset._id))
self.cached_file = os.path.join(self.cached_folder, "log.csv")
self.cache_examples = cache_examples
self.run_on_click = run_on_click
async def create(self) -> None:
"""Caches the examples if self.cache_examples is True and creates the Dataset
component to hold the examples"""
async def load_example(example_id):
if self.cache_examples:
processed_example = self.non_none_processed_examples[
example_id
] + await self.load_from_cache(example_id)
else:
processed_example = self.non_none_processed_examples[example_id]
return utils.resolve_singleton(processed_example)
if Context.root_block:
self.dataset.click(
load_example,
inputs=[self.dataset],
outputs=self.inputs_with_examples
+ (self.outputs if self.cache_examples else []),
postprocess=False,
queue=False,
)
if self.run_on_click and not self.cache_examples:
self.dataset.click(
self.fn,
inputs=self.inputs,
outputs=self.outputs,
)
if self.cache_examples:
await self.cache()
async def cache(self) -> None:
"""
Caches all of the examples so that their predictions can be shown immediately.
"""
if os.path.exists(self.cached_file):
print(
f"Using cache from '{os.path.abspath(self.cached_folder)}' directory. If method or examples have changed since last caching, delete this folder to clear cache."
)
else:
if Context.root_block is None:
raise ValueError("Cannot cache examples if not in a Blocks context")
print(f"Caching examples at: '{os.path.abspath(self.cached_file)}'")
cache_logger = CSVLogger()
# create a fake dependency to process the examples and get the predictions
dependency = Context.root_block.set_event_trigger(
event_name="fake_event",
fn=self.fn,
inputs=self.inputs_with_examples,
outputs=self.outputs,
preprocess=self.preprocess and not self._api_mode,
postprocess=self.postprocess and not self._api_mode,
batch=self.batch,
)
fn_index = Context.root_block.dependencies.index(dependency)
cache_logger.setup(self.outputs, self.cached_folder)
for example_id, _ in enumerate(self.examples):
processed_input = self.processed_examples[example_id]
if self.batch:
processed_input = [[value] for value in processed_input]
prediction = await Context.root_block.process_api(
fn_index=fn_index, inputs=processed_input, request=None, state={}
)
output = prediction["data"]
if self.batch:
output = [value[0] for value in output]
cache_logger.flag(output)
# Remove the "fake_event" to prevent bugs in loading interfaces from spaces
Context.root_block.dependencies.remove(dependency)
Context.root_block.fns.pop(fn_index)
async def load_from_cache(self, example_id: int) -> List[Any]:
"""Loads a particular cached example for the interface.
Parameters:
example_id: The id of the example to process (zero-indexed).
"""
with open(self.cached_file) as cache:
examples = list(csv.reader(cache))
example = examples[example_id + 1] # +1 to adjust for header
output = []
for component, value in zip(self.outputs, example):
try:
value_as_dict = ast.literal_eval(value)
assert utils.is_update(value_as_dict)
output.append(value_as_dict)
except (ValueError, TypeError, SyntaxError, AssertionError):
output.append(component.serialize(value, self.cached_folder))
return output
class TrackedIterable:
def __init__(
self,
iterable: Iterable,
index: int | None,
length: int | None,
desc: str | None,
unit: str | None,
_tqdm=None,
progress: float = None,
) -> None:
self.iterable = iterable
self.index = index
self.length = length
self.desc = desc
self.unit = unit
self._tqdm = _tqdm
self.progress = progress
@document("__call__", "tqdm")
class Progress(Iterable):
"""
The Progress class provides a custom progress tracker that is used in a function signature.
To attach a Progress tracker to a function, simply add a parameter right after the input parameters that has a default value set to a `gradio.Progress()` instance.
The Progress tracker can then be updated in the function by calling the Progress object or using the `tqdm` method on an Iterable.
The Progress tracker is currently only available with `queue()`.
Example:
import gradio as gr
import time
def my_function(x, progress=gr.Progress()):
progress(0, desc="Starting...")
time.sleep(1)
for i in progress.tqdm(range(100)):
time.sleep(0.1)
return x
gr.Interface(my_function, gr.Textbox(), gr.Textbox()).queue().launch()
Demos: progress
"""
def __init__(
self,
track_tqdm: bool = False,
_active: bool = False,
_callback: Callable = None,
_event_id: str = None,
):
"""
Parameters:
track_tqdm: If True, the Progress object will track any tqdm.tqdm iterations with the tqdm library in the function.
"""
self.track_tqdm = track_tqdm
self._active = _active
self._callback = _callback
self._event_id = _event_id
self.iterables: List[TrackedIterable] = []
def __len__(self):
return self.iterables[-1].length
def __iter__(self):
return self
def __next__(self):
"""
Updates progress tracker with next item in iterable.
"""
if self._active:
current_iterable = self.iterables[-1]
while (
not hasattr(current_iterable.iterable, "__next__")
and len(self.iterables) > 0
):
current_iterable = self.iterables.pop()
self._callback(
event_id=self._event_id,
iterables=self.iterables,
)
current_iterable.index += 1
try:
return next(current_iterable.iterable)
except StopIteration:
self.iterables.pop()
raise StopIteration
else:
return self
def __call__(
self,
progress: float | Tuple[int, int | None] | None,
desc: str | None = None,
total: float | None = None,
unit: str = "steps",
_tqdm=None,
):
"""
Updates progress tracker with progress and message text.
Parameters:
progress: If float, should be between 0 and 1 representing completion. If Tuple, first number represents steps completed, and second value represents total steps or None if unknown. If None, hides progress bar.
desc: description to display.
total: estimated total number of steps.
unit: unit of iterations.
"""
if self._active:
if isinstance(progress, tuple):
index, total = progress
progress = None
else:
index = None
self._callback(
event_id=self._event_id,
iterables=self.iterables
+ [TrackedIterable(None, index, total, desc, unit, _tqdm, progress)],
)
else:
return progress
def tqdm(
self,
iterable: Iterable | None,
desc: str = None,
total: float = None,
unit: str = "steps",
_tqdm=None,
*args,
**kwargs,
):
"""
Attaches progress tracker to iterable, like tqdm.
Parameters:
iterable: iterable to attach progress tracker to.
desc: description to display.
total: estimated total number of steps.
unit: unit of iterations.
"""
if iterable is None:
new_iterable = TrackedIterable(None, 0, total, desc, unit, _tqdm)
self.iterables.append(new_iterable)
self._callback(event_id=self._event_id, iterables=self.iterables)
return
length = len(iterable) if hasattr(iterable, "__len__") else None
self.iterables.append(
TrackedIterable(iter(iterable), 0, length, desc, unit, _tqdm)
)
return self
def update(self, n=1):
"""
Increases latest iterable with specified number of steps.
Parameters:
n: number of steps completed.
"""
if self._active and len(self.iterables) > 0:
current_iterable = self.iterables[-1]
current_iterable.index += n
self._callback(
event_id=self._event_id,
iterables=self.iterables,
)
else:
return
def close(self, _tqdm):
"""
Removes iterable with given _tqdm.
"""
if self._active:
for i in range(len(self.iterables)):
if id(self.iterables[i]._tqdm) == id(_tqdm):
self.iterables.pop(i)
break
self._callback(
event_id=self._event_id,
iterables=self.iterables,
)
else:
return
def create_tracker(root_blocks, event_id, fn, track_tqdm):
progress = Progress(
_active=True, _callback=root_blocks._queue.set_progress, _event_id=event_id
)
if not track_tqdm:
return progress, fn
try:
_tqdm = __import__("tqdm")
except ModuleNotFoundError:
return progress, fn
if not hasattr(root_blocks, "_progress_tracker_per_thread"):
root_blocks._progress_tracker_per_thread = {}
def init_tqdm(self, iterable=None, desc=None, *args, **kwargs):
self._progress = root_blocks._progress_tracker_per_thread.get(
threading.get_ident()
)
if self._progress is not None:
self._progress.event_id = event_id
self._progress.tqdm(iterable, desc, _tqdm=self, *args, **kwargs)
kwargs["file"] = open(os.devnull, "w")
self.__init__orig__(iterable, desc, *args, **kwargs)
def iter_tqdm(self):
if self._progress is not None:
return self._progress
else:
return self.__iter__orig__()
def update_tqdm(self, n=1):
if self._progress is not None:
self._progress.update(n)
return self.__update__orig__(n)
def close_tqdm(self):
if self._progress is not None:
self._progress.close(self)
return self.__close__orig__()
def exit_tqdm(self, exc_type, exc_value, traceback):
if self._progress is not None:
self._progress.close(self)
return self.__exit__orig__(exc_type, exc_value, traceback)
if not hasattr(_tqdm.tqdm, "__init__orig__"):
_tqdm.tqdm.__init__orig__ = _tqdm.tqdm.__init__
_tqdm.tqdm.__init__ = init_tqdm
if not hasattr(_tqdm.tqdm, "__update__orig__"):
_tqdm.tqdm.__update__orig__ = _tqdm.tqdm.update
_tqdm.tqdm.update = update_tqdm
if not hasattr(_tqdm.tqdm, "__close__orig__"):
_tqdm.tqdm.__close__orig__ = _tqdm.tqdm.close
_tqdm.tqdm.close = close_tqdm
if not hasattr(_tqdm.tqdm, "__exit__orig__"):
_tqdm.tqdm.__exit__orig__ = _tqdm.tqdm.__exit__
_tqdm.tqdm.__exit__ = exit_tqdm
if not hasattr(_tqdm.tqdm, "__iter__orig__"):
_tqdm.tqdm.__iter__orig__ = _tqdm.tqdm.__iter__
_tqdm.tqdm.__iter__ = iter_tqdm
if hasattr(_tqdm, "auto") and hasattr(_tqdm.auto, "tqdm"):
_tqdm.auto.tqdm = _tqdm.tqdm
def tracked_fn(*args):
thread_id = threading.get_ident()
root_blocks._progress_tracker_per_thread[thread_id] = progress
response = fn(*args)
del root_blocks._progress_tracker_per_thread[thread_id]
return response
return progress, tracked_fn
def special_args(
fn: Callable,
inputs: List[Any] | None = None,
request: routes.Request | None = None,
):
"""
Checks if function has special arguments Request (via annotation) or Progress (via default value).
If inputs is provided, these values will be loaded into the inputs array.
Parameters:
block_fn: function to check.
inputs: array to load special arguments into.
request: request to load into inputs.
Returns:
updated inputs, request index, progress index
"""
signature = inspect.signature(fn)
positional_args = []
for i, param in enumerate(signature.parameters.values()):
if param.kind not in (param.POSITIONAL_ONLY, param.POSITIONAL_OR_KEYWORD):
break
positional_args.append(param)
progress_index = None
for i, param in enumerate(positional_args):
if isinstance(param.default, Progress):
progress_index = i
if inputs is not None:
inputs.insert(i, param.default)
elif param.annotation == routes.Request:
if inputs is not None:
inputs.insert(i, request)
if inputs is not None:
while len(inputs) < len(positional_args):
i = len(inputs)
param = positional_args[i]
if param.default == param.empty:
warnings.warn("Unexpected argument. Filling with None.")
inputs.append(None)
else:
inputs.append(param.default)
return inputs or [], progress_index
@document()
def update(**kwargs) -> dict:
"""
Updates component properties. When a function passed into a Gradio Interface or a Blocks events returns a typical value, it updates the value of the output component. But it is also possible to update the properties of an output component (such as the number of lines of a `Textbox` or the visibility of an `Image`) by returning the component's `update()` function, which takes as parameters any of the constructor parameters for that component.
This is a shorthand for using the update method on a component.
For example, rather than using gr.Number.update(...) you can just use gr.update(...).
Note that your editor's autocompletion will suggest proper parameters
if you use the update method on the component.
Demos: blocks_essay, blocks_update, blocks_essay_update
Parameters:
kwargs: Key-word arguments used to update the component's properties.
Example:
# Blocks Example
import gradio as gr
with gr.Blocks() as demo:
radio = gr.Radio([1, 2, 4], label="Set the value of the number")
number = gr.Number(value=2, interactive=True)
radio.change(fn=lambda value: gr.update(value=value), inputs=radio, outputs=number)
demo.launch()
# Interface example
import gradio as gr
def change_textbox(choice):
if choice == "short":
return gr.Textbox.update(lines=2, visible=True)
elif choice == "long":
return gr.Textbox.update(lines=8, visible=True)
else:
return gr.Textbox.update(visible=False)
gr.Interface(
change_textbox,
gr.Radio(
["short", "long", "none"], label="What kind of essay would you like to write?"
),
gr.Textbox(lines=2),
live=True,
).launch()
"""
kwargs["__type__"] = "generic_update"
return kwargs
def skip() -> dict:
return update()
@document()
def make_waveform(
audio: str | Tuple[int, np.ndarray],
*,
bg_color: str = "#f3f4f6",
bg_image: str = None,
fg_alpha: float = 0.75,
bars_color: str | Tuple[str, str] = ("#fbbf24", "#ea580c"),
bar_count: int = 50,
bar_width: float = 0.6,
):
"""
Generates a waveform video from an audio file. Useful for creating an easy to share audio visualization. The output should be passed into a `gr.Video` component.
Parameters:
audio: Audio file path or tuple of (sample_rate, audio_data)
bg_color: Background color of waveform (ignored if bg_image is provided)
bg_image: Background image of waveform
fg_alpha: Opacity of foreground waveform
bars_color: Color of waveform bars. Can be a single color or a tuple of (start_color, end_color) of gradient
bar_count: Number of bars in waveform
bar_width: Width of bars in waveform. 1 represents full width, 0.5 represents half width, etc.
Returns:
A filepath to the output video.
"""
if isinstance(audio, str):
audio_file = audio
audio = processing_utils.audio_from_file(audio)
else:
tmp_wav = tempfile.NamedTemporaryFile(suffix=".wav", delete=False)
processing_utils.audio_to_file(audio[0], audio[1], tmp_wav.name)
audio_file = tmp_wav.name
duration = round(len(audio[1]) / audio[0], 4)
# Helper methods to create waveform
def hex_to_RGB(hex_str):
return [int(hex_str[i : i + 2], 16) for i in range(1, 6, 2)]
def get_color_gradient(c1, c2, n):
assert n > 1
c1_rgb = np.array(hex_to_RGB(c1)) / 255
c2_rgb = np.array(hex_to_RGB(c2)) / 255
mix_pcts = [x / (n - 1) for x in range(n)]
rgb_colors = [((1 - mix) * c1_rgb + (mix * c2_rgb)) for mix in mix_pcts]
return [
"#" + "".join([format(int(round(val * 255)), "02x") for val in item])
for item in rgb_colors
]
# Reshape audio to have a fixed number of bars
samples = audio[1]
if len(samples.shape) > 1:
samples = np.mean(samples, 1)
bins_to_pad = bar_count - (len(samples) % bar_count)
samples = np.pad(samples, [(0, bins_to_pad)])
samples = np.reshape(samples, (bar_count, -1))
samples = np.abs(samples)
samples = np.max(samples, 1)
matplotlib.use("Agg")
plt.clf()
# Plot waveform
color = (
bars_color
if isinstance(bars_color, str)
else get_color_gradient(bars_color[0], bars_color[1], bar_count)
)
plt.bar(
np.arange(0, bar_count),
samples * 2,
bottom=(-1 * samples),
width=bar_width,
color=color,
)
plt.axis("off")
plt.margins(x=0)
tmp_img = tempfile.NamedTemporaryFile(suffix=".png", delete=False)
savefig_kwargs = {"bbox_inches": "tight"}
if bg_image is not None:
savefig_kwargs["transparent"] = True
else:
savefig_kwargs["facecolor"] = bg_color
plt.savefig(tmp_img.name, **savefig_kwargs)
waveform_img = PIL.Image.open(tmp_img.name)
waveform_img = waveform_img.resize((1000, 200))
# Composite waveform with background image
if bg_image is not None:
waveform_array = np.array(waveform_img)
waveform_array[:, :, 3] = waveform_array[:, :, 3] * fg_alpha
waveform_img = PIL.Image.fromarray(waveform_array)
bg_img = PIL.Image.open(bg_image)
waveform_width, waveform_height = waveform_img.size
bg_width, bg_height = bg_img.size
if waveform_width != bg_width:
bg_img = bg_img.resize(
(waveform_width, 2 * int(bg_height * waveform_width / bg_width / 2))
)
bg_width, bg_height = bg_img.size
composite_height = max(bg_height, waveform_height)
composite = PIL.Image.new("RGBA", (waveform_width, composite_height), "#FFFFFF")
composite.paste(bg_img, (0, composite_height - bg_height))
composite.paste(
waveform_img, (0, composite_height - waveform_height), waveform_img
)
composite.save(tmp_img.name)
img_width, img_height = composite.size
else:
img_width, img_height = waveform_img.size
waveform_img.save(tmp_img.name)
# Convert waveform to video with ffmpeg
output_mp4 = tempfile.NamedTemporaryFile(suffix=".mp4", delete=False)
ffmpeg_cmd = f"""ffmpeg -loop 1 -i {tmp_img.name} -i {audio_file} -vf "color=c=#FFFFFF77:s={img_width}x{img_height}[bar];[0][bar]overlay=-w+(w/{duration})*t:H-h:shortest=1" -t {duration} -y {output_mp4.name}"""
subprocess.call(ffmpeg_cmd, shell=True)
return output_mp4.name


@ -8,34 +8,28 @@ from collections import deque
from typing import Any, Deque, Dict, List, Optional, Tuple
import fastapi
from pydantic import BaseModel
from gradio.data_classes import PredictBody
from gradio.data_classes import Estimation, PredictBody, Progress, ProgressUnit
from gradio.helpers import TrackedIterable
from gradio.utils import AsyncRequest, run_coro_in_background, set_task_name
class Estimation(BaseModel):
msg: Optional[str] = "estimation"
rank: Optional[int] = None
queue_size: int
avg_event_process_time: Optional[float]
avg_event_concurrent_process_time: Optional[float]
rank_eta: Optional[int] = None
queue_eta: int
class Event:
def __init__(
self,
websocket: fastapi.WebSocket,
fn_index: int | None = None,
session_hash: str,
fn_index: int,
):
self.websocket = websocket
self.session_hash: str = session_hash
self.fn_index: int = fn_index
self._id = f"{self.session_hash}_{self.fn_index}"
self.data: PredictBody | None = None
self.lost_connection_time: float | None = None
self.fn_index: int | None = fn_index
self.session_hash: str = "foo"
self.token: str | None = None
self.progress: Progress | None = None
self.progress_pending: bool = False
async def disconnect(self, code=1000):
await self.websocket.close(code=code)
@ -65,12 +59,15 @@ class Queue:
self.queue_duration = 1
self.live_updates = live_updates
self.sleep_when_free = 0.05
self.progress_update_sleep_when_free = 0.1
self.max_size = max_size
self.blocks_dependencies = blocks_dependencies
self.access_token = ""
async def start(self):
async def start(self, progress_tracking=False):
run_coro_in_background(self.start_processing)
if progress_tracking:
run_coro_in_background(self.start_progress_tracking)
if not self.live_updates:
run_coro_in_background(self.notify_clients)
@ -132,6 +129,51 @@ class Queue:
run_coro_in_background(self.broadcast_live_estimations)
set_task_name(task, events[0].session_hash, events[0].fn_index, batch)
async def start_progress_tracking(self) -> None:
while not self.stopped:
if not any(self.active_jobs):
await asyncio.sleep(self.progress_update_sleep_when_free)
continue
for job in self.active_jobs:
if job is None:
continue
for event in job:
if event.progress_pending:
event.progress_pending = False
client_awake = await self.send_message(
event, event.progress.dict()
)
if not client_awake:
await self.clean_event(event)
await asyncio.sleep(self.progress_update_sleep_when_free)
def set_progress(
self,
event_id: str,
iterables: List[TrackedIterable] | None,
):
if iterables is None:
return
for job in self.active_jobs:
if job is None:
continue
for evt in job:
if evt._id == event_id:
progress_data: List[ProgressUnit] = []
for iterable in iterables:
progress_unit = ProgressUnit(
index=iterable.index,
length=iterable.length,
unit=iterable.unit,
progress=iterable.progress,
desc=iterable.desc,
)
progress_data.append(progress_unit)
evt.progress = Progress(progress_data=progress_data)
evt.progress_pending = True
def push(self, event: Event) -> int | None:
"""
Add event to queue, or return None if Queue is full
@ -254,6 +296,7 @@ class Queue:
async def call_prediction(self, events: List[Event], batch: bool):
data = events[0].data
token = events[0].token
data.event_id = events[0]._id if not batch else None
try:
data.request = self.get_request_params(events[0].websocket)
except ValueError:
@ -288,7 +331,7 @@ class Queue:
)
if client_awake:
awake_events.append(event)
if not (awake_events):
if not awake_events:
return
begin_time = time.time()
response = await self.call_prediction(awake_events, batch)


@ -312,6 +312,7 @@ class App(FastAPI):
else:
session_state = {}
iterators = {}
event_id = getattr(body, "event_id", None)
raw_input = body.data
fn_index = body.fn_index
batch = app.blocks.dependencies[fn_index]["batch"]
@ -324,6 +325,7 @@ class App(FastAPI):
request=request,
state=session_state,
iterators=iterators,
event_id=event_id,
)
iterator = output.pop("iterator", None)
if hasattr(body, "session_hash"):
@ -403,16 +405,15 @@ class App(FastAPI):
app_url = get_server_url_from_ws_url(str(websocket.url))
app.blocks._queue.set_url(app_url)
await websocket.accept()
event = Event(websocket)
# set the token into Event to allow using the same token for call_prediction
event.token = token
# In order to cancel jobs, we need the session_hash and fn_index
# to create a unique id for each job
await websocket.send_json({"msg": "send_hash"})
session_hash = await websocket.receive_json()
event.session_hash = session_hash["session_hash"]
event.fn_index = session_hash["fn_index"]
session_info = await websocket.receive_json()
event = Event(
websocket, session_info["session_hash"], session_info["fn_index"]
)
# set the token into Event to allow using the same token for call_prediction
event.token = token
# Continuous events are not put in the queue so that they do not
# occupy the queue's resource as they are expected to run forever


@ -11,9 +11,7 @@ import os
import pkgutil
import random
import re
import subprocess
import sys
import tempfile
import time
import typing
import warnings
@ -39,17 +37,12 @@ from typing import (
import aiohttp
import fsspec.asyn
import httpx
import matplotlib
import matplotlib.pyplot as plt
import numpy as np
import PIL
import requests
from pydantic import BaseModel, Json, parse_obj_as
import gradio
from gradio import processing_utils
from gradio.context import Context
from gradio.documentation import document, set_documentation_group
if TYPE_CHECKING: # Only import for type checking (is False at runtime).
from gradio.blocks import BlockContext
@ -816,129 +809,6 @@ class TupleNoPrint(tuple):
return ""
set_documentation_group("component-helpers")
@document()
def make_waveform(
audio: str | Tuple[int, np.ndarray],
*,
bg_color: str = "#f3f4f6",
bg_image: str = None,
fg_alpha: float = 0.75,
bars_color: str | Tuple[str, str] = ("#fbbf24", "#ea580c"),
bar_count: int = 50,
bar_width: float = 0.6,
):
"""
Generates a waveform video from an audio file. Useful for creating an easy to share audio visualization. The output should be passed into a `gr.Video` component.
Parameters:
audio: Audio file path or tuple of (sample_rate, audio_data)
bg_color: Background color of waveform (ignored if bg_image is provided)
bg_image: Background image of waveform
fg_alpha: Opacity of foreground waveform
bars_color: Color of waveform bars. Can be a single color or a tuple of (start_color, end_color) of gradient
bar_count: Number of bars in waveform
bar_width: Width of bars in waveform. 1 represents full width, 0.5 represents half width, etc.
Returns:
A filepath to the output video.
"""
if isinstance(audio, str):
audio_file = audio
audio = processing_utils.audio_from_file(audio)
else:
tmp_wav = tempfile.NamedTemporaryFile(suffix=".wav", delete=False)
processing_utils.audio_to_file(audio[0], audio[1], tmp_wav.name)
audio_file = tmp_wav.name
duration = round(len(audio[1]) / audio[0], 4)
# Helper methods to create waveform
def hex_to_RGB(hex_str):
return [int(hex_str[i : i + 2], 16) for i in range(1, 6, 2)]
def get_color_gradient(c1, c2, n):
assert n > 1
c1_rgb = np.array(hex_to_RGB(c1)) / 255
c2_rgb = np.array(hex_to_RGB(c2)) / 255
mix_pcts = [x / (n - 1) for x in range(n)]
rgb_colors = [((1 - mix) * c1_rgb + (mix * c2_rgb)) for mix in mix_pcts]
return [
"#" + "".join([format(int(round(val * 255)), "02x") for val in item])
for item in rgb_colors
]
# Reshape audio to have a fixed number of bars
samples = audio[1]
if len(samples.shape) > 1:
samples = np.mean(samples, 1)
bins_to_pad = bar_count - (len(samples) % bar_count)
samples = np.pad(samples, [(0, bins_to_pad)])
samples = np.reshape(samples, (bar_count, -1))
samples = np.abs(samples)
samples = np.max(samples, 1)
matplotlib.use("Agg")
plt.clf()
# Plot waveform
color = (
bars_color
if isinstance(bars_color, str)
else get_color_gradient(bars_color[0], bars_color[1], bar_count)
)
plt.bar(
np.arange(0, bar_count),
samples * 2,
bottom=(-1 * samples),
width=bar_width,
color=color,
)
plt.axis("off")
plt.margins(x=0)
tmp_img = tempfile.NamedTemporaryFile(suffix=".png", delete=False)
savefig_kwargs = {"bbox_inches": "tight"}
if bg_image is not None:
savefig_kwargs["transparent"] = True
else:
savefig_kwargs["facecolor"] = bg_color
plt.savefig(tmp_img.name, **savefig_kwargs)
waveform_img = PIL.Image.open(tmp_img.name)
waveform_img = waveform_img.resize((1000, 200))
# Composite waveform with background image
if bg_image is not None:
waveform_array = np.array(waveform_img)
waveform_array[:, :, 3] = waveform_array[:, :, 3] * fg_alpha
waveform_img = PIL.Image.fromarray(waveform_array)
bg_img = PIL.Image.open(bg_image)
waveform_width, waveform_height = waveform_img.size
bg_width, bg_height = bg_img.size
if waveform_width != bg_width:
bg_img = bg_img.resize(
(waveform_width, 2 * int(bg_height * waveform_width / bg_width / 2))
)
bg_width, bg_height = bg_img.size
composite_height = max(bg_height, waveform_height)
composite = PIL.Image.new("RGBA", (waveform_width, composite_height), "#FFFFFF")
composite.paste(bg_img, (0, composite_height - bg_height))
composite.paste(
waveform_img, (0, composite_height - waveform_height), waveform_img
)
composite.save(tmp_img.name)
img_width, img_height = composite.size
else:
img_width, img_height = waveform_img.size
waveform_img.save(tmp_img.name)
# Convert waveform to video with ffmpeg
output_mp4 = tempfile.NamedTemporaryFile(suffix=".mp4", delete=False)
ffmpeg_cmd = f"""ffmpeg -loop 1 -i {tmp_img.name} -i {audio_file} -vf "color=c=#FFFFFF77:s={img_width}x{img_height}[bar];[0][bar]overlay=-w+(w/{duration})*t:H-h:shortest=1" -t {duration} -y {output_mp4.name}"""
subprocess.call(ffmpeg_cmd, shell=True)
return output_mp4.name
def tex2svg(formula, *args):
FONTSIZE = 20
DPI = 300


@ -182,6 +182,15 @@ Note that we've added a `time.sleep(1)` in the iterator to create an artificial
Supplying a generator into Gradio **requires** you to enable queuing in the underlying Interface or Blocks (see the queuing section above).
## Progress Bars
Gradio supports the ability to create custom Progress Bars so that you have customizability and control over the progress updates that you show to the user. In order to enable this, simply add an argument to your method that has a default value of a `gradio.Progress` instance. Then you can update the progress levels by calling this instance directly with a float between 0 and 1, or by using the `tqdm()` method of the `Progress` instance to track progress over an iterable, as shown below. Queueing must be enabled for progress updates.
$code_progress_simple
$demo_progress_simple
If you use the `tqdm` library, you can even report progress updates automatically from any `tqdm.tqdm` that already exists within your function by setting the default argument to `gr.Progress(track_tqdm=True)`!
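For example, a minimal sketch of this pattern (the function name and timings below are illustrative, not one of the bundled demos):

```python
import time

import gradio as gr
import tqdm


def slow_reverse(word, progress=gr.Progress(track_tqdm=True)):
    # Any tqdm.tqdm loop inside this function is mirrored to the Gradio progress bar.
    reversed_word = ""
    for letter in tqdm.tqdm(word, desc="Reversing"):
        time.sleep(0.25)
        reversed_word = letter + reversed_word
    return reversed_word


demo = gr.Interface(slow_reverse, gr.Text(), gr.Text())
demo.queue()  # queueing must be enabled for progress updates
demo.launch()
```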
## Batch Functions
Gradio supports the ability to pass *batch* functions. Batch functions are just

View File

@ -22,3 +22,4 @@ respx
fastapi>=0.87.0
altair
vega_datasets
tqdm

View File

@ -1047,6 +1047,146 @@ class TestEvery:
assert "At step 9" not in captured.out
class TestProgressBar:
@pytest.mark.asyncio
async def test_progress_bar(self):
from tqdm import tqdm
with gr.Blocks() as demo:
name = gr.Textbox()
greeting = gr.Textbox()
button = gr.Button(value="Greet")
def greet(s, prog=gr.Progress()):
prog(0, desc="start")
time.sleep(0.25)
for _ in prog.tqdm(range(4), unit="iter"):
time.sleep(0.25)
time.sleep(1)
for i in tqdm(["a", "b", "c"], desc="alphabet"):
time.sleep(0.25)
return f"Hello, {s}!"
button.click(greet, name, greeting)
demo.queue(max_size=1).launch(prevent_thread_lock=True)
async with websockets.connect(
f"{demo.local_url.replace('http', 'ws')}queue/join"
) as ws:
completed = False
progress_updates = []
while not completed:
msg = json.loads(await ws.recv())
if msg["msg"] == "send_data":
await ws.send(json.dumps({"data": [0], "fn_index": 0}))
if msg["msg"] == "send_hash":
await ws.send(json.dumps({"fn_index": 0, "session_hash": "shdce"}))
if msg["msg"] == "progress":
progress_updates.append(msg["progress_data"])
if msg["msg"] == "process_completed":
completed = True
break
print(progress_updates)
assert progress_updates == [
[
{
"index": None,
"length": None,
"unit": "steps",
"progress": 0.0,
"desc": "start",
}
],
[{"index": 0, "length": 4, "unit": "iter", "progress": None, "desc": None}],
[{"index": 1, "length": 4, "unit": "iter", "progress": None, "desc": None}],
[{"index": 2, "length": 4, "unit": "iter", "progress": None, "desc": None}],
[{"index": 3, "length": 4, "unit": "iter", "progress": None, "desc": None}],
[{"index": 4, "length": 4, "unit": "iter", "progress": None, "desc": None}],
]
@pytest.mark.asyncio
async def test_progress_bar_track_tqdm(self):
from tqdm import tqdm
with gr.Blocks() as demo:
name = gr.Textbox()
greeting = gr.Textbox()
button = gr.Button(value="Greet")
def greet(s, prog=gr.Progress(track_tqdm=True)):
prog(0, desc="start")
time.sleep(0.25)
for _ in prog.tqdm(range(4), unit="iter"):
time.sleep(0.25)
time.sleep(1)
for i in tqdm(["a", "b", "c"], desc="alphabet"):
time.sleep(0.25)
return f"Hello, {s}!"
button.click(greet, name, greeting)
demo.queue(max_size=1).launch(prevent_thread_lock=True)
async with websockets.connect(
f"{demo.local_url.replace('http', 'ws')}queue/join"
) as ws:
completed = False
progress_updates = []
while not completed:
msg = json.loads(await ws.recv())
if msg["msg"] == "send_data":
await ws.send(json.dumps({"data": [0], "fn_index": 0}))
if msg["msg"] == "send_hash":
await ws.send(json.dumps({"fn_index": 0, "session_hash": "shdce"}))
if msg["msg"] == "progress":
progress_updates.append(msg["progress_data"])
if msg["msg"] == "process_completed":
completed = True
break
assert progress_updates == [
[
{
"index": None,
"length": None,
"unit": "steps",
"progress": 0.0,
"desc": "start",
}
],
[{"index": 0, "length": 4, "unit": "iter", "progress": None, "desc": None}],
[{"index": 1, "length": 4, "unit": "iter", "progress": None, "desc": None}],
[{"index": 2, "length": 4, "unit": "iter", "progress": None, "desc": None}],
[{"index": 3, "length": 4, "unit": "iter", "progress": None, "desc": None}],
[{"index": 4, "length": 4, "unit": "iter", "progress": None, "desc": None}],
[
{
"index": 0,
"length": 3,
"unit": "steps",
"progress": None,
"desc": "alphabet",
}
],
[
{
"index": 1,
"length": 3,
"unit": "steps",
"progress": None,
"desc": "alphabet",
}
],
[
{
"index": 2,
"length": 3,
"unit": "steps",
"progress": None,
"desc": "alphabet",
}
],
]
class TestAddRequests:
def test_no_type_hints(self):
def moo(a, b):
@ -1054,12 +1194,12 @@ class TestAddRequests:
inputs = [1, 2]
request = gr.Request()
inputs_ = gr.blocks.add_request_to_inputs(moo, copy.deepcopy(inputs), request)
inputs_ = gr.helpers.special_args(moo, copy.deepcopy(inputs), request)[0]
assert inputs_ == inputs
boo = partial(moo, a=1)
inputs = [2]
inputs_ = gr.blocks.add_request_to_inputs(boo, copy.deepcopy(inputs), request)
inputs_ = gr.helpers.special_args(boo, copy.deepcopy(inputs), request)[0]
assert inputs_ == inputs
def test_no_type_hints_with_request(self):
@ -1068,12 +1208,12 @@ class TestAddRequests:
inputs = ["abc", 2]
request = gr.Request()
inputs_ = gr.blocks.add_request_to_inputs(moo, copy.deepcopy(inputs), request)
inputs_ = gr.helpers.special_args(moo, copy.deepcopy(inputs), request)[0]
assert inputs_ == inputs
boo = partial(moo, a="def")
inputs = [2]
inputs_ = gr.blocks.add_request_to_inputs(boo, copy.deepcopy(inputs), request)
inputs_ = gr.helpers.special_args(boo, copy.deepcopy(inputs), request)[0]
assert inputs_ == inputs
def test_type_hints_with_request(self):
@ -1082,7 +1222,7 @@ class TestAddRequests:
inputs = ["abc"]
request = gr.Request()
inputs_ = gr.blocks.add_request_to_inputs(moo, copy.deepcopy(inputs), request)
inputs_ = gr.helpers.special_args(moo, copy.deepcopy(inputs), request)[0]
assert inputs_ == inputs + [request]
def moo(a: gr.Request, b, c: int):
@ -1090,7 +1230,7 @@ class TestAddRequests:
inputs = ["abc", 5]
request = gr.Request()
inputs_ = gr.blocks.add_request_to_inputs(moo, copy.deepcopy(inputs), request)
inputs_ = gr.helpers.special_args(moo, copy.deepcopy(inputs), request)[0]
assert inputs_ == [request] + inputs
def test_type_hints_with_multiple_requests(self):
@ -1099,7 +1239,7 @@ class TestAddRequests:
inputs = ["abc"]
request = gr.Request()
inputs_ = gr.blocks.add_request_to_inputs(moo, copy.deepcopy(inputs), request)
inputs_ = gr.helpers.special_args(moo, copy.deepcopy(inputs), request)[0]
assert inputs_ == inputs + [request, request]
def moo(a: gr.Request, b, c: int, d: gr.Request):
@ -1107,7 +1247,7 @@ class TestAddRequests:
inputs = ["abc", 5]
request = gr.Request()
inputs_ = gr.blocks.add_request_to_inputs(moo, copy.deepcopy(inputs), request)
inputs_ = gr.helpers.special_args(moo, copy.deepcopy(inputs), request)[0]
assert inputs_ == [request] + inputs + [request]
@ -1195,7 +1335,7 @@ async def test_queue_when_using_auth():
loop = asyncio.get_event_loop()
tm = loop.time()
group = asyncio.gather(
*[run_ws(loop, tm + sleep_time * (i + 1) - 0.3, i) for i in range(3)]
*[run_ws(loop, tm + sleep_time * (i + 1) - 1, i) for i in range(3)]
)
await group

View File

@ -9,7 +9,7 @@ import gradio as gr
os.environ["GRADIO_ANALYTICS_ENABLED"] = "False"
@patch("gradio.examples.CACHED_FOLDER", tempfile.mkdtemp())
@patch("gradio.helpers.CACHED_FOLDER", tempfile.mkdtemp())
class TestExamples:
def test_handle_single_input(self):
examples = gr.Examples(["hello", "hi"], gr.Textbox())
@ -89,7 +89,7 @@ class TestExamples:
assert prediction[0][0][0]["data"] == gr.media_data.BASE64_IMAGE
@patch("gradio.examples.CACHED_FOLDER", tempfile.mkdtemp())
@patch("gradio.helpers.CACHED_FOLDER", tempfile.mkdtemp())
class TestExamplesDataset:
def test_no_headers(self):
examples = gr.Examples("test/test_files/images_log", [gr.Image(), gr.Text()])
@ -109,7 +109,7 @@ class TestExamplesDataset:
assert examples.dataset.headers == ["im", ""]
@patch("gradio.examples.CACHED_FOLDER", tempfile.mkdtemp())
@patch("gradio.helpers.CACHED_FOLDER", tempfile.mkdtemp())
class TestProcessExamples:
@pytest.mark.asyncio
async def test_caching(self):

View File

@ -280,7 +280,7 @@ class TestLoadInterface:
class TestLoadInterfaceWithExamples:
def test_interface_load_examples(self, tmp_path):
test_file_dir = pathlib.Path(pathlib.Path(__file__).parent, "test_files")
with patch("gradio.examples.CACHED_FOLDER", tmp_path):
with patch("gradio.helpers.CACHED_FOLDER", tmp_path):
gr.Interface.load(
name="models/google/vit-base-patch16-224",
examples=[pathlib.Path(test_file_dir, "cheetah1.jpg")],
@ -289,7 +289,7 @@ class TestLoadInterfaceWithExamples:
def test_interface_load_cache_examples(self, tmp_path):
test_file_dir = pathlib.Path(pathlib.Path(__file__).parent, "test_files")
with patch("gradio.examples.CACHED_FOLDER", tmp_path):
with patch("gradio.helpers.CACHED_FOLDER", tmp_path):
gr.Interface.load(
name="models/google/vit-base-patch16-224",
examples=[pathlib.Path(test_file_dir, "cheetah1.jpg")],

View File

@ -32,7 +32,7 @@ def queue() -> Queue:
@pytest.fixture()
def mock_event() -> Event:
websocket = MagicMock()
event = Event(websocket=websocket, fn_index=0)
event = Event(websocket=websocket, session_hash="test", fn_index=0)
yield event
@ -326,7 +326,7 @@ class TestQueueBatch:
queue.clean_event = AsyncMock()
websocket = MagicMock()
mock_event2 = Event(websocket=websocket, fn_index=0)
mock_event2 = Event(websocket=websocket, session_hash="test", fn_index=0)
mock_event2.disconnect = AsyncMock()
queue.active_jobs = [[mock_event, mock_event2]]
@ -350,10 +350,10 @@ class TestGetEventsInBatch:
queue.event_queue = deque()
queue.event_queue.extend(
[
Event(websocket=MagicMock(), fn_index=0),
Event(websocket=MagicMock(), fn_index=0),
Event(websocket=MagicMock(), fn_index=0),
Event(websocket=MagicMock(), fn_index=0),
Event(websocket=MagicMock(), session_hash="test", fn_index=0),
Event(websocket=MagicMock(), session_hash="test", fn_index=0),
Event(websocket=MagicMock(), session_hash="test", fn_index=0),
Event(websocket=MagicMock(), session_hash="test", fn_index=0),
]
)
events, batch = queue.get_events_in_batch()
@ -372,12 +372,12 @@ class TestGetEventsInBatch:
queue.event_queue = deque()
queue.event_queue.extend(
[
Event(websocket=MagicMock(), fn_index=0),
Event(websocket=MagicMock(), fn_index=1),
Event(websocket=MagicMock(), fn_index=0),
Event(websocket=MagicMock(), fn_index=1),
Event(websocket=MagicMock(), fn_index=0),
Event(websocket=MagicMock(), fn_index=0),
Event(websocket=MagicMock(), session_hash="test", fn_index=0),
Event(websocket=MagicMock(), session_hash="test", fn_index=1),
Event(websocket=MagicMock(), session_hash="test", fn_index=0),
Event(websocket=MagicMock(), session_hash="test", fn_index=1),
Event(websocket=MagicMock(), session_hash="test", fn_index=0),
Event(websocket=MagicMock(), session_hash="test", fn_index=0),
]
)
events, batch = queue.get_events_in_batch()
@ -400,11 +400,11 @@ class TestGetEventsInBatch:
queue.event_queue = deque()
queue.event_queue.extend(
[
Event(websocket=MagicMock(), fn_index=0),
Event(websocket=MagicMock(), fn_index=1),
Event(websocket=MagicMock(), fn_index=0),
Event(websocket=MagicMock(), fn_index=1),
Event(websocket=MagicMock(), fn_index=1),
Event(websocket=MagicMock(), session_hash="test", fn_index=0),
Event(websocket=MagicMock(), session_hash="test", fn_index=1),
Event(websocket=MagicMock(), session_hash="test", fn_index=0),
Event(websocket=MagicMock(), session_hash="test", fn_index=1),
Event(websocket=MagicMock(), session_hash="test", fn_index=1),
]
)
events, batch = queue.get_events_in_batch()

View File

@ -191,6 +191,18 @@ export const fn =
null
);
break;
case "progress":
loading_status.update(
fn_index,
"pending",
queue,
null,
null,
null,
null,
data.progress_data
);
break;
case "process_generating":
loading_status.update(
fn_index,

View File

@ -1,6 +1,7 @@
<script context="module" lang="ts">
import { tick } from "svelte";
import { fade } from "svelte/transition";
import { prettySI } from "../utils/helpers";
let items: Array<HTMLDivElement> = [];
@ -48,6 +49,7 @@
import { onDestroy } from "svelte";
import { app_state } from "../../stores";
import Loader from "./Loader.svelte";
import type { LoadingStatus } from "./types";
export let eta: number | null = null;
export let queue: boolean = false;
@ -58,6 +60,7 @@
export let timer: boolean = true;
export let visible: boolean = true;
export let message: string | null = null;
export let progress: LoadingStatus["progress"] | null | undefined = null;
export let variant: "default" | "center" = "default";
let el: HTMLDivElement;
@ -67,18 +70,54 @@
let timer_diff = 0;
let old_eta: number | null = null;
let message_visible: boolean = false;
let eta_level: number | null = 0;
let progress_level: Array<number | undefined> | null = null;
let last_progress_level: number | undefined = undefined;
let progress_bar: HTMLElement | null = null;
let show_eta_bar: boolean = true;
$: progress =
$: eta_level =
eta === null || eta <= 0 || !timer_diff
? null
: Math.min(timer_diff / eta, 1);
$: if (progress != null) {
show_eta_bar = false;
}
$: {
if (progress != null) {
progress_level = progress.map((p) => {
if (p.index != null && p.length != null) {
return p.index / p.length;
} else if (p.progress != null) {
return p.progress;
} else {
return undefined;
}
});
} else {
progress_level = null;
}
if (progress_level) {
last_progress_level = progress_level[progress_level.length - 1];
if (progress_bar) {
if (last_progress_level === 0) {
progress_bar.classList.remove("transition-transform");
} else {
progress_bar.classList.add("transition-transform");
}
}
} else {
last_progress_level = undefined;
}
}
const start_timer = () => {
timer_start = performance.now();
timer_diff = 0;
_timer = true;
run();
// timer = setInterval(, 100);
};
function run() {
@ -153,26 +192,69 @@
bind:this={el}
>
{#if status === "pending"}
{#if variant === "default"}
<div class="progress-bar" style:transform="scaleX({progress || 0})" />
{#if variant === "default" && show_eta_bar}
<div class="eta-bar" style:transform="scaleX({eta_level || 0})" />
{/if}
<div
class="dark:text-gray-400"
class:meta-text-center={variant === "center"}
class:meta-text={variant === "default"}
>
{#if queue_position !== null && queue_size !== undefined && queue_position >= 0}
{#if progress}
{#each progress as p}
{#if p.index != null}
{#if p.length != null}
{prettySI(p.index || 0)}/{prettySI(p.length)}
{:else}
{prettySI(p.index || 0)}
{/if}
{p.unit} | {" "}
{/if}
{/each}
{:else if queue_position !== null && queue_size !== undefined && queue_position >= 0}
queue: {queue_position + 1}/{queue_size} |
{:else if queue_position === 0}
processing |
{/if}
{#if timer}
{formatted_timer}{eta ? `/${formatted_eta}` : ""}
{formatted_timer}{eta ? `/${formatted_eta}` : ""}s
{/if}
</div>
{#if last_progress_level != null}
<div class="z-20 w-full flex items-center flex-col gap-1">
<div class="m-2 mx-auto font-mono text-xs dark:text-gray-100">
{#if progress != null}
{#each progress as p, i}
{#if p.desc != null || (progress_level && progress_level[i] != null)}
{#if i !== 0}
&nbsp;/
{/if}
{#if p.desc != null}
{p.desc}
{/if}
{#if p.desc != null && progress_level && progress_level[i] != null}
-
{/if}
{#if progress_level != null}
{(100 * (progress_level[i] || 0)).toFixed(1)}%
{/if}
{/if}
{/each}
{/if}
</div>
<div class="w-2/3 h-4 rounded bg-white border">
<div
bind:this={progress_bar}
class="progress-bar"
style:transform="scaleX({last_progress_level})"
/>
</div>
</div>
{:else}
<Loader margin={variant === "default"} />
{/if}
{#if !timer}
<p class="timer">Loading...</p>
@ -199,7 +281,7 @@
>
</div>
<div class="px-3 py-3 text-base font-mono">
{message}
{message || ""}
</div>
</div>
</div>
@ -224,9 +306,12 @@
@apply border-2 border-orange-500 animate-pulse;
}
.progress-bar {
.eta-bar {
@apply absolute inset-0 origin-left bg-slate-100 dark:bg-gray-700 top-0 left-0 z-10 opacity-80;
}
.progress-bar {
@apply rounded inset-0 origin-left h-full w-full bg-orange-500;
}
.meta-text {
@apply absolute top-0 right-0 py-1 px-2 font-mono z-20 text-xs;

View File

@ -7,4 +7,11 @@ export interface LoadingStatus {
visible: boolean;
fn_index: number;
message?: string;
progress?: Array<{
progress: number | null;
index: number | null;
length: number | null;
unit: string | null;
desc: string | null;
}>;
}

View File

@ -58,3 +58,14 @@ export const prettyBytes = (bytes: number): string => {
let unit = units[i];
return bytes.toFixed(1) + " " + unit;
};
export const prettySI = (num: number): string => {
let units = ["", "k", "M", "G", "T", "P", "E", "Z"];
let i = 0;
while (num > 1000 && i < units.length - 1) {
num /= 1000;
i++;
}
let unit = units[i];
return (Number.isInteger(num) ? num : num.toFixed(1)) + unit;
};

View File

@ -10,6 +10,13 @@ export interface LoadingStatus {
message?: string | null;
scroll_to_output?: boolean;
visible?: boolean;
progress?: Array<{
progress: number | null;
index: number | null;
length: number | null;
unit: string | null;
desc: string | null;
}>;
}
export type LoadingStatusCollection = Record<number, LoadingStatus>;
@ -32,7 +39,8 @@ export function create_loading_status_store() {
size: LoadingStatus["queue_size"],
position: LoadingStatus["queue_position"],
eta: LoadingStatus["eta"],
message: LoadingStatus["message"]
message: LoadingStatus["message"],
progress?: LoadingStatus["progress"]
) {
const outputs = fn_outputs[fn_index];
const inputs = fn_inputs[fn_index];
@ -69,7 +77,8 @@ export function create_loading_status_store() {
queue_size: size,
eta: eta,
status: new_status,
message: message
message: message,
progress: progress
};
});
@ -91,13 +100,22 @@ export function create_loading_status_store() {
store.update((outputs) => {
outputs_to_update.forEach(
({ id, queue_position, queue_size, eta, status, message }) => {
({
id,
queue_position,
queue_size,
eta,
status,
message,
progress
}) => {
outputs[id] = {
queue: queue,
queue_size: queue_size,
queue_position: queue_position,
eta: eta,
message,
progress,
status,
fn_index
};

View File

@ -1,4 +1,4 @@
lockfileVersion: 5.4
lockfileVersion: 5.3
importers:
@ -43,7 +43,7 @@ importers:
'@tailwindcss/forms': 0.5.0_tailwindcss@3.1.6
'@testing-library/dom': 8.11.3
'@testing-library/svelte': 3.1.0_svelte@3.49.0
'@testing-library/user-event': 13.5.0_gzufz4q333be4gqfrvipwvqt6a
'@testing-library/user-event': 13.5.0_@testing-library+dom@8.11.3
autoprefixer: 10.4.4_postcss@8.4.6
babylonjs: 5.18.0
babylonjs-loaders: 5.18.0
@ -56,13 +56,13 @@ importers:
postcss: 8.4.6
postcss-nested: 5.0.6_postcss@8.4.6
prettier: 2.6.2
prettier-plugin-svelte: 2.7.0_3cyj5wbackxvw67rnaarcmbw7y
prettier-plugin-svelte: 2.7.0_prettier@2.6.2+svelte@3.49.0
sirv: 2.0.2
sirv-cli: 2.0.2
svelte: 3.49.0
svelte-check: 2.8.0_mgmdnb6x5rpawk37gozc2sbtta
svelte-check: 2.8.0_postcss@8.4.6+svelte@3.49.0
svelte-i18n: 3.3.13_svelte@3.49.0
svelte-preprocess: 4.10.6_mlkquajfpxs65rn6bdfntu7nmy
svelte-preprocess: 4.10.6_62d50a01257de5eec5be08cad9d3ed66
tailwindcss: 3.1.6
tinyspy: 0.3.0
typescript: 4.7.4
@ -293,7 +293,7 @@ importers:
'@gradio/utils': link:../utils
'@rollup/plugin-json': 5.0.2
plotly.js-dist-min: 2.11.1
svelte-vega: 1.2.0_36sthfwhgi34qytpvkzggbhnle
svelte-vega: 1.2.0_vega-lite@5.6.0+vega@5.22.1
vega: 5.22.1
vega-lite: 5.6.0_vega@5.22.1
@ -415,13 +415,13 @@ importers:
'@gradio/video': link:../video
svelte: 3.49.0
devDependencies:
'@sveltejs/adapter-auto': 1.0.0-next.90
'@sveltejs/adapter-auto': 1.0.0-next.91_@sveltejs+kit@1.0.0-next.318
'@sveltejs/kit': 1.0.0-next.318_svelte@3.49.0
autoprefixer: 10.4.2_postcss@8.4.6
postcss: 8.4.6
postcss-load-config: 3.1.1
svelte-check: 2.4.1_onvlxjpnd23pr3hxbmout2wrjm
svelte-preprocess: 4.10.2_2udzbozq3wemyrf2xz7puuv2zy
svelte-check: 2.4.1_736abba5ed1eb6f8ecf70b1d49ead14b
svelte-preprocess: 4.10.2_d50790bb30dd88cc44babe7efa52bace
tailwindcss: 3.0.23_autoprefixer@10.4.2
tslib: 2.3.1
typescript: 4.5.5
@ -593,9 +593,12 @@ packages:
picomatch: 2.3.1
dev: false
/@sveltejs/adapter-auto/1.0.0-next.90:
resolution: {integrity: sha512-qxH46Oqqn40998wTmnbffONI0HcW/kiZ3OIjZoysjONne+LU4uEsG425MZ2RHDxmR04zxhsdjCAsn6B4du8D7w==}
/@sveltejs/adapter-auto/1.0.0-next.91_@sveltejs+kit@1.0.0-next.318:
resolution: {integrity: sha512-U57tQdzTfFINim8tzZSARC9ztWPzwOoHwNOpGdb2o6XrD0mEQwU9DsII7dBblvzg+xCnmd0pw7PDtXz5c5t96w==}
peerDependencies:
'@sveltejs/kit': ^1.0.0-next.587
dependencies:
'@sveltejs/kit': 1.0.0-next.318_svelte@3.49.0
import-meta-resolve: 2.2.0
dev: true
@ -711,7 +714,7 @@ packages:
svelte: 3.49.0
dev: false
/@testing-library/user-event/13.5.0_gzufz4q333be4gqfrvipwvqt6a:
/@testing-library/user-event/13.5.0_@testing-library+dom@8.11.3:
resolution: {integrity: sha512-5Kwtbo3Y/NowpkbRuSepbyMFkZmHgD+vPzYB/RJ4oxt5Gj/avFFBYjhw27cqSVPVw/3a67NK1PbiIr9k4Gwmdg==}
engines: {node: '>=10', npm: '>=6'}
peerDependencies:
@ -2700,7 +2703,7 @@ packages:
picocolors: 1.0.0
source-map-js: 1.0.2
/prettier-plugin-svelte/2.7.0_3cyj5wbackxvw67rnaarcmbw7y:
/prettier-plugin-svelte/2.7.0_prettier@2.6.2+svelte@3.49.0:
resolution: {integrity: sha512-fQhhZICprZot2IqEyoiUYLTRdumULGRvw0o4dzl5jt0jfzVWdGqeYW27QTWAeXhoupEZJULmNoH3ueJwUWFLIA==}
peerDependencies:
prettier: ^1.16.4 || ^2.0.0
@ -3106,7 +3109,7 @@ packages:
resolution: {integrity: sha512-ot0WnXS9fgdkgIcePe6RHNk1WA8+muPa6cSjeR3V8K27q9BB1rTE3R1p7Hv0z1ZyAc8s6Vvv8DIyWf681MAt0w==}
engines: {node: '>= 0.4'}
/svelte-check/2.4.1_onvlxjpnd23pr3hxbmout2wrjm:
/svelte-check/2.4.1_736abba5ed1eb6f8ecf70b1d49ead14b:
resolution: {integrity: sha512-xhf3ShP5rnRwBokrgTBJ/0cO9QIc1DAVu1NWNRTfCDsDBNjGmkS3HgitgUadRuoMKj1+irZR/yHJ+Uqobnkbrw==}
hasBin: true
peerDependencies:
@ -3120,7 +3123,7 @@ packages:
sade: 1.8.1
source-map: 0.7.3
svelte: 3.49.0
svelte-preprocess: 4.10.2_2udzbozq3wemyrf2xz7puuv2zy
svelte-preprocess: 4.10.2_d50790bb30dd88cc44babe7efa52bace
typescript: 4.5.5
transitivePeerDependencies:
- '@babel/core'
@ -3135,7 +3138,7 @@ packages:
- sugarss
dev: true
/svelte-check/2.8.0_mgmdnb6x5rpawk37gozc2sbtta:
/svelte-check/2.8.0_postcss@8.4.6+svelte@3.49.0:
resolution: {integrity: sha512-HRL66BxffMAZusqe5I5k26mRWQ+BobGd9Rxm3onh7ZVu0nTk8YTKJ9vu3LVPjUGLU9IX7zS+jmwPVhJYdXJ8vg==}
hasBin: true
peerDependencies:
@ -3148,7 +3151,7 @@ packages:
picocolors: 1.0.0
sade: 1.8.1
svelte: 3.49.0
svelte-preprocess: 4.10.6_mlkquajfpxs65rn6bdfntu7nmy
svelte-preprocess: 4.10.6_62d50a01257de5eec5be08cad9d3ed66
typescript: 4.7.4
transitivePeerDependencies:
- '@babel/core'
@ -3186,7 +3189,7 @@ packages:
tiny-glob: 0.2.9
dev: false
/svelte-preprocess/4.10.2_2udzbozq3wemyrf2xz7puuv2zy:
/svelte-preprocess/4.10.2_d50790bb30dd88cc44babe7efa52bace:
resolution: {integrity: sha512-aPpkCreSo8EL/y8kJSa1trhiX0oyAtTjlNNM7BNjRAsMJ8Yy2LtqHt0zyd4pQPXt+D4PzbO3qTjjio3kwOxDlA==}
engines: {node: '>= 9.11.2'}
requiresBuild: true
@ -3239,7 +3242,7 @@ packages:
typescript: 4.5.5
dev: true
/svelte-preprocess/4.10.6_mlkquajfpxs65rn6bdfntu7nmy:
/svelte-preprocess/4.10.6_62d50a01257de5eec5be08cad9d3ed66:
resolution: {integrity: sha512-I2SV1w/AveMvgIQlUF/ZOO3PYVnhxfcpNyGt8pxpUVhPfyfL/CZBkkw/KPfuFix5FJ9TnnNYMhACK3DtSaYVVQ==}
engines: {node: '>= 9.11.2'}
requiresBuild: true
@ -3295,7 +3298,7 @@ packages:
resolution: {integrity: sha512-VTWHOdwDyWbndGZnI0PQJY9DO7hgQlNubtCcCL6Wlypv5dU4vEsc4A1sX9TWMuvebEe4332SgsQQHzOdZ+guhQ==}
dev: false
/svelte-vega/1.2.0_36sthfwhgi34qytpvkzggbhnle:
/svelte-vega/1.2.0_vega-lite@5.6.0+vega@5.22.1:
resolution: {integrity: sha512-MsDdO+l7o/d9d4mVkh8MBDhqZvJ45lpuprBaTj0V/ZilIG902QERHFQlam3ZFcR9C9OIKSpmPqINssWNPkDdcA==}
peerDependencies:
vega: '*'
@ -3303,7 +3306,7 @@ packages:
dependencies:
fast-deep-equal: 3.1.3
vega: 5.22.1
vega-embed: 6.21.0_36sthfwhgi34qytpvkzggbhnle
vega-embed: 6.21.0_vega-lite@5.6.0+vega@5.22.1
vega-lite: 5.6.0_vega@5.22.1
dev: false
@ -3534,7 +3537,7 @@ packages:
- encoding
dev: false
/vega-embed/6.21.0_36sthfwhgi34qytpvkzggbhnle:
/vega-embed/6.21.0_vega-lite@5.6.0+vega@5.22.1:
resolution: {integrity: sha512-Tzo9VAfgNRb6XpxSFd7uphSeK2w5OxDY2wDtmpsQ+rQlPSEEI9TE6Jsb2nHRLD5J4FrmXKLrTcORqidsNQSXEg==}
peerDependencies:
vega: ^5.21.0
@ -3548,7 +3551,7 @@ packages:
vega-interpreter: 1.0.4
vega-lite: 5.6.0_vega@5.22.1
vega-schema-url-parser: 2.2.0
vega-themes: 2.12.0_36sthfwhgi34qytpvkzggbhnle
vega-themes: 2.12.0_vega-lite@5.6.0+vega@5.22.1
vega-tooltip: 0.28.0
dev: false
bundledDependencies:
@ -3767,7 +3770,7 @@ packages:
d3-array: 3.1.1
dev: false
/vega-themes/2.12.0_36sthfwhgi34qytpvkzggbhnle:
/vega-themes/2.12.0_vega-lite@5.6.0+vega@5.22.1:
resolution: {integrity: sha512-gHNYCzDgexSQDmGzQsxH57OYgFVbAOmvhIYN3MPOvVucyI+zhbUawBVIVNzG9ftucRp0MaaMVXi6ctC5HLnBsg==}
peerDependencies:
vega: '*'

View File

@ -1,7 +1,6 @@
import os
from gradio.documentation import generate_documentation, document_cls
import gradio.templates
from gradio.events import Changeable, Clearable, Submittable, Editable, Playable, Clickable, Blurrable, Uploadable
from gradio.events import EventListener
from ..guides import guides
DIR = os.path.dirname(__file__)
@ -10,20 +9,34 @@ TEMPLATE_FILE = os.path.join(DIR, "template.html")
DEMOS_DIR = os.path.join(GRADIO_DIR, "demo")
docs = generate_documentation()
def add_component_shortcuts():
for component in docs["component"]:
if not getattr(component["class"], "allow_string_shortcut", True):
continue
component["string_shortcuts"] = [
(component["class"].__name__, component["name"].lower(), "Uses default values")
(
component["class"].__name__,
component["name"].lower(),
"Uses default values",
)
]
for subcls in component["class"].__subclasses__():
if getattr(subcls, "is_template", False):
_, tags, _ = document_cls(subcls)
component["string_shortcuts"].append(
(subcls.__name__, subcls.__name__.lower(), "Uses " + tags.get("sets", "default values"))
(
subcls.__name__,
subcls.__name__.lower(),
"Uses " + tags.get("sets", "default values"),
)
)
add_component_shortcuts()
def add_demos():
for mode in docs:
for cls in docs[mode]:
@ -37,64 +50,76 @@ def add_demos():
demo_code = run_py.read()
cls["demos"].append((demo, demo_code))
add_demos()
def add_supported_events():
for component in docs["component"]:
component["events"] = []
if issubclass(component["class"], Changeable):
component["events"].append("change()")
if issubclass(component["class"], Clickable):
component["events"].append("click()")
if issubclass(component["class"], Clearable):
component["events"].append("clear()")
if issubclass(component["class"], Playable):
component["events"].append("play()")
component["events"].append("pause()")
component["events"].append("stop()")
if issubclass(component["class"], Editable):
component["events"].append("edit()")
if issubclass(component["class"], Submittable):
component["events"].append("submit()")
if issubclass(component["class"], Blurrable):
component["events"].append("blur()")
if issubclass(component["class"], Uploadable):
component["events"].append("upload()")
event_listener_props = dir(EventListener)
for listener in EventListener.__subclasses__():
if not issubclass(component["class"], listener):
continue
for prop in dir(listener):
if prop not in event_listener_props:
component["events"].append(prop + "()")
if component["events"]:
component["events"] = ", ".join(component["events"])
add_supported_events()
def add_guides():
for mode in docs:
for cls in docs[mode]:
if "guides" not in cls["tags"]:
continue
cls["guides"] = []
docstring_guides = [guide.strip() for guide in cls["tags"]["guides"].split(",")]
docstring_guides = [
guide.strip() for guide in cls["tags"]["guides"].split(",")
]
for docstring_guide in docstring_guides:
for guide in guides:
if docstring_guide == guide["name"]:
cls["guides"].append(guide)
add_guides()
def style_types():
for mode in docs:
for cls in docs[mode]:
for tag in ["preprocessing", "postprocessing", "examples-format", "examples-format", "events"]:
for tag in [
"preprocessing",
"postprocessing",
"examples-format",
"events",
]:
if tag not in cls["tags"]:
continue
cls["tags"][tag] = cls["tags"][tag].replace("{", "<span class='text-orange-500' style='font-family: monospace; font-size: large;' >").replace("}", "</span>")
cls["tags"][tag] = (
cls["tags"][tag]
.replace(
"{",
"<span class='text-orange-500' style='font-family: monospace; font-size: large;' >",
)
.replace("}", "</span>")
)
style_types()
def override_signature(name, signature):
for mode in docs:
for cls in docs[mode]:
if cls["name"] == name:
cls["override_signature"] = signature
override_signature("Blocks", "with gradio.Blocks():")
override_signature("Row", "with gradio.Row():")
override_signature("Column", "with gradio.Column():")
@ -111,10 +136,17 @@ def find_cls(target_cls):
return cls
raise ValueError("Class not found")
def build(output_dir, jinja_env, gradio_wheel_url, gradio_version):
os.makedirs(output_dir, exist_ok=True)
template = jinja_env.get_template("docs/template.html")
output = template.render(docs=docs, find_cls=find_cls, version="main", gradio_version=gradio_version, gradio_wheel_url=gradio_wheel_url)
output = template.render(
docs=docs,
find_cls=find_cls,
version="main",
gradio_version=gradio_version,
gradio_wheel_url=gradio_wheel_url,
)
output_folder = os.path.join(output_dir, "docs")
os.makedirs(output_folder)
output_main = os.path.join(output_folder, "main")
@ -128,9 +160,12 @@ def build(output_dir, jinja_env, gradio_wheel_url, gradio_version):
with open(version_docs_file, "w") as index_html:
index_html.write(output)
def build_pip_template(version, jinja_env):
docs_files = os.listdir("src/docs")
template = jinja_env.get_template("docs/template.html")
output = template.render(docs=docs, find_cls=find_cls, version="pip", gradio_version=version)
output = template.render(
docs=docs, find_cls=find_cls, version="pip", gradio_version=version
)
with open(f"src/docs/v{version}_template.html", "w+") as template_file:
template_file.write(output)

View File

@ -63,7 +63,7 @@
{% if obj['override_signature'] %}
<div class="codeblock"><pre><code class="lang-python">{{ obj['override_signature'] }}</code></pre></div>
{% else %}
<div class="codeblock"><pre><code class="lang-python">{{ parent }}.<span>{{ obj["name"] }}(</span><!--
<div class="codeblock"><pre><code class="lang-python">{{ parent }}{% if obj["name"] != "__call__" %}.<span>{{ obj["name"] }}{% endif %}(</span><!--
-->{% for param in obj["parameters"] %}<!--
-->{% if "kwargs" not in param and "default" not in param and param["name"] != "self" %}<!--
-->{{ param["name"] }}, <!--
@ -86,7 +86,7 @@
<p class="mb-2 text-lg text-gray-500"> <span class="text-orange-500">As input: </span> {{ obj["tags"]["preprocessing"] }}</p>
<p class="mb-2 text-lg text-gray-500"> <span class="text-orange-500">As output:</span> {{ obj["tags"]["postprocessing"] }}</p>
{% if "examples-format" in obj["tags"] %}
<p class="text-lg text-gray-500"> <span class="text-orange-500">Format expected for examples:</span> {{ obj["tags"]["examples-format"] }}</p>
<p class="mb-2 text-lg text-gray-500"> <span class="text-orange-500">Format expected for examples:</span> {{ obj["tags"]["examples-format"] }}</p>
{% endif %}
{% if obj["events"]|length > 0 %}
<p class="text-lg text-gray-500"><span class="text-orange-500">Supported events:</span> <em>{{ obj["events"] }}</em></p>
@ -100,7 +100,7 @@
</div>
{% if obj["demos"] %}
<button
class="h-2/4 rounded-full bg-orange-500 hover:bg-orange-400 transition-colors text-white py-1 px-3 my-2 mx-2"
class="rounded-full bg-orange-500 hover:bg-orange-400 transition-colors text-white py-1 px-3 my-2 mx-2"
onclick="this.closest('.obj').querySelector('.demo-window').classList.remove('hidden'); load_gradio('{{ gradio_js }}');"
>
More Examples →

View File

@ -13,9 +13,11 @@
<a href="/">
<img src="/assets/gradio.svg">
</a>
<div class="pl-2 pr-1 py-1 w-fit text-gray-600 font-semibold">
<select id="toggle" onchange="this.options[this.selectedIndex].value && (window.location = this.options[this.selectedIndex].value);" class="form-select appearance-none block px-3 text-base font-normal text-gray-700 bg-white bg-clip-padding bg-no-repeat border border-solid border-gray-300 rounded-full transition ease-in-out m-0 focus:text-gray-700 focus:bg-white focus:border-blue-600 focus:outline-none w-max" style="width: 135%; padding-top: 0.125rem; padding-bottom: 0.125rem">
<select id="toggle"
onchange="this.options[this.selectedIndex].value && (window.location = this.options[this.selectedIndex].value);"
class="form-select appearance-none block px-3 text-base font-normal text-gray-700 bg-white bg-clip-padding bg-no-repeat border border-solid border-gray-300 rounded-full transition ease-in-out m-0 focus:text-gray-700 focus:bg-white focus:border-blue-600 focus:outline-none w-max"
style="width: 135%; padding-top: 0.125rem; padding-bottom: 0.125rem">
<option value="/docs" {% if version=='pip' %}selected{% endif %}>
v{{ gradio_version }}
</option>
@ -24,10 +26,10 @@
</option>
</select>
</div>
</div>
<svg class="h-8 w-8 lg:hidden" viewBox="-10 -10 20 20" onclick="document.querySelector(&quot;nav&quot;).classList.toggle(&quot;hidden&quot;),document.querySelector(&quot;nav&quot;).classList.toggle(&quot;flex&quot;)">
<svg class="h-8 w-8 lg:hidden"
viewBox="-10 -10 20 20"
onclick="document.querySelector(&quot;nav&quot;).classList.toggle(&quot;hidden&quot;),document.querySelector(&quot;nav&quot;).classList.toggle(&quot;flex&quot;)">
<rect x="-7" y="-6" width="14" height="2"></rect>
<rect x="-7" y="-1" width="14" height="2"></rect>
<rect x="-7" y="4" width="14" height="2"></rect>
@ -39,27 +41,39 @@
</a>
<a class="thin-link flex items-center gap-3" href="/guides"><span>💡</span> <span>Guides</span></a>
<a class="thin-link flex items-center gap-3" href="/demos"><span>🎢</span> <span>Demos</span></a>
<div class="group relative flex cursor-pointer items-center gap-3" onclick="document.querySelector(&quot;.help-menu&quot;).classList.toggle(&quot;flex&quot;),document.querySelector(&quot;.help-menu&quot;).classList.toggle(&quot;hidden&quot;)">
<div class="group relative flex cursor-pointer items-center gap-3"
onclick="document.querySelector(&quot;.help-menu&quot;).classList.toggle(&quot;flex&quot;),document.querySelector(&quot;.help-menu&quot;).classList.toggle(&quot;hidden&quot;)">
<span>🖐</span> <span>Community</span>
<svg class="h-4 w-4" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20">
<svg class="h-4 w-4"
xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 20 20">
<path d="M9.293 12.95l.707.707L15.657 8l-1.414-1.414L10 10.828 5.757 6.586 4.343 8z"></path>
</svg>
<div class="help-menu absolute top-6 hidden w-52 flex-col bg-white shadow group-hover:flex sm:right-0">
<a class="thin-link inline-block px-4 py-2 hover:bg-gray-100" href="https://github.com/gradio-app/gradio/issues/new/choose" target="_blank">File an Issue</a>
<a class="thin-link inline-block px-4 py-2 hover:bg-gray-100" href="https://discuss.huggingface.co/c/gradio/26" target="_blank">Discuss</a>
<a class="thin-link inline-block px-4 py-2 hover:bg-gray-100" target="_blank" href="https://discord.gg/feTf9x3ZSB">Discord</a>
<a class="thin-link inline-block px-4 py-2 hover:bg-gray-100" target="_blank" href="https://gradio.curated.co/">Newsletter</a>
<a class="thin-link inline-block px-4 py-2 hover:bg-gray-100"
href="https://github.com/gradio-app/gradio/issues/new/choose"
target="_blank">File an Issue</a>
<a class="thin-link inline-block px-4 py-2 hover:bg-gray-100"
href="https://discuss.huggingface.co/c/gradio/26"
target="_blank">Discuss</a>
<a class="thin-link inline-block px-4 py-2 hover:bg-gray-100"
target="_blank"
href="https://discord.gg/feTf9x3ZSB">Discord</a>
<a class="thin-link inline-block px-4 py-2 hover:bg-gray-100"
target="_blank"
href="https://gradio.curated.co/">Newsletter</a>
</div>
</div>
<a class="thin-link flex items-center gap-3" href="https://github.com/gradio-app/gradio">
<a class="thin-link flex items-center gap-3"
href="https://github.com/gradio-app/gradio">
<img src="/assets/img/github-black.svg" class="w-6">
</a>
</nav>
</div>
<main class="container mx-auto px-4 flex gap-4">
</div>
<main class="container mx-auto px-4 flex gap-4">
<div class="navigation pb-4 h-screen leading-relaxed sticky top-0 text-md overflow-y-auto hidden lg:block rounded-t-xl bg-gradient-to-r from-white to-gray-50 overflow-x-clip"
style="width: 16.666667%; min-width: 16.666667%;">
style="width: 16.666667%;
min-width: 16.666667%;">
<div class="w-full sticky top-0 bg-gradient-to-r from-white to-gray-50 z-10">
<input id="search"
type="search"
@ -75,7 +89,8 @@
{% with obj=find_cls("Interface") %}
{% if "fns" in obj and obj["fns"]|length %}
{% for fn in obj["fns"] %}
<a class="thinner-link px-4 pl-8 block" href="#interface-{{ fn['name'].lower()}}">{{ fn["name"] }}</a>
<a class="thinner-link px-4 pl-8 block"
href="#interface-{{ fn['name'].lower()}}">{{ fn["name"] }}</a>
{% endfor %}
{% endif %}
{% endwith %}
@ -99,7 +114,8 @@
{% with obj=find_cls("Blocks") %}
{% if "fns" in obj and obj["fns"]|length %}
{% for fn in obj["fns"] %}
<a class="thinner-link px-4 pl-8 block" href="#blocks-{{ fn['name'].lower()}}">{{ fn["name"] }}</a>
<a class="thinner-link px-4 pl-8 block"
href="#blocks-{{ fn['name'].lower()}}">{{ fn["name"] }}</a>
{% endfor %}
{% endif %}
{% endwith %}
@ -116,8 +132,8 @@
{% for component in docs["component"] %}
<a class="px-4 block thin-link" href="#{{ component['name'].lower() }}">{{ component['name'] }}</a>
{% endfor %}
<a class="link px-4 my-2 block" href="#component-helpers">Component Helpers
{% for component in docs["component-helpers"] %}
<a class="link px-4 my-2 block" href="#helpers">Helpers
{% for component in docs["helpers"] %}
<a class="px-4 block thin-link" href="#{{ component['name'].lower() }}">{{ component['name'] }}</a>
{% endfor %}
<a class="link px-4 my-2 block" href="#routes">Routes
@ -146,13 +162,18 @@
{% endif %}
<section id="building_demos" class="pt-2 flex flex-col gap-10 mb-8">
<section class="pt-2">
<h2 class="text-4xl font-light mb-2 pt-2 text-orange-500" id="building-demos">Building Demos</h2>
<h2 class="text-4xl font-light mb-2 pt-2 text-orange-500"
id="building-demos">
Building Demos
</h2>
{% with obj=find_cls("Interface"), is_class=True, parent="gradio" %}
{% include "docs/obj_doc_template.html" %}
{% endwith %}
</section>
<section id="flagging" class="pt-2">
<h3 class="text-4xl font-light my-4" id="flagging">Flagging</h3>
<h3 class="text-4xl font-light my-4" id="flagging">
Flagging
</h3>
<p class="mt-8 mb-12 text-lg">
A Gradio Interface includes a "Flag" button that appears
underneath the output. By default, clicking on the Flag button sends the input and output
@ -163,7 +184,6 @@
that are listed below, or you can create your own, which lets you do whatever
you want with the data that is being flagged.
</p>
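A minimal sketch of wiring up one of these callbacks, assuming a toy `greet` function and the built-in `CSVLogger`:

```python
import gradio as gr


def greet(name):
    return f"Hello, {name}!"


demo = gr.Interface(
    fn=greet,
    inputs=gr.Text(),
    outputs=gr.Text(),
    allow_flagging="manual",           # show the Flag button, only log when clicked
    flagging_callback=gr.CSVLogger(),  # write flagged input/output pairs to a CSV log
)
demo.launch()
```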
<div class="flex flex-col gap-10">
{% for flagging_callback in docs["flagging"] %}
{% with obj=flagging_callback, is_class=True, parent="gradio" %}
@ -172,16 +192,16 @@
{% endfor %}
</div>
</section>
<section id="combining-interfaces" class="pt-2">
<h3 class="text-4xl font-light my-4">Combining Interfaces</h3>
<h3 class="text-4xl font-light my-4">
Combining Interfaces
</h3>
<p class="mt-8 mb-12 text-lg">
Once you have created several Interfaces, Gradio provides several classes for combining them.
For example, you can chain them in <em>Series</em>, or compare their outputs in <em>Parallel</em> if their inputs and outputs match up.
You can also display arbitrary Interfaces together in a tabbed layout using <em>TabbedInterface</em>, as sketched below.
</p>
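A minimal sketch of the tabbed layout, assuming two toy interfaces:

```python
import gradio as gr

upper = gr.Interface(lambda s: s.upper(), gr.Text(), gr.Text())
reverse = gr.Interface(lambda s: s[::-1], gr.Text(), gr.Text())

# Show both interfaces in one app, one per tab.
demo = gr.TabbedInterface([upper, reverse], ["Uppercase", "Reverse"])
demo.launch()
```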
<div class="flex flex-col gap-10">
{% with obj=find_cls("TabbedInterface"), parent="gradio" %}
{% include "docs/obj_doc_template.html" %}
@ -202,8 +222,12 @@
</div>
</section>
<section id="block-layouts" class="pt-2 mb-8">
<h3 class="text-3xl font-light my-4">Block Layouts</h3>
<p class="mb-12">Customize the layout of your Blocks UI with the layout classes below.</p>
<h3 class="text-3xl font-light my-4">
Block Layouts
</h3>
<p class="mb-12">
Customize the layout of your Blocks UI with the layout classes below.
</p>
<div class="flex flex-col gap-10">
{% for layout in docs["layout"] %}
{% with obj=layout, is_class=True, parent="gradio" %}
@ -213,10 +237,12 @@
</div>
</section>
</section>
<section id="components" class="pt-2 flex flex-col gap-10">
<section id="components" class="pt-2 flex flex-col gap-10 mb-8">
<div>
<h2 id="components-header"
class="text-4xl font-light mb-2 pt-2 text-orange-500">Components</h2>
class="text-4xl font-light mb-2 pt-2 text-orange-500">
Components
</h2>
<p class="mt-8 text-lg">
Gradio includes pre-built components that can be used as
inputs or outputs in your Interface or Blocks with a single line of code. Components
@ -237,27 +263,29 @@
{% endwith %}
{% endfor %}
</section>
<section id="component-helpers" class="pt-2 flex flex-col gap-10">
<section id="helpers" class="pt-2 flex flex-col gap-10 mb-8">
<div>
<h2 id="component-helpers-header"
class="text-4xl font-light mb-2 pt-2 text-orange-500">Components Helpers</h2>
<h2 id="helpers-header"
class="text-4xl font-light mb-2 pt-2 text-orange-500">
Helpers
</h2>
<p class="mt-8 text-lg">
Gradio includes helper classes that abstract over existing components. The goal of these classes is to help you
add common functionality to your app without having to repeatedly create the same components and event listeners.
Gradio includes helper classes and methods that interact with existing components. The goal of these classes and methods is to help you
add common functionality to your app without having to rewrite the same boilerplate each time.
</p>
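A minimal sketch using one such helper, `gr.update()`, assuming a toy visibility toggle:

```python
import gradio as gr


def toggle(visible):
    # gr.update() returns a dict of property updates for an existing component.
    return gr.update(visible=not visible), not visible


with gr.Blocks() as demo:
    shown = gr.State(True)
    box = gr.Textbox("Now you see me")
    btn = gr.Button("Toggle visibility")
    btn.click(toggle, shown, [box, shown])

demo.launch()
```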
</div>
<section class="pt-2">
{% for component in docs["component-helpers"] %}
{% for component in docs["helpers"] %}
{% with obj=component, is_class=True, parent="gradio" %}
{% include "docs/obj_doc_template.html" %}
{% endwith %}
{% endfor %}
</section>
</section>
<section id="routes" class="pt-2 flex flex-col gap-10">
<section id="routes" class="pt-2 flex flex-col gap-10 mb-8">
<div>
<h2 id="routes-header"
class="text-4xl font-light mb-2 pt-2 text-orange-500">Routes</h2>
class="text-4xl font-light mb-2 pt-2 text-orange-500">
Routes
</h2>
<p class="mt-8 text-lg">
Gradio includes some helper functions for exposing and interacting with the FastAPI app
used to run your demo.
@ -271,7 +299,6 @@
</section>
</div>
</main>
<script>
let isMobile = window.matchMedia("only screen and (max-width: 480px)").matches;
@ -296,18 +323,14 @@
{% endif %}
</script>
<script src="/assets/prism.js"></script>
<script>
window.__gradio_mode__ = "website";
</script>
{% include 'templates/footer.html' %}
<script>{% include 'templates/guide-color.js' %}</script>
<script>{% include 'templates/add_anchors.js' %}</script>
<script>{% include 'templates/add_copy.js' %}</script>
<script>
const show_demo = (component, demo) => {
document.querySelectorAll(`#${component} .demo-btn.selected-demo`).forEach(n => n.classList.remove('selected-demo'));
@ -406,6 +429,5 @@
})
</script>
</body>
</html>