Dev Requirements

- create test/requirements.in
- create scripts folder, move and create scripts under it
- update CONTRIBUTING.md
This commit is contained in:
Ömer Faruk Özdemir 2022-01-24 17:33:01 +03:00
parent 0c673aaa1b
commit 1d09c9970a
8 changed files with 472 additions and 15 deletions

CONTRIBUTING.md

@@ -1,41 +1,49 @@
# Contributing to Gradio

Prerequisites:

* Python 3.7+
* Node 16.0+ (optional for backend-only changes, but needed for any frontend changes)

More than 30 awesome developers have contributed to the `gradio` library, and we'd be thrilled if you would like to be the next `gradio` contributor! You can start by forking or cloning the repo (https://github.com/gradio-app/gradio.git) and creating your own branch to work from.
Next, to install the local development version of Gradio:
* Navigate to the `/gradio` subfolder and run `pip install -e .`.
* Navigate to the repo folder and run
```bash
bash scripts/install_test_requirements.sh
```
When installing locally, you may also need to build the front end:
* Navigate to the `/frontend` subfolder and run `npm install`.
* Then run `npm run build` (or `npm run build:win` on Windows).
* Then you can run `npm run start` to start a local development server (on port 3000 by default) that responds to any changes in the frontend
### Structure of the Repository
It's helpful to know the overall structure of the repository so that you can focus on the part of the source code you'd like to contribute to
* `/gradio`: contains the Python source code for the library
* `/gradio/interface.py`: contains the Python source code for the core `Interface` class (**start HERE!**)
* `/frontend`: contains the HTML/JS/CSS source code for the library
* `/test`: contains Python unit tests for the library
* `/demo`: contains demos that are used in the documentation as well as for integration tests
* `/website`: contains the code for the Gradio website (www.gradio.app). See the README in the `/website` folder for more details
### Continuous Integration and Testing
All PRs must pass the continuous integration tests before merging. To test locally, you can run `python -m unittest` from `/` (the directory where you cloned this repo).
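With no arguments, `python -m unittest` discovers `test_*` methods on `unittest.TestCase` subclasses under the current directory, which is how the suite in `/test` gets picked up. A minimal sketch of such a test (a hypothetical example, not taken from the actual suite):

```python
import unittest

class TestExample(unittest.TestCase):
    # Hypothetical test case; the real tests live under /test.
    def test_addition(self):
        self.assertEqual(1 + 1, 2)

# `python -m unittest` discovers and runs cases like this automatically;
# here the suite is run programmatically to the same effect.
suite = unittest.TestLoader().loadTestsFromTestCase(TestExample)
result = unittest.TextTestRunner().run(suite)
```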
## Submitting PRs
All PRs should be against `master`. Direct commits to master are blocked, and PRs require an approving review to merge into master. By convention, the Gradio maintainers will review PRs when:

* An initial review has been requested, and
* A maintainer (@abidlabs, @aliabid94, @aliabd, @AK391, or @dawoodkhan82) is tagged in the PR comments and asked to complete a review

We ask that you make sure initial CI checks are passing before requesting a review. One of the Gradio maintainers will merge the PR when all the checks are passing.
*Could these guidelines be clearer? Feel free to open a PR to help us facilitate open-source contributions!*

@@ -0,0 +1,4 @@
# Create test requirements under test/requirements.txt using requirements.in
cd test
pip install --upgrade pip-tools
pip-compile
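`pip-compile` resolves the loose specifiers in `requirements.in` into the fully pinned `test/requirements.txt` shown below, where every package carries an exact version and a `# via` comment naming what required it. A small sketch of how to read that format (a hypothetical two-package excerpt, not the generated file itself):

```python
# Sketch of the pip-compile output format: every dependency gets an
# exact pin, followed by a "# via" comment naming its requirer.
# (Hypothetical excerpt for illustration.)
pinned = """\
shap==0.40.0
    # via -r requirements.in
slicer==0.0.7
    # via shap
"""

lines = pinned.splitlines()
# Direct requirements are marked "via -r requirements.in";
# everything else was pulled in transitively by another package.
direct = [
    line.split("==")[0]
    for line, via in zip(lines, lines[1:])
    if "==" in line and "-r requirements.in" in via
]
print(direct)  # only shap was requested directly
```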

@@ -0,0 +1 @@
pip install -e .

@@ -0,0 +1,2 @@
pip install -r test/requirements.txt
pip install -e .

scripts/run_tests.sh

@@ -0,0 +1 @@
python -m unittest

test/requirements.in

@@ -0,0 +1,13 @@
comet_ml
mlflow
scikit-image
selenium==4.0.0a6.post2
setuptools
transformers
wandb
webunit
tensorflow
shap
IPython
coverage
python_org

test/requirements.txt

@@ -0,0 +1,428 @@
#
# This file is autogenerated by pip-compile with python 3.9
# To update, run:
#
# pip-compile
#
absl-py==1.0.0
# via
# tensorboard
# tensorflow
alembic==1.7.5
# via mlflow
appnope==0.1.2
# via ipython
asttokens==2.0.5
# via stack-data
astunparse==1.6.3
# via tensorflow
attrs==21.4.0
# via jsonschema
backcall==0.2.0
# via ipython
black==21.12b0
# via ipython
cachetools==5.0.0
# via google-auth
certifi==2021.10.8
# via
# dulwich
# requests
# sentry-sdk
# urllib3
cffi==1.15.0
# via cryptography
charset-normalizer==2.0.10
# via requests
click==8.0.3
# via
# black
# databricks-cli
# flask
# mlflow
# sacremoses
# wandb
cloudpickle==2.0.0
# via
# mlflow
# shap
comet-ml==3.24.2
# via -r requirements.in
configobj==5.0.6
# via everett
configparser==5.2.0
# via wandb
coverage==6.2
# via -r requirements.in
cryptography==36.0.1
# via
# pyopenssl
# urllib3
databricks-cli==0.16.2
# via mlflow
decorator==5.1.1
# via ipython
docker==5.0.3
# via mlflow
docker-pycreds==0.4.0
# via wandb
dulwich==0.20.30
# via comet-ml
entrypoints==0.3
# via mlflow
everett[ini]==3.0.0
# via comet-ml
executing==0.8.2
# via stack-data
filelock==3.4.2
# via
# huggingface-hub
# transformers
flask==2.0.2
# via
# mlflow
# prometheus-flask-exporter
flatbuffers==2.0
# via tensorflow
gast==0.4.0
# via tensorflow
gitdb==4.0.9
# via gitpython
gitpython==3.1.26
# via
# mlflow
# wandb
google-auth==2.4.0
# via
# google-auth-oauthlib
# tensorboard
google-auth-oauthlib==0.4.6
# via tensorboard
google-pasta==0.2.0
# via tensorflow
greenlet==1.1.2
# via sqlalchemy
grpcio==1.43.0
# via
# tensorboard
# tensorflow
gunicorn==20.1.0
# via mlflow
h5py==3.6.0
# via tensorflow
huggingface-hub==0.4.0
# via transformers
idna==3.3
# via
# requests
# urllib3
imageio==2.14.1
# via scikit-image
importlib-metadata==4.10.1
# via
# markdown
# mlflow
ipython==8.0.1
# via -r requirements.in
itsdangerous==2.0.1
# via flask
jedi==0.18.1
# via ipython
jinja2==3.0.3
# via flask
joblib==1.1.0
# via
# sacremoses
# scikit-learn
jsonschema==4.4.0
# via comet-ml
keras==2.7.0
# via tensorflow
keras-preprocessing==1.1.2
# via tensorflow
libclang==12.0.0
# via tensorflow
llvmlite==0.38.0
# via numba
mako==1.1.6
# via alembic
markdown==3.3.6
# via tensorboard
markupsafe==2.0.1
# via
# jinja2
# mako
matplotlib-inline==0.1.3
# via ipython
mlflow==1.23.0
# via -r requirements.in
mypy-extensions==0.4.3
# via black
networkx==2.6.3
# via scikit-image
numba==0.55.0
# via shap
numpy==1.21.5
# via
# h5py
# imageio
# keras-preprocessing
# mlflow
# numba
# opt-einsum
# pandas
# pywavelets
# scikit-image
# scikit-learn
# scipy
# shap
# tensorboard
# tensorflow
# tifffile
# transformers
nvidia-ml-py3==7.352.0
# via comet-ml
oauthlib==3.1.1
# via requests-oauthlib
opt-einsum==3.3.0
# via tensorflow
packaging==21.3
# via
# huggingface-hub
# mlflow
# scikit-image
# shap
# transformers
pandas==1.3.5
# via
# mlflow
# shap
parso==0.8.3
# via jedi
pathspec==0.9.0
# via black
pathtools==0.1.2
# via wandb
pexpect==4.8.0
# via ipython
pickleshare==0.7.5
# via ipython
pillow==9.0.0
# via
# imageio
# scikit-image
platformdirs==2.4.1
# via black
prometheus-client==0.12.0
# via prometheus-flask-exporter
prometheus-flask-exporter==0.18.7
# via mlflow
promise==2.3
# via wandb
prompt-toolkit==3.0.24
# via ipython
protobuf==3.19.3
# via
# mlflow
# tensorboard
# tensorflow
# wandb
psutil==5.9.0
# via wandb
ptyprocess==0.7.0
# via pexpect
pure-eval==0.2.2
# via stack-data
pyasn1==0.4.8
# via
# pyasn1-modules
# rsa
pyasn1-modules==0.2.8
# via google-auth
pycparser==2.21
# via cffi
pygments==2.11.2
# via ipython
pyopenssl==21.0.0
# via urllib3
pyparsing==3.0.7
# via packaging
pyrsistent==0.18.1
# via jsonschema
python-dateutil==2.8.2
# via
# pandas
# wandb
pytz==2021.3
# via
# mlflow
# pandas
pywavelets==1.2.0
# via scikit-image
pyyaml==6.0
# via
# huggingface-hub
# mlflow
# transformers
# wandb
querystring-parser==1.2.4
# via mlflow
regex==2022.1.18
# via
# sacremoses
# transformers
requests==2.27.1
# via
# comet-ml
# databricks-cli
# docker
# huggingface-hub
# mlflow
# requests-oauthlib
# requests-toolbelt
# tensorboard
# transformers
# wandb
requests-oauthlib==1.3.0
# via google-auth-oauthlib
requests-toolbelt==0.9.1
# via comet-ml
rsa==4.8
# via google-auth
sacremoses==0.0.47
# via transformers
scikit-image==0.19.1
# via -r requirements.in
scikit-learn==1.0.2
# via shap
scipy==1.7.3
# via
# mlflow
# scikit-image
# scikit-learn
# shap
selenium==4.0.0a6.post2
# via -r requirements.in
semantic-version==2.8.5
# via comet-ml
sentry-sdk==1.5.3
# via wandb
shap==0.40.0
# via -r requirements.in
shortuuid==1.0.8
# via wandb
six==1.16.0
# via
# absl-py
# asttokens
# astunparse
# comet-ml
# configobj
# databricks-cli
# docker-pycreds
# google-auth
# google-pasta
# grpcio
# keras-preprocessing
# promise
# pyopenssl
# python-dateutil
# querystring-parser
# sacremoses
# tensorflow
# wandb
slicer==0.0.7
# via shap
smmap==5.0.0
# via gitdb
sqlalchemy==1.4.31
# via
# alembic
# mlflow
sqlparse==0.4.2
# via mlflow
stack-data==0.1.4
# via ipython
subprocess32==3.5.4
# via wandb
tabulate==0.8.9
# via databricks-cli
tensorboard==2.8.0
# via tensorflow
tensorboard-data-server==0.6.1
# via tensorboard
tensorboard-plugin-wit==1.8.1
# via tensorboard
tensorflow==2.7.0
# via -r requirements.in
tensorflow-estimator==2.7.0
# via tensorflow
tensorflow-io-gcs-filesystem==0.23.1
# via tensorflow
termcolor==1.1.0
# via
# tensorflow
# yaspin
threadpoolctl==3.0.0
# via scikit-learn
tifffile==2021.11.2
# via scikit-image
tokenizers==0.10.3
# via transformers
tomli==1.2.3
# via black
tqdm==4.62.3
# via
# huggingface-hub
# sacremoses
# shap
# transformers
traitlets==5.1.1
# via
# ipython
# matplotlib-inline
transformers==4.15.0
# via -r requirements.in
typing-extensions==3.10.0.2
# via
# black
# huggingface-hub
# tensorflow
urllib3[secure]==1.26.8
# via
# dulwich
# requests
# selenium
# sentry-sdk
wandb==0.12.9
# via -r requirements.in
wcwidth==0.2.5
# via prompt-toolkit
websocket-client==1.2.3
# via
# comet-ml
# docker
webunit==1.3.10
# via -r requirements.in
werkzeug==2.0.2
# via
# flask
# tensorboard
wheel==0.37.1
# via
# astunparse
# tensorboard
# tensorflow
wrapt==1.13.3
# via
# comet-ml
# tensorflow
wurlitzer==3.0.2
# via comet-ml
yaspin==2.1.0
# via wandb
zipp==3.7.0
# via importlib-metadata
# The following packages are considered to be unsafe in a requirements file:
# setuptools