@AK391
Last active April 25, 2022 16:18
# blog: https://huggingface.co/blog/gradio-spaces
# It's so easy to demonstrate a Machine Learning project thanks to Gradio.
# In this blog post, we'll walk you through:
# the recent Gradio integration that lets you demo models from the Hub seamlessly, with just a few lines of code, by leveraging the Inference API.
# how to use Hugging Face Spaces to host demos of your own models.
# Hugging Face Hub Integration in Gradio
# You can demonstrate your models on the Hub easily. You only need to define an Interface that includes:
# The repository ID of the model you want to infer with
# A description and title
# Example inputs to guide your audience
# After defining your Interface, just call .launch() and your demo will start running. You can do this in Colab, but if you want to share it with the community a great option is to use Spaces!
# Spaces are a simple, free way to host your ML demo apps in Python. To do so, you can create a repository at https://huggingface.co/new-space and select Gradio as the SDK. Once done, you can create a file called app.py, copy the code below, and your app will be up and running in a few seconds!
# pip install gradio before running the code below
# Example for mGPT: Few-Shot Learners Go Multilingual (https://huggingface.co/sberbank-ai/mGPT): load and launch a Gradio demo in a few lines of code
import gradio as gr
gr.Interface.load("huggingface/sberbank-ai/mGPT").launch()