
Quickstart

This quickstart demonstrates how to build a text summarization application with a Transformer model from the Hugging Face Model Hub.
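
Under the hood, the project defines a BentoML Service that wraps a Hugging Face summarization pipeline and exposes it as an API. The sketch below shows the general shape of such a Service; the file name, class name, and default model are illustrative assumptions rather than the exact code in this repository.

    # service.py: minimal sketch of a summarization Service.
    # Names and the default model are assumptions, not necessarily the exact code in this repo.
    import bentoml
    from transformers import pipeline

    @bentoml.service
    class Summarization:
        def __init__(self) -> None:
            # Load a summarization model from the Hugging Face Model Hub (downloads on first run).
            self.pipeline = pipeline("summarization")

        @bentoml.api
        def summarize(self, text: str) -> str:
            # Run the pipeline and return only the generated summary string.
            result = self.pipeline(text)
            return result[0]["summary_text"]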

Prerequisites

Python 3.8+ and pip installed. See the Python downloads page for installation instructions.

Get started

Perform the following steps to run this project and deploy it to BentoCloud.

  1. Clone the repository:

    git clone https://github.com/bentoml/quickstart.git
    cd quickstart
  2. Install the required dependencies:

    pip install -r requirements.txt
  3. Serve your model as an HTTP server. This starts a local server at http://localhost:3000, making your model accessible as a web service (see the example request after these steps).

    bentoml serve .
  4. Once your Service is ready, you can deploy it to BentoCloud. Make sure you are logged in to BentoCloud, then run the following command:

    bentoml deploy .
    

    Note: Alternatively, you can manually build a Bento, containerize it with Docker, and deploy it in any Docker-compatible environment.
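
Once the server from step 3 is running, you can send it a request from Python. The snippet below uses BentoML's Python client; the summarize endpoint name follows the Service sketch above and is an assumption about this repository's code.

    # client.py: example request against the local server started by `bentoml serve .`.
    # The `summarize` endpoint name matches the Service sketch above and is an assumption.
    import bentoml

    with bentoml.SyncHTTPClient("http://localhost:3000") as client:
        summary = client.summarize(
            text="BentoML is a Python library for building online serving systems "
                 "optimized for AI applications and model inference."
        )
        print(summary)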

For more information, see Quickstart in the BentoML documentation.
