
@abasu0713
Last active July 21, 2024 21:59
Containerize BentoML applications for deployment in K8s

In this gist we are going to containerize a simple Bento Service with pre-packaged models.

Prerequisites

  1. You have already created a Bento Service locally and tested it. Refer to this Gist if you need to get started on creating Bento Services.
  2. You have Docker installed locally.

Step 1: Containerize BentoML service

From the root of the directory containing your Bento file (bentofile.yaml) and the Bento service (in most cases service.py), run the following commands. Refer to the official docs for more information.

cd prompt-enhancer && cat bentofile.yaml
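For reference, a bentofile.yaml for a service like this might look roughly as follows. This is a sketch, not the gist author's actual file: the service import path, labels, package list, and model tag are all placeholders you would replace with your own values.

```yaml
# bentofile.yaml (illustrative placeholder values)
service: "service:PromptEnhancer"   # module:class of your Bento Service
labels:
  owner: ml-team
include:
  - "*.py"                          # source files to bundle into the Bento
python:
  packages:
    - torch
    - transformers
models:
  - "prompt_enhancer_model:latest"  # models listed here get packaged into the image
```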

As you can see, the Bento definition has the model defined under models, so when the image is generated the container will have the models packaged within it.

bentoml build
bentoml containerize prompt_enhancer:<generated-image-tag>
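The full build-and-push flow can be sketched as a small shell script. The registry URL and image tag below are placeholders, and the script defaults to a dry run that only prints each command, so nothing is actually built or pushed until you set DRY_RUN=0:

```shell
#!/usr/bin/env sh
# Placeholder values -- substitute your own Bento tag and registry.
BENTO_TAG="prompt_enhancer:latest"
REGISTRY="registry.example.com/my-team"

# Print each command; execute it only when DRY_RUN=0.
run() {
    echo "+ $*"
    if [ "${DRY_RUN:-1}" = "0" ]; then
        "$@"
    fi
}

run bentoml build                                  # build the Bento from bentofile.yaml
run bentoml containerize "$BENTO_TAG"              # produce the OCI image
run docker tag "$BENTO_TAG" "$REGISTRY/$BENTO_TAG" # retag for your private registry
run docker push "$REGISTRY/$BENTO_TAG"             # push to the registry
```

Running it as-is prints the four commands so you can review them before flipping DRY_RUN off.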

This will build an OCI-compliant image with very few vulnerabilities. You don't need to worry about a Dockerfile, image generation, or versioning; BentoML takes care of it through its Model Management API. Once the above commands complete, run docker image ls to see the newly generated image. Go ahead and tag it as necessary and push it to your private registry.
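With the image pushed to a registry, a minimal Kubernetes Deployment for the service might look like the sketch below. The image reference is a placeholder for whatever you tagged and pushed, and port 3000 is BentoML's default HTTP port:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: prompt-enhancer
spec:
  replicas: 1
  selector:
    matchLabels:
      app: prompt-enhancer
  template:
    metadata:
      labels:
        app: prompt-enhancer
    spec:
      containers:
        - name: prompt-enhancer
          # placeholder: the image you tagged and pushed above
          image: registry.example.com/my-team/prompt_enhancer:latest
          ports:
            - containerPort: 3000   # BentoML's default HTTP port
```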
