In this gist we are going to containerize a simple Bento Service with its models pre-packaged into the image.
- You have already created a Bento Service locally and have tested it. Refer to this Gist if you need help getting started with creating Bento Services.
- You have Docker installed locally.
From the root of the directory containing your Bentofile (`bentofile.yaml`) and the Bento Service (in most cases `service.py`), run the following commands:
Refer to the official docs for more information.
```shell
cd prompt-enhancer && cat bentofile.yaml
```
As you can see, the Bento definition has the model listed, so when the image is built the container will have the models packaged within it.
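As an illustration, a minimal `bentofile.yaml` that packages a model might look like the sketch below. The service entry point and model tag here are hypothetical placeholders, not taken from this gist:

```yaml
# Hypothetical example - adjust names to your own service and model store
service: "service:svc"              # entry point: <module>:<service object>
include:
  - "service.py"                    # source files to bundle into the Bento
models:
  - "prompt_enhancer_model:latest"  # model tag from the local BentoML model store (placeholder)
```

The `models` list is what causes the referenced models to be copied into the Bento at build time, so the resulting container is self-contained.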
```shell
bentoml build
bentoml containerize prompt_enhancer:<generated-image-tag>
```
This builds an OCI-compliant image with very few vulnerabilities. You don't even need to worry about writing a `Dockerfile` or about image generation and versioning; BentoML takes care of it using its Model Management API.
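To sanity-check the image locally before pushing it anywhere, you can run it with Docker; BentoML services listen on port 3000 by default. The image tag below is a placeholder for whatever `bentoml containerize` reported:

```shell
# Run the containerized Bento locally and expose the default BentoML port
docker run --rm -p 3000:3000 prompt_enhancer:<generated-image-tag>
```

Once it is up, the service's HTTP endpoints are reachable at `http://localhost:3000`.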
Once the above commands complete, run `docker image ls` to see the newly generated image. Go ahead and tag it as necessary and push it to your private registry.
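For example, tagging and pushing to a private registry might look like the following; the registry host, repository path, and version tag are all placeholders you should replace with your own:

```shell
# Re-tag the local image for your registry (all names below are placeholders)
docker tag prompt_enhancer:<generated-image-tag> registry.example.com/my-team/prompt_enhancer:v1

# Push the re-tagged image to the private registry
docker push registry.example.com/my-team/prompt_enhancer:v1
```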