About this page

This is an API reference for using fastai in BentoML. Please refer to the fastai guide for more information about how to use fastai in BentoML.

bentoml.fastai.save_model(name: Tag | str, learner_: learner.Learner, *, signatures: ModelSignaturesType | None = None, labels: dict[str, str] | None = None, custom_objects: dict[str, t.Any] | None = None, external_modules: t.List[ModuleType] | None = None, metadata: dict[str, t.Any] | None = None) → bentoml.Model

Save a fastai.learner.Learner model instance to the BentoML model store.

If save_model() fails while saving a given learner, the learner may contain a Callback that is not picklable. All fastai callbacks are stateful, which makes some of them unpicklable. Use Learner.remove_cbs() to remove unpicklable callbacks before saving.
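One way to find the offending callbacks is to try pickling each one individually. The sketch below is illustrative only: `unpicklable_cbs` is a hypothetical helper, not part of BentoML or fastai, and it assumes the learner exposes its callbacks via the `cbs` attribute (as fastai's Learner does).

```python
import pickle


def unpicklable_cbs(learner):
    """Hypothetical helper: return the callbacks on ``learner``
    that cannot be pickled and therefore block save_model()."""
    bad = []
    for cb in getattr(learner, "cbs", []):
        try:
            pickle.dumps(cb)
        except (pickle.PicklingError, TypeError, AttributeError):
            bad.append(cb)
    return bad


# Usage with a real fastai Learner (assumes fastai is installed):
# learner.remove_cbs(unpicklable_cbs(learner))
# bentoml.fastai.save_model("fai_learner", learner)
```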

Parameters:

  • name – The name to give to the model in the BentoML store. This must be a valid Tag name.

  • learner_ – The Learner instance to be saved.

  • signatures – Signatures of predict methods to be used. If not provided, the signatures default to predict. See ModelSignature for more details.

  • labels – A default set of management labels to be associated with the model. An example is {"training-set": "data-1"}.

  • custom_objects – Custom objects to be saved with the model. An example is {"my-normalizer": normalizer}. Custom objects are currently serialized with cloudpickle, but this implementation is subject to change.

  • external_modules – User-defined additional Python modules to be saved alongside the model or custom objects, e.g. a tokenizer module, preprocessor module, or model configuration module.

  • metadata – Metadata to be associated with the model. An example is {"bias": 4}. Metadata is intended for display in a model management UI and therefore must be a default Python type, such as str or int.
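Putting the optional arguments together, a sketch with illustrative values only (the keys mirror the parameters documented above; the learner and the actual store call are omitted):

```python
# Illustrative keyword arguments for save_model(); the values are made up.
save_kwargs = {
    "signatures": {"predict": {"batchable": False}},  # "predict" is the default signature
    "labels": {"training-set": "data-1"},
    "metadata": {"bias": 4},                          # must hold default Python types
}

# Metadata values must be default Python types such as str or int:
assert all(
    isinstance(v, (str, int, float, bool))
    for v in save_kwargs["metadata"].values()
)

# tag = bentoml.fastai.save_model("fai_learner", learner, **save_kwargs)
```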


Returns:

A tag that can be used to access the saved model from the BentoML model store.

Return type:

Model

import bentoml

from fastai.metrics import accuracy
from fastai.data.external import URLs, untar_data
from fastai.text.data import TextDataLoaders
from fastai.text.models import AWD_LSTM
from fastai.text.learner import text_classifier_learner

dls = TextDataLoaders.from_folder(untar_data(URLs.IMDB), valid="test")

learner = text_classifier_learner(dls, AWD_LSTM, drop_mult=0.5, metrics=accuracy)
learner.fine_tune(4, 1e-2)

# Test run the model
learner.predict("I love that movie!")

# Save the model with BentoML
tag = bentoml.fastai.save_model("fai_learner", learner)

bentoml.fastai.load_model(bento_model: str | Tag | bentoml.Model) → learner.Learner

Load the fastai.learner.Learner model instance with the given tag from the local BentoML model store.

If the model uses mixed_precision, the loaded model will be converted to FP32. Learn more about mixed precision in the fastai documentation.


Parameters:

bento_model – Either the tag of the model to get from the store, or a BentoML Model instance to load the model from.


Returns:

The fastai.learner.Learner model instance loaded from the model store or BentoML Model.

Return type:

Learner

import bentoml

model = bentoml.fastai.load_model("fai_learner")
results = model.predict("some input")

bentoml.fastai.get(tag_like: str | Tag) → bentoml.Model

Get the BentoML model with the given tag.


Parameters:

tag_like – The tag of the model to retrieve from the model store.


Returns:

A BentoML Model with the matching tag.

Return type:

Model

import bentoml
# target model must be from the BentoML model store
model = bentoml.fastai.get("fai_learner")
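tag_like accepts either a full "name:version" tag or just the model name, in which case the latest saved version is resolved. A minimal sketch of that naming convention (not BentoML's actual tag parser):

```python
def split_tag(tag_like: str):
    # Sketch of the "name:version" tag convention; a bare name
    # resolves to the latest saved version of the model.
    name, _, version = tag_like.partition(":")
    return name, version or "latest"


print(split_tag("fai_learner"))     # ('fai_learner', 'latest')
print(split_tag("fai_learner:v1"))  # ('fai_learner', 'v1')
```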