fast.ai
This is an API reference for using fastai in BentoML. Please refer to the fastai guide for more information about how to use fastai in BentoML.
- bentoml.fastai.save_model(name: Tag | str, learner: learner.Learner, *, signatures: ModelSignaturesType | None = None, labels: dict[str, str] | None = None, custom_objects: dict[str, t.Any] | None = None, external_modules: t.List[ModuleType] | None = None, metadata: dict[str, t.Any] | None = None) → bentoml.Model
Save a fastai.learner.Learner model instance to the BentoML model store.

If save_model() fails while saving a given learner, the learner may contain a Callback that is not picklable. All fastai callbacks are stateful, which makes some of them unpicklable. Use Learner.remove_cbs() to remove unpicklable callbacks before saving.

- Parameters:
  - name – The name to give to the model in the BentoML store. This must be a valid Tag name.
  - learner – The Learner to be saved.
  - signatures – Signatures of predict methods to be used. If not provided, the signatures default to predict. See ModelSignature for more details.
  - labels – A default set of management labels to be associated with the model. An example is {"training-set": "data-1"}.
  - custom_objects – Custom objects to be saved with the model. An example is {"my-normalizer": normalizer}. Custom objects are currently serialized with cloudpickle, but this implementation is subject to change.
  - external_modules (List[ModuleType], optional, defaults to None) – User-defined additional Python modules to be saved alongside the model or custom objects, e.g. a tokenizer module, preprocessor module, or model configuration module.
  - metadata – Metadata to be associated with the model. An example is {"bias": 4}. Metadata is intended for display in a model management UI and therefore must be a default Python type, such as str or int.
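For illustration, a signatures mapping can be written out as a plain dict. This is a hedged sketch: the option keys shown (batchable, batch_dim) follow BentoML's ModelSignature format, and the method name predict matches the default described above.

```python
# Sketch of a signatures mapping for save_model (assumed keys follow
# BentoML's ModelSignature options: "batchable" and "batch_dim").
signatures = {
    "predict": {"batchable": False},
}

# A batch-enabled variant, batching inputs along dimension 0:
batched_signatures = {
    "predict": {"batchable": True, "batch_dim": 0},
}
```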
- Returns:
A tag that can be used to access the saved model from the BentoML model store.
- Return type:
Tag
Example:
```python
import bentoml

from fastai.metrics import accuracy
from fastai.text.data import URLs
from fastai.text.data import untar_data
from fastai.text.data import TextDataLoaders
from fastai.text.models import AWD_LSTM
from fastai.text.learner import text_classifier_learner

dls = TextDataLoaders.from_folder(untar_data(URLs.IMDB), valid="test")
learner = text_classifier_learner(dls, AWD_LSTM, drop_mult=0.5, metrics=accuracy)
learner.fine_tune(4, 1e-2)

# Test run the model
learner.model.eval()
learner.predict("I love that movie!")

# Save the model with BentoML
tag = bentoml.fastai.save_model("fai_learner", learner)
```
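The caveat about unpicklable callbacks can be illustrated without fastai. The sketch below is stdlib-only; RecorderLike is a hypothetical stand-in for a stateful Callback. It shows how one attribute holding an unpicklable object breaks pickling, and how dropping that state (which is what Learner.remove_cbs() achieves by removing the whole callback) restores it.

```python
import pickle

class RecorderLike:
    """Hypothetical stand-in for a stateful fastai Callback."""
    def __init__(self):
        self.history = [0.9, 0.7]            # plain state pickles fine
        self.stream = (x for x in range(3))  # generators are not picklable

cb = RecorderLike()
try:
    pickle.dumps(cb)
    picklable = True
except TypeError:
    picklable = False  # pickling fails because of the generator attribute

# Remove the offending state, then the object pickles again.
del cb.stream
restored = pickle.loads(pickle.dumps(cb))
```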
- bentoml.fastai.load_model(bento_model: str | Tag | bentoml.Model) → learner.Learner
Load the fastai.learner.Learner model instance with the given tag from the local BentoML model store.

If the model uses mixed_precision, the loaded model will also be converted to FP32. Learn more about mixed precision.

- Parameters:
bento_model – Either the tag of the model to get from the store, or a BentoML Model instance to load the model from.
- Returns:
The fastai.learner.Learner model instance loaded from the model store or BentoML Model.
- Return type:
fastai.learner.Learner
Example:
```python
import bentoml

model = bentoml.fastai.load_model("fai_learner")
results = model.predict("some input")
```
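The FP32 note can be made concrete with a stdlib-only sketch: half precision (FP16) cannot represent every FP32 value exactly, which is why a mixed-precision learner is restored to full precision on load. roundtrip_fp16 is a hypothetical helper, not part of BentoML.

```python
import struct

def roundtrip_fp16(x: float) -> float:
    """Round-trip a Python float through IEEE 754 half precision."""
    return struct.unpack("<e", struct.pack("<e", x))[0]

lossless = roundtrip_fp16(0.5)  # 0.5 is exactly representable in FP16
lossy = roundtrip_fp16(0.1)     # 0.1 is rounded to the nearest FP16 value
```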
- bentoml.fastai.get(tag_like: str | Tag) → bentoml.Model
Get the BentoML model with the given tag.
- Parameters:
tag_like β The tag of the model to retrieve from the model store.
- Returns:
A BentoML Model with the matching tag.
- Return type:
Model
Model
Example:
```python
import bentoml

# target model must be from the BentoML model store
model = bentoml.fastai.get("fai_learner")
```
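BentoML tags follow a name:version convention (e.g. fai_learner:latest, where a bare name resolves to the latest version). The helper below is hypothetical and not part of the BentoML API; it only sketches how a tag_like string decomposes.

```python
def split_tag(tag_like: str) -> tuple:
    """Split a 'name:version' tag string (hypothetical helper)."""
    name, sep, version = tag_like.partition(":")
    return (name, version if sep else None)

print(split_tag("fai_learner:latest"))  # ('fai_learner', 'latest')
print(split_tag("fai_learner"))         # ('fai_learner', None)
```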