MLflow¶
About this page
This is the API reference for using MLflow in BentoML. Please refer to the MLflow guide for more information about how to use MLflow with BentoML.
Note
You can find more examples for MLflow in our examples/mlflow directory.
- bentoml.mlflow.import_model(name: Tag | str, model_uri: str, *, signatures: dict[str, ModelSignature] | dict[str, ModelSignatureDict] | None = None, labels: dict[str, str] | None = None, custom_objects: dict[str, t.Any] | None = None, external_modules: t.List[ModuleType] | None = None, metadata: dict[str, t.Any] | None = None) bentoml.Model ¶
Import an MLflow model from an artifact URI into the BentoML model store.
- Parameters:
  - name – The name to give to the model in the BentoML store. This must be a valid Tag name.
  - model_uri – The MLflow model to be saved.
  - signatures – Signatures of predict methods to be used. If not provided, the signatures default to {"predict": {"batchable": False}}. See ModelSignature for more details.
  - labels – A default set of management labels to be associated with the model. For example: {"training-set": "data-v1"}.
  - custom_objects – Custom objects to be saved with the model. An example is {"my-normalizer": normalizer}. Custom objects are serialized with cloudpickle.
  - metadata – Metadata to be associated with the model. An example is {"param_a": .2}. Metadata is intended for display in a model management UI, and therefore all values in the metadata dictionary must be a primitive Python type, such as str or int.
- Returns:
  A Model instance referencing a saved model in the local BentoML model store.
Example:
import bentoml

bentoml.mlflow.import_model(
    "my_mlflow_model",
    model_uri="runs:/<mlflow_run_id>/run-relative/path/to/model",
    signatures={
        "predict": {"batchable": True},
    },
)
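The model_uri typically comes from an MLflow run. The following is a minimal sketch of that flow, assuming a scikit-learn model logged to a local MLflow tracking store (the toy model and the name "my_mlflow_model" are illustrative, not part of this API):

import bentoml
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Train a toy model and log it under the current MLflow run (illustrative).
X, y = load_iris(return_X_y=True)
clf = LogisticRegression(max_iter=200).fit(X, y)

with mlflow.start_run() as run:
    mlflow.sklearn.log_model(clf, artifact_path="model")

# The run id plus the run-relative artifact path form the model URI.
bento_model = bentoml.mlflow.import_model(
    "my_mlflow_model",
    model_uri=f"runs:/{run.info.run_id}/model",
)
print(bento_model.tag)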
- bentoml.mlflow.load_model(bento_model: str | Tag | Model) mlflow.pyfunc.PyFuncModel ¶
Load the MLflow PyFunc model with the given tag from the local BentoML model store.
- Parameters:
  - bento_model – Either the tag of the model to get from the store, or a BentoML bentoml.Model instance to load the model from.
- Returns:
  The MLflow model loaded as a PyFuncModel from the BentoML model store.
Example:
import bentoml

pyfunc_model = bentoml.mlflow.load_model("my_model:latest")
pyfunc_model.predict(input_df)
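load_model is commonly called when constructing a BentoML service, so the PyFunc model is loaded once at startup. A minimal sketch, assuming a model stored as my_model:latest and a NumPy input (the service and method names here are hypothetical):

import bentoml
import numpy as np

@bentoml.service
class MLflowService:
    def __init__(self) -> None:
        # Load the PyFunc model once when the service starts (hypothetical tag).
        self.model = bentoml.mlflow.load_model("my_model:latest")

    @bentoml.api
    def predict(self, input_data: np.ndarray) -> np.ndarray:
        # Delegate inference to the MLflow PyFunc model.
        return self.model.predict(input_data)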
- bentoml.mlflow.get(tag: t.Union[Tag, str], *, _model_store: ModelStore = <simple_di.providers.SingletonFactory object>, model_aliases: t.Dict[str, str] = <simple_di.providers.Static object>) Model ¶
Get a model by tag. If the tag is a string, it will be looked up in the model_aliases dict.
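get returns the stored bentoml.Model reference without loading the underlying MLflow model. A minimal sketch, assuming a model named my_mlflow_model exists in the local store:

import bentoml

bento_model = bentoml.mlflow.get("my_mlflow_model:latest")
print(bento_model.tag)   # resolved tag, e.g. my_mlflow_model:<version>
print(bento_model.path)  # filesystem location of the stored model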
- bentoml.mlflow.get_service(model_name: str, **config: Unpack[ServiceConfig]) Service[t.Any] ¶
Get a BentoML service from an MLflow model.
- Parameters:
  - model_name – The name of the model to load.
  - **config – Additional configuration for the service.
- Returns:
  A BentoML service instance that can be used to serve the MLflow model.
Example:
import bentoml

service = bentoml.mlflow.get_service("my_mlflow_model")
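The **config keyword arguments are forwarded as the service configuration. As a hedged sketch, assuming ServiceConfig accepts the same keys as the @bentoml.service decorator (e.g. resources and traffic):

import bentoml

# Hypothetical configuration values; the keys mirror @bentoml.service config.
service = bentoml.mlflow.get_service(
    "my_mlflow_model",
    resources={"cpu": "2"},
    traffic={"timeout": 60},
)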