About this page

This is the API reference for using LightGBM in BentoML. See /frameworks/lightgbm for a guide on how to use LightGBM with BentoML.

bentoml.lightgbm.save_model(name: Tag | str, model: lgb.basic.Booster, *, signatures: dict[str, ModelSignatureDict] | None = None, labels: dict[str, str] | None = None, custom_objects: dict[str, t.Any] | None = None, external_modules: t.List[ModuleType] | None = None, metadata: dict[str, t.Any] | None = None) → bentoml.Model

Save a LightGBM model instance to the BentoML model store.

  • name (str) – The name to give to the model in the BentoML store. This must be a valid Tag name.

  • model (Booster) – The LightGBM model (booster) to be saved.

  • signatures (dict[str, ModelSignatureDict], optional) – Signatures of predict methods to be used. If not provided, the signatures default to {"predict": {"batchable": False}}. See ModelSignature for more details.

  • labels (dict[str, str], optional) – A default set of management labels to be associated with the model. An example is {"training-set": "data-1"}.

  • custom_objects (dict[str, Any], optional) –

    Custom objects to be saved with the model. An example is {"my-normalizer": normalizer}.

    Custom objects are currently serialized with cloudpickle, but this implementation is subject to change.

  • external_modules (List[ModuleType], optional, defaults to None) – User-defined additional Python modules to be saved alongside the model or custom objects, e.g. a tokenizer module, preprocessor module, or model configuration module.

  • metadata (dict[str, Any], optional) –

    Metadata to be associated with the model. An example is {"max_depth": 2}.

    Metadata is intended for display in model management UI and therefore must be a default Python type, such as str or int.
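The signatures argument maps method names to signature options. A minimal sketch of a signatures dict that enables batching (the batch_dim value is illustrative and assumes inputs are stacked along the first axis):

```python
# Illustrative signatures dict for save_model: marks "predict" as
# batchable along dimension 0 so adaptive batching can merge requests.
signatures = {
    "predict": {
        "batchable": True,
        "batch_dim": 0,  # inputs are stacked along the first axis
    }
}

# The default used by save_model when no signatures are passed:
default_signatures = {"predict": {"batchable": False}}
```

These dicts would be passed as `save_model(..., signatures=signatures)`.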


Returns:

A tag with the format name:version, where name is the user-defined model name and version is a version generated by BentoML.

Return type:

bentoml.Model


import bentoml

import lightgbm as lgb
import pandas as pd

# load a dataset
df_train = pd.read_csv("regression.train", header=None, sep="\t")
df_test = pd.read_csv("regression.test", header=None, sep="\t")

y_train = df_train[0]
y_test = df_test[0]
X_train = df_train.drop(0, axis=1)
X_test = df_test.drop(0, axis=1)

# create dataset for lightgbm
lgb_train = lgb.Dataset(X_train, y_train)
lgb_eval = lgb.Dataset(X_test, y_test, reference=lgb_train)

# specify your configurations as a dict
params = {
    "boosting_type": "gbdt",
    "objective": "regression",
    "metric": {"l2", "l1"},
    "num_leaves": 31,
    "learning_rate": 0.05,
}

# train
gbm = lgb.train(
    params, lgb_train, num_boost_round=20, valid_sets=lgb_eval
)

# save the booster to the BentoML model store:
bento_model = bentoml.lightgbm.save_model("my_lightgbm_model", gbm)
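The optional labels, custom_objects, and metadata parameters documented above accept plain dictionaries. A minimal sketch reusing the illustrative values from the parameter list (the values themselves are placeholders):

```python
# Illustrative management labels and metadata for save_model; the values
# are placeholders taken from the parameter descriptions above.
labels = {"training-set": "data-1"}
metadata = {"max_depth": 2}  # metadata values must be default Python types

# These would be passed as keyword arguments, e.g.:
# bentoml.lightgbm.save_model("my_lightgbm_model", gbm,
#                             labels=labels, metadata=metadata)
```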

bentoml.lightgbm.load_model(bento_model: str | Tag | Model) → lightgbm.basic.Booster

Load the LightGBM model with the given tag from the local BentoML model store.


bento_model (str | Tag | Model) – Either the tag of the model to get from the store, or a bentoml.Model instance to load the model from.


Returns:

The LightGBM model loaded from the model store or from the given BentoML Model.

Return type:

lightgbm.basic.Booster


import bentoml
gbm = bentoml.lightgbm.load_model("my_lightgbm_model:latest")

bentoml.lightgbm.get(tag_like: str | Tag) → Model

Get the BentoML model with the given tag.


tag_like (str | Tag) – The tag of the model to retrieve from the model store.


Returns:

A BentoML Model with the matching tag.

Return type:

bentoml.Model


import bentoml
# target model must be from the BentoML model store
model = bentoml.lightgbm.get("my_lightgbm_model:latest")
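Tag strings follow the name:version format described for save_model, and "latest" resolves to the most recently saved version. A plain-Python illustration of the format (no BentoML required):

```python
# A model tag has the form "name:version"; "latest" selects the newest
# saved version of the model.
tag = "my_lightgbm_model:latest"
name, version = tag.split(":")
```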