LightGBM
About this page
This is an API reference for using LightGBM in BentoML. Please refer to /frameworks/lightgbm for more information about how to use LightGBM in BentoML.
- bentoml.lightgbm.save_model(name: Tag | str, model: lgb.basic.Booster, *, signatures: dict[str, ModelSignatureDict] | None = None, labels: dict[str, str] | None = None, custom_objects: dict[str, t.Any] | None = None, external_modules: t.List[ModuleType] | None = None, metadata: dict[str, t.Any] | None = None) → bentoml.Model
Save a LightGBM model instance to the BentoML model store.
- Parameters:
  - name (str) – The name to give to the model in the BentoML store. This must be a valid Tag name.
  - model (Booster) – The LightGBM model (booster) to be saved.
  - signatures (dict[str, ModelSignatureDict], optional) – Signatures of predict methods to be used. If not provided, the signatures default to {"predict": {"batchable": False}}. See ModelSignature for more details.
  - labels (dict[str, str], optional) – A default set of management labels to be associated with the model. An example is {"training-set": "data-1"}.
  - custom_objects (dict[str, Any], optional) – Custom objects to be saved with the model. An example is {"my-normalizer": normalizer}. Custom objects are currently serialized with cloudpickle, but this implementation is subject to change.
  - external_modules (List[ModuleType], optional, defaults to None) – User-defined additional Python modules to be saved alongside the model or custom objects, e.g. a tokenizer module, preprocessor module, or model configuration module.
  - metadata (dict[str, Any], optional) – Metadata to be associated with the model. An example is {"max_depth": 2}. Metadata is intended for display in a model management UI and therefore must be a default Python type, such as str or int.
- Returns:
  A Tag with the format name:version, where name is the user-defined model name and version is generated by BentoML.
- Return type:
  Tag
Example:
import bentoml
import lightgbm as lgb
import pandas as pd

# load a dataset
df_train = pd.read_csv("regression.train", header=None, sep="\t")
df_test = pd.read_csv("regression.test", header=None, sep="\t")
y_train = df_train[0]
y_test = df_test[0]
X_train = df_train.drop(0, axis=1)
X_test = df_test.drop(0, axis=1)

# create dataset for lightgbm
lgb_train = lgb.Dataset(X_train, y_train)
lgb_eval = lgb.Dataset(X_test, y_test, reference=lgb_train)

# specify your configurations as a dict
params = {
    "boosting_type": "gbdt",
    "objective": "regression",
    "metric": {"l2", "l1"},
    "num_leaves": 31,
    "learning_rate": 0.05,
}

# train
gbm = lgb.train(
    params, lgb_train, num_boost_round=20, valid_sets=lgb_eval
)

# save the booster to the BentoML model store
bento_model = bentoml.lightgbm.save_model("my_lightgbm_model", gbm)
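The optional arguments can be combined in a single save_model call. The sketch below assumes the gbm booster trained in the example above; the scale helper, label values, and metadata values are illustrative assumptions, not part of the BentoML API.

import bentoml

def scale(df):
    # hypothetical preprocessing helper stored alongside the model
    return (df - df.mean()) / df.std()

bento_model = bentoml.lightgbm.save_model(
    "my_lightgbm_model",
    gbm,  # Booster from the example above
    signatures={"predict": {"batchable": True, "batch_dim": 0}},
    labels={"training-set": "regression.train"},
    custom_objects={"scaler": scale},
    metadata={"num_leaves": 31, "learning_rate": 0.05},
)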
- bentoml.lightgbm.load_model(bento_model: str | Tag | Model) → lightgbm.basic.Booster
Load the LightGBM model with the given tag from the local BentoML model store.
- Parameters:
  - bento_model (str | Tag | Model) – Either the tag of the model to get from the store, or a bentoml.Model instance to load the model from.
- Returns:
  The LightGBM model loaded from the model store or the given BentoML Model.
- Return type:
  Booster
Example:
import bentoml

gbm = bentoml.lightgbm.load_model("my_lightgbm_model:latest")
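The returned object is a plain LightGBM Booster, so it can be used for inference directly. A minimal sketch, assuming the model saved in the example above and an X_test DataFrame with the same feature columns used at training time:

import bentoml

booster = bentoml.lightgbm.load_model("my_lightgbm_model:latest")
preds = booster.predict(X_test)  # X_test: features matching the training data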
- bentoml.lightgbm.get(tag: t.Union[Tag, str], *, _model_store: ModelStore = <simple_di.providers.SingletonFactory object>, model_aliases: t.Dict[str, str] = <simple_di.providers.Static object>) → Model
Get a model by tag. If the tag is a string, it will be looked up in the model_aliases dict.
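Note that get returns a bentoml.Model reference rather than the underlying Booster, which makes it useful for inspecting a stored model. A minimal sketch; the printed attributes are standard bentoml.Model fields:

import bentoml

bento_model = bentoml.lightgbm.get("my_lightgbm_model:latest")
print(bento_model.tag)            # resolved name:version
print(bento_model.path)           # location of the stored model files
print(bento_model.info.metadata)  # metadata passed to save_model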
- bentoml.lightgbm.get_service(model_name: str, **config: Unpack[ServiceConfig]) → Service[t.Any]
Get a BentoML service instance from a LightGBM model.
- Parameters:
  - model_name (str) – The name of the model to get the service for.
  - **config (Unpack[ServiceConfig]) – Configuration options for the service.
- Returns:
A BentoML service instance that wraps the LightGBM model.
Example:
import bentoml

service = bentoml.lightgbm.get_service("my_lightgbm_model")
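The keyword arguments are forwarded as service configuration. The exact fields accepted by ServiceConfig depend on the BentoML version; the resources and traffic options below are assumptions used for illustration:

import bentoml

# assumed ServiceConfig fields; adjust to what your BentoML version supports
service = bentoml.lightgbm.get_service(
    "my_lightgbm_model",
    resources={"cpu": "2"},
    traffic={"timeout": 60},
)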