How to slice an XGBClassifier/XGBRegressor model into sub-models?

This document shows that a model trained with the native XGBoost API can be sliced with the following code:

from sklearn.datasets import make_classification
import xgboost as xgb

# Build a small synthetic multi-class dataset and wrap it in a DMatrix.
X, y = make_classification(n_samples=200, n_classes=3, n_informative=4, random_state=0)
dtrain = xgb.DMatrix(X, label=y)

booster = xgb.train(
    {'num_parallel_tree': 4, 'subsample': 0.5, 'num_class': 3},
    num_boost_round=10, dtrain=dtrain)
sliced: xgb.Booster = booster[3:7]

I tried it and it worked.

Since XGBoost provides a scikit-learn wrapper interface, I tried something like this:

from xgboost import XGBClassifier

clf_xgb = XGBClassifier().fit(X_train, y_train)
clf_xgb_sliced: clf_xgb.Booster = booster[3:7]

But got following error:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-18-84155815d877> in <module>
----> 1 clf_xgb_sliced: clf_xgb.Booster = booster[3:7]

AttributeError: 'XGBClassifier' object has no attribute 'Booster'

Since XGBClassifier has no attribute 'Booster', is there any way to slice an XGBClassifier (or XGBRegressor) model trained through the scikit-learn wrapper interface?

1 answer

  • answered 2022-05-06 10:54 Learning is a mess

    The problem is the type hint you are giving, clf_xgb.Booster, which does not name an existing attribute (the annotation expression is evaluated and raises the AttributeError). The underlying Booster is obtained with get_booster(), and that object supports slicing. Try:

    clf_xgb_sliced: xgb.Booster = clf_xgb.get_booster()[3:7]
    

    instead.
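
    For completeness, a minimal end-to-end sketch of the answer's approach, assuming a recent xgboost version (Booster slicing was added in 1.3) and a synthetic dataset:

    from sklearn.datasets import make_classification
    from xgboost import XGBClassifier
    import xgboost as xgb

    X, y = make_classification(n_samples=200, n_classes=3, n_informative=4, random_state=0)

    # Train through the scikit-learn wrapper; n_estimators is the number of boosting rounds.
    clf_xgb = XGBClassifier(n_estimators=10).fit(X, y)

    # get_booster() exposes the underlying xgb.Booster, which supports slicing.
    sliced: xgb.Booster = clf_xgb.get_booster()[3:7]

    # The slice keeps rounds 3..6 (4 rounds). A raw Booster predicts from a DMatrix.
    preds = sliced.predict(xgb.DMatrix(X))

    Note that the result of the slice is a plain xgb.Booster, not an XGBClassifier, so downstream code must use the native Booster API (e.g. DMatrix inputs) rather than the scikit-learn predict/predict_proba methods.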
