XGBoost sklearn wrapper: value 0 for parameter num_class must be greater than 1

I am trying to use the XGBClassifier sklearn wrapper provided by xgboost for a multiclass problem. My classes are [0, 1, 2] and the objective I am using is multi:softmax . When I try to train the classifier, I get

xgboost.core.XGBoostError: value 0 for parameter num_class must be greater than 1

If I try to pass the num_class parameter explicitly, I get another error:

got an unexpected keyword argument 'num_class'

The sklearn wrapper is supposed to set this parameter automatically, so I should not have to pass it myself. Why am I getting the first error, then?
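
For context, the setup being described looks roughly like this; a minimal sketch where X, y, and the synthetic data are placeholders, not from the original post:

    import numpy as np
    from xgboost import XGBClassifier

    # Placeholder data with the three classes 0, 1, 2
    X = np.random.rand(100, 5)
    y = np.random.randint(0, 3, size=100)

    model = XGBClassifier(objective='multi:softmax')
    model.fit(X, y)      # the wrapper infers num_class from y here
    preds = model.predict(X)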

2 answers

You need to manually add the num_class parameter to xgb_param

    # model is an XGBClassifier
    import xgboost as xgb

    xgb_param = model.get_xgb_params()
    xgb_param['num_class'] = 3          # add the parameter that xgb.cv needs
    cvresult = xgb.cv(xgb_param, ...)

XGBClassifier sets this value automatically when you use its fit method, but the standalone xgb.cv function does not.
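
To make the difference concrete, here is a hedged sketch of the cv path with the elided arguments filled in; dtrain, the placeholder data, and the specific cv settings are illustrative assumptions, not part of the answer above:

    import numpy as np
    import xgboost as xgb
    from xgboost import XGBClassifier

    X = np.random.rand(100, 5)
    y = np.random.randint(0, 3, size=100)

    model = XGBClassifier(objective='multi:softmax')

    # xgb.cv works on the native API, so it needs a DMatrix and a raw
    # parameter dict; num_class is not filled in for us on this path.
    dtrain = xgb.DMatrix(X, label=y)
    xgb_param = model.get_xgb_params()
    xgb_param['num_class'] = 3

    cvresult = xgb.cv(xgb_param, dtrain, num_boost_round=50, nfold=5)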


Are you using the xgboost.cv function? I ran into the same problem and found a solution. Here is my code:

    import xgboost as xgb

    xgb_param = model.get_xgb_params()
    extra = {'num_class': 3}
    xgb_param.update(extra)             # add num_class to the parameter dict
    cvresult = xgb.cv(xgb_param, xgtrain, ...)

xgb_param is the dictionary of the XGBoost model's parameters. I update it with an extra dict that specifies num_class , then pass the updated dictionary to the cv function. It works.
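
For what it's worth, cvresult comes back as a table of per-round cross-validation metrics (a pandas DataFrame when pandas is available); a small usage sketch, assuming the default merror metric for multi:softmax:

    # One row per boosting round; columns such as 'test-merror-mean' and
    # 'test-merror-std' appear for the default multiclass error metric.
    print(cvresult.tail())
    print('rounds evaluated:', len(cvresult))
    print('best test error: ', cvresult['test-merror-mean'].min())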

