Multi-output regression in XGBoost

Is it possible to train a model in XGBoost with several continuous outputs (multi-output regression)? What would be the purpose of such a model?

Thanks in advance for any suggestions.

1 answer

My suggestion is to use sklearn.multioutput.MultiOutputRegressor as a wrapper around xgb.XGBRegressor. MultiOutputRegressor trains one independent regressor per target and only requires the estimator to implement fit and predict, which xgboost supports.

 import numpy as np
 import xgboost as xgb
 from sklearn.multioutput import MultiOutputRegressor

 # get some noisy linear data
 X = np.random.random((1000, 10))
 a = np.random.random((10, 3))
 y = np.dot(X, a) + np.random.normal(0, 1e-3, (1000, 3))

 # fitting: one XGBRegressor per target column
 # ('reg:squarederror' is the current name of the old 'reg:linear' objective)
 multioutputregressor = MultiOutputRegressor(
     xgb.XGBRegressor(objective='reg:squarederror')
 ).fit(X, y)

 # predicting: per-target training MSE
 print(np.mean((multioutputregressor.predict(X) - y) ** 2, axis=0))
 # roughly 0.004, 0.003, 0.005 in the original run

This is perhaps the easiest way to regress multi-dimensional targets with xgboost, since you will not need to change any other part of your code (assuming you were already using the sklearn API).

However, this approach does not exploit any correlation between the targets, since each regressor is trained independently. If you want the model to share information across targets, you can try writing a custom objective.

