Gradient Boosting on Vowpal Wabbit

Is there a way to do gradient boosting for regression with Vowpal Wabbit? I already use several of the methods that ship with Vowpal Wabbit and find them useful. I would like to try gradient boosting on top of that, but I cannot find a way to implement gradient boosting in VW.

+5
1 answer

The idea of gradient boosting is to build an ensemble model out of weak black-box models. You can certainly use VW as the black box, but note that VW does not offer decision trees, which are the most popular choice for the weak learners in boosting. Boosting generally decreases bias (and increases variance), so you should make sure your VW models have low variance (no overfitting). See the bias-variance tradeoff.
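To make the idea concrete, here is a minimal, self-contained sketch of gradient boosting for squared-error regression. It is purely illustrative (it does not use VW); the weak learner is a hand-rolled depth-1 decision stump, and all names are made up for this example. Each round fits a stump to the residuals, which are the negative gradient of the squared loss.

```python
def fit_stump(xs, residuals):
    """Weak learner: find the single threshold split minimizing squared error.

    Assumes at least two distinct x values; this is a sketch, not production code.
    """
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lm = sum(left) / len(left)    # mean prediction on the left branch
        rm = sum(right) / len(right)  # mean prediction on the right branch
        err = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm


def boost(xs, ys, rounds=10, lr=0.5):
    """Gradient boosting for squared loss: repeatedly fit stumps to residuals."""
    base = sum(ys) / len(ys)          # initial model: the mean of the targets
    pred = [base] * len(ys)
    for _ in range(rounds):
        # For squared loss, the negative gradient is just the residual.
        residuals = [y - p for y, p in zip(ys, pred)]
        stump = fit_stump(xs, residuals)
        # Shrink each stump's contribution by the learning rate.
        pred = [p + lr * stump(x) for x, p in zip(xs, pred)]
    return pred
```

Replacing the stump with a VW model trained on the residuals would give the "VW as black box" setup described above, but you would have to orchestrate the residual-fitting loop yourself.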

There are several reductions related to boosting and bagging in VW:

  • --autolink N adds a link function with polynomial N, which can be considered a simple way of boosting.
  • --log_multi K is an online boosting algorithm for K-class classification. See the paper. You can use it even for binary classification (K = 2), but not for regression.
  • --bootstrap M does M-way bootstrap by online importance resampling. Use --bs_type=vote for classification and --bs_type=mean for regression. Note that this is bagging, not boosting.
  • --boosting N (added on 2015-06-17) does online boosting with N weak learners; see the theoretical paper.
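As a sketch of how these options are invoked, the script below writes a tiny VW-format dataset and prints example command lines for the flags above. The file paths, feature names, and values are invented for illustration, and vw must be installed for the printed commands to actually run; the script itself only prepares the data and echoes the commands.

```shell
# Create a toy VW-format regression dataset (labels, then "| feature:value ...").
cat > /tmp/train.vw <<'EOF'
1.2 | price:0.23 sqft:0.25 age:0.05
0.9 | price:0.18 sqft:0.15 age:0.35
EOF

# Online boosting with 10 weak learners (per --boosting above):
echo "vw --boosting 10 -d /tmp/train.vw -f /tmp/model.vw"

# Bagging: 8 bootstrap replicates, averaged for regression (per --bootstrap):
echo "vw --bootstrap 8 --bs_type mean -d /tmp/train.vw"

# log_multi for K = 3 classes (the dataset would then need labels 1..3):
echo "vw --log_multi 3 -d /tmp/multiclass.vw"
```

The printed lines are plain vw invocations; drop the echo to execute them once vw is on your PATH.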
+9