Naive Bayes error rate vs. logistic regression error rate

I tried to find the relationship between the error rate and the number of features in both of these models. I watched a few videos, and the creator of one video said that a simple model can be better than a complex model. So I expected that the more features I used, the higher the error rate would be. This was not confirmed in my experiments: when I used fewer features, the error rate actually went up. I'm not sure whether I am doing something wrong or whether the person in the video made a mistake. Can anyone explain this? I'm also curious how the number of features relates to the error rate of logistic regression.

1 answer

The claim that "a simple model is better" is only true under certain conditions (in particular, when training data is limited), and the opposite holds when data is plentiful.

Given features x and a label y, naive Bayes models the joint distribution p(x, y) = p(y) * p(x | y) (i.e., it is a "generative" model), while logistic regression models p(y | x) directly. Since classification only needs p(y | x), logistic regression solves the problem directly (it is a "discriminative" model).
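To make the generative/discriminative contrast concrete, here is a minimal sketch (assuming scikit-learn is available; the dataset is synthetic and illustrative, not from the original post). `GaussianNB` estimates p(y) and p(x | y), while `LogisticRegression` estimates p(y | x) directly; both are then scored on held-out data.

```python
# Sketch: fit a generative model (naive Bayes) and a discriminative
# model (logistic regression) on the same synthetic data and compare
# their held-out error rates.
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

nb = GaussianNB().fit(X_tr, y_tr)                        # models p(y) * p(x | y)
lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)   # models p(y | x)

nb_err = 1 - nb.score(X_te, y_te)
lr_err = 1 - lr.score(X_te, y_te)
print("NB test error: %.3f" % nb_err)
print("LR test error: %.3f" % lr_err)
```

On a dataset like this, both models are reasonable; which one wins depends on the data, as the points below explain.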

This has two practical consequences:

  • Asymptotically (with unlimited training data), logistic regression tends to reach a lower error rate, because it estimates p(y | x) directly without extra "naive" assumptions. It is the better choice when you have plenty of examples.
  • With limited training data (relative to the number of features), the "naive" conditional-independence assumption of naive Bayes pays off: it only has to estimate each per-feature p(x | y) separately, which requires far fewer examples. It therefore approaches its (higher) asymptotic error much faster, and can beat logistic regression in the "small data" regime.
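The two points above can be illustrated with a small learning-curve experiment. This is a sketch under assumptions (scikit-learn available, synthetic data, `GaussianNB` standing in for naive Bayes): it trains both models on increasing amounts of data and prints their held-out error, so you can see how each error rate moves as the training set grows.

```python
# Sketch: held-out error of naive Bayes vs. logistic regression as the
# training set grows, on a fixed synthetic problem.
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=6000, n_features=30,
                           n_informative=10, random_state=1)
X_test, y_test = X[5000:], y[5000:]   # fixed held-out set

errors = {}
for n in (30, 100, 1000, 5000):       # growing training-set sizes
    nb = GaussianNB().fit(X[:n], y[:n])
    lr = LogisticRegression(max_iter=2000).fit(X[:n], y[:n])
    errors[n] = (1 - nb.score(X_test, y_test),
                 1 - lr.score(X_test, y_test))
    print("n=%4d  NB err=%.3f  LR err=%.3f" % (n, *errors[n]))
```

The classic pattern (reported by Ng and Jordan, 2001) is that naive Bayes is competitive or better at the small sizes, while logistic regression catches up and usually wins once enough data is available; exact numbers depend on the dataset.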

As for the number of features: with a fixed amount of training data, adding more features makes it easier to overfit, which tends to raise the test error rate. But removing informative features also raises the error, which is likely what you observed. Regularization (e.g., L1/Lasso or L2/Ridge penalties) helps control overfitting when the feature count is high.

