I tried to find the correlation between the error rate and the number of features in both of these models. I watched a few videos, and the creator said that a simple model can be better than a complex one, so I expected that the more features I use, the higher the error rate would be. My experiments did not confirm this: when I used fewer features, the error rate actually increased. I'm not sure whether I'm doing something wrong or the person in the video was mistaken. Can anyone explain this? I'm also curious how the number of features relates to the error rate of logistic regression.
" ", , ( ), -.
x y naive Bayes p (x, y) = p (y) * p (x | y) (.. , "" ) p (y | x) . , p (y | x) , ( "" ).
As for the number of features and the error rate: adding features increases model capacity, which lowers bias but raises variance. With too few features the model underfits, so both training and test error go up, which matches what you observed. With too many features the model can overfit: training error keeps falling while test error starts to rise. So "simpler is better" only holds past the point where the model is already flexible enough for the data. Regularization (e.g., L1/Lasso or L2/Ridge) is the standard way to keep a high-capacity model from overfitting.
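A minimal sketch of this trade-off, assuming synthetic data (a sine signal plus noise, fixed seed) and polynomial powers of x standing in for "more features": training error falls as the degree grows, while test error is worst for the underfit linear model and can rise again for very high degrees.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    """Noisy samples of a nonlinear signal (synthetic, for illustration)."""
    x = rng.uniform(-1, 1, n)
    y = np.sin(3 * x) + rng.normal(0, 0.2, n)
    return x, y

x_tr, y_tr = make_data(30)     # small training set
x_te, y_te = make_data(200)    # held-out test set

def train_test_mse(degree):
    """Fit a degree-`degree` polynomial (degree+1 features) by least squares."""
    coefs = np.polyfit(x_tr, y_tr, degree)
    tr = np.mean((np.polyval(coefs, x_tr) - y_tr) ** 2)
    te = np.mean((np.polyval(coefs, x_te) - y_te) ** 2)
    return tr, te

for degree in (1, 3, 15):
    tr, te = train_test_mse(degree)
    print(f"degree {degree:2d}: train MSE {tr:.3f}, test MSE {te:.3f}")
```

The linear fit (degree 1) underfits the sine curve, so its error is high on both sets, just like your "fewer features, more errors" observation; the high-degree fit drives training error down but does not keep improving test error. Ridge/Lasso would add a penalty on the coefficients to tame the high-degree case.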