Why do too many features cause overfitting?

I searched all night and could not find an answer.

Too many features mean too many parameters, but what is the relationship between the number of parameters and an overly wiggly (overfitted) curve?

+4
4 answers

In machine learning, you split your data into a training set and a test set. The training set is used to fit the model (to set the model parameters); the test set is used to assess how well your model will work on unseen data.

Overfitting means your model does much better on the training set than on the test set: it fits the training data too closely and does not generalize well.
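A minimal sketch of that gap (assuming NumPy; the data and polynomial degrees are made up for illustration): a very flexible model drives the training error down further than a modest one, while its test error typically gets worse, not better:

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a simple underlying curve, y = sin(x) + noise
x = rng.uniform(0.0, 3.0, size=30)
y = np.sin(x) + rng.normal(scale=0.2, size=30)

# Split into a training set (used to fit) and a test set (held out)
x_train, y_train = x[:20], y[:20]
x_test, y_test = x[20:], y[20:]

def mse(coeffs, xs, ys):
    """Mean squared error of the polynomial `coeffs` on (xs, ys)."""
    return float(np.mean((np.polyval(coeffs, xs) - ys) ** 2))

# A modest model (degree 3) vs. a very flexible one (degree 15)
for degree in (3, 15):
    coeffs = np.polyfit(x_train, y_train, degree)
    print(f"degree {degree:2d}: "
          f"train MSE = {mse(coeffs, x_train, y_train):.4f}, "
          f"test MSE = {mse(coeffs, x_test, y_test):.4f}")
```

The degree-15 fit always achieves a training MSE at least as low as the degree-3 fit (its basis contains the smaller one), which is exactly why training error alone cannot detect overfitting.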

Overfitting typically happens when:

  • There is too little training data for the model, e.g. fitting a degree-5 polynomial to only 100 noisy points.
  • The model class is too flexible (has too much capacity) for the task.
  • There are too many features: with a feature vector (x1, x2, ..., xn) there are at least n coefficients to fit, plus an intercept, so each extra feature adds a degree of freedom the model can spend fitting noise.
+3

, - , (), "" ( ). , , , , "" .. - , , , , , "".

, . , - "" , , ,

f(x) = cos(<w, x>)

w, "", -1, +1
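A brute-force sketch of that claim (assuming NumPy; the sample points and the labeling are made up, and a full shattering argument would use more carefully chosen points): scan candidate values of a single scalar parameter w until sign(cos(w*x)) reproduces a chosen ±1 labeling of four points:

```python
import numpy as np

# Four arbitrary points and one arbitrary +/-1 labeling of them
xs = np.array([0.5, 1.0, 2.0, 4.0])
target = np.array([1.0, -1.0, -1.0, 1.0])

# Scan a fine grid of candidate values for the single parameter w
ws = np.linspace(0.0, 50.0, 200_000)
signs = np.sign(np.cos(np.outer(ws, xs)))   # shape (len(ws), 4)
hits = np.all(signs == target, axis=1)

w = ws[np.argmax(hits)]                     # first w matching the labeling
print(f"w = {w:.4f} -> signs {np.sign(np.cos(w * xs))}")
```

One real number w is enough to realize this labeling (the search finds a match just above pi/2), so "few parameters" clearly does not imply "low capacity".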

+1

Another way to look at it: every feature is a dimension of the input space, so the number of possible inputs grows exponentially with the number of features while the data set stays the same size. Going from 5 features to 6 multiplies the space yet again by the number of values the new feature can take.

Suppose each feature can take 5 distinct values. With 3 features there are 5^3 = 125 possible inputs, and a modest data set can cover a good share of them. With 20 features there are 5^20 (about 10^14) possible inputs, and no realistic training set covers more than a vanishing fraction of them.

On all the combinations it never saw, the model is unconstrained by the data and free to behave arbitrarily, and that freedom is exactly what shows up as overfitting.
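The arithmetic above, as a trivial sketch (the 10,000-example data set size is an assumed figure for illustration):

```python
values_per_feature = 5
dataset_size = 10_000   # assumed size of a "large" training set

for n_features in (3, 20):
    combos = values_per_feature ** n_features
    coverage = min(1.0, dataset_size / combos)
    print(f"{n_features:2d} features: {combos:,} possible inputs, "
          f"best-case coverage {coverage:.2e}")
```

With 3 features the data set could in principle see every input; with 20 features it can cover only about one input in ten billion.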

0

Suppose your hypothesis has n+1 parameters and you have m training examples with m < n+1 (fewer examples than parameters). Each training example contributes one equation in the parameters: theta0*x0_i + theta1*x1_i + ... + thetaN*xN_i = y_i (for i = 1, ..., m). Since there are more unknowns than equations, the system is underdetermined: you can choose (n+1-m) of the parameters freely and still solve exactly for the remaining m. Gradient descent will therefore drive the training cost all the way to 0, fitting all m examples perfectly. That is a perfect fit to the training data with no guarantee about anything else, which is exactly overfitting. If m > n+1, it would be difficult to find parameters that satisfy all m equations exactly. Hope this clears your doubts.
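A quick numerical check of the underdetermined case (random data, assuming NumPy; m = 3 examples and 5 parameters are made-up sizes): least squares finds a theta that satisfies all m equations exactly, i.e. zero training error:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n_params = 3, 5                  # m < n+1: fewer equations than unknowns

X = rng.normal(size=(m, n_params))  # row i holds x0_i, x1_i, ..., xN_i
y = rng.normal(size=m)              # targets y_i

# Minimum-norm least-squares solution; with full row rank it is exact
theta, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)
train_error = float(np.max(np.abs(X @ theta - y)))
print(f"rank = {rank}, max training error = {train_error:.2e}")
```

The training error is zero up to floating-point noise, even though the targets are pure random numbers with no structure to learn, which is the overfitting scenario in miniature.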

0
