Consider a convolutional neural network with the following architecture:

Input → C_1 → P_1 → C_2 → P_2 → Softmax
Here C_i denotes a convolutional layer and P_i an average-pooling layer. Each layer produces an output. Let δ^{P_2} denote the error at the output of layer P_2 (and likewise for the other layers).
δ^{P_2} can be easily calculated using the normal backpropagation equations, since P_2 is fully connected to the softmax layer. δ^{C_2} can then be obtained simply by upsampling δ^{P_2} (and multiplying by the gradient of the activation), since we use average pooling.
But how do I calculate δ^{P_1}? How does the error propagate backwards through the convolutional layer C_2?
The Stanford Deep Learning tutorial says:
δ_k^(l) = upsample((W_k^(l))^T δ_k^(l+1)) · f'(z_k^(l))
My filters are (2x2) and my error maps are (6x6); when I try to apply this formula the dimensions do not match (I get (13x13) where I expect (6x6)).
On top of that, the number of feature maps changes between layers: P_1 has 64 maps and C_2 has 96, so I do not see how the multiplication by W^T is supposed to work across maps.
What am I missing here? Am I misreading the notation?
I am implementing this in MATLAB.
The tutorial's notation is quite terse, so let me spell out the two steps: upsampling through the pooling layer, and backpropagation through the convolutional layer.
If the pooling regions are P x P and non-overlapping, then "upsampling" simply means mapping each error value of the pooled layer back onto the P x P block of units that was pooled into it. The word "upsample" is used because the error map grows back to the size of the layer below.
For max pooling, the whole error goes to the single unit that attained the maximum ("winner takes all"), while
for average pooling every unit of the P x P block receives 1/(P·P) of the error (so the total error is preserved).
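A minimal NumPy sketch of this upsampling step for average pooling (an illustration, not the tutorial's code; `delta_pooled` is a hypothetical 3x3 error map and the 2x2 pooling region is assumed non-overlapping):

```python
import numpy as np

# Hypothetical 3x3 error map at the output of an average-pooling layer.
delta_pooled = np.arange(9, dtype=float).reshape(3, 3)

P = 2  # pooling region is P x P, non-overlapping

# Average pooling: each pooled unit is the mean of a P x P block,
# so each of the P*P inputs receives 1/(P*P) of the error.
delta_upsampled = np.kron(delta_pooled, np.ones((P, P))) / (P * P)

print(delta_upsampled.shape)  # (6, 6)
print(delta_upsampled.sum())  # 36.0, equal to delta_pooled.sum(): total error preserved
```

`np.kron` with a matrix of ones replicates each entry into a P x P block, which is exactly the "spread the error uniformly" rule described above.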
As for backpropagation through the convolutional layer:
You said P_1 has 64 feature maps and C_2 has 96, with 2x2 filters. That means there is one 2x2 filter for every (input map, output map) pair, so W is a 96x64x2x2 array (rank 4: output maps x input maps x filter height x filter width). The tutorial's notation "(W)^T" hides this structure.

Concretely, with W of size 96x64x2x2, the input to C_2 (i.e. the output of P_1) is 64x7x7 and the output of C_2 is 96x6x6 (with a "valid" convolution, sliding a 2x2 filter over a 7x7 map yields a 6x6 map). To propagate the error back through W, you convolve each of the 96 error maps of δ^{C_2} with the corresponding flipped 2x2 filters in "full" mode, which turns a 6x6 map back into a 7x7 map, and sum over the 96 output maps to obtain the 64 error maps δ^{P_1}.
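The bookkeeping above can be sketched in NumPy/SciPy (an illustrative sketch under the shapes discussed here, 96x64x2x2 filters and 64x7x7 / 96x6x6 maps; `delta_C2` is a hypothetical random error array, not real data):

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(0)

n_in, n_out, f = 64, 96, 2                      # P_1 maps, C_2 maps, filter size
W = rng.standard_normal((n_out, n_in, f, f))    # one f x f filter per (out, in) pair
delta_C2 = rng.standard_normal((n_out, 6, 6))   # hypothetical error at C_2's output

# convolve2d performs a true convolution (it flips the kernel relative to
# correlation). In "full" mode each 6x6 error map becomes a 7x7 map;
# summing over the 96 output maps yields the 64 error maps of P_1.
delta_P1 = np.zeros((n_in, 7, 7))
for j in range(n_out):
    for i in range(n_in):
        delta_P1[i] += convolve2d(delta_C2[j], W[j, i], mode="full")

print(delta_P1.shape)  # (64, 7, 7)
```

The double loop makes the per-map-pair structure explicit; a real implementation would vectorize it, but the shapes (96x6x6 in, 64x7x7 out) are the point here.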
Hope that helps; good luck with the (MATLAB) implementation.