Backpropagation in convolutional neural networks

Consider a convolutional neural network with the following architecture:

[Figure: CNN architecture]

Here C_i refers to the i-th convolutional layer and P_i refers to the i-th average pooling layer. Each layer produces an output. Let delta^P_j denote the error in the output of layer P_j (and likewise for the other layers).

delta^P_2 can be calculated easily with the normal backpropagation equations, since P_2 is fully connected to the softmax layer. delta^C_2 can be calculated simply by upsampling delta^P_2 (and multiplying elementwise by the gradient of the output of C_2), since we are using average pooling.
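For concreteness, here is a minimal numpy sketch (my own illustration, not from the original post) of that upsampling step for a 2x2 average pool:

```python
import numpy as np

def upsample_avg_pool(delta_pooled, p=2):
    """Backward pass of a p x p average pool: replicate each pooled
    delta over its p x p block and scale by 1/(p*p)."""
    return np.kron(delta_pooled, np.ones((p, p))) / (p * p)

delta_P2 = np.arange(4.0).reshape(2, 2)   # toy 2x2 delta at the pooling layer
delta_C2 = upsample_avg_pool(delta_P2)    # 4x4 delta at the conv layer below
# (to finish the step, multiply delta_C2 elementwise by f'(z_C2))
print(delta_C2.shape)  # (4, 4)
```

Note that the total error is preserved: each pooled delta is split evenly, with weight 1/(p*p), over the p x p inputs that produced it.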

But how do I propagate the error from C_2 back to P_1, i.e. how do I calculate delta^P_1 from delta^C_2?

The Stanford UFLDL deep learning tutorial gives the following equation:

delta_k^(l) = upsample((W_k^(l))^T delta_k^(l+1)) .* f'(z_k^(l))

My problems with this equation are:

  • My W_k^(l) is 2x2 while delta_k^(l+1) is 6x6 (here P_1 is 13x13 and P_2 is 6x6), so the dimensions in the product (W_k^(l))^T delta_k^(l+1) do not match.

  • The equation seems to assume a one-to-one mapping between the maps of the two layers, but P_1 has 64 channels and C_2 has 96 channels, so I do not see how the channels are supposed to be connected.

What am I missing here? How does the error actually propagate from C_2 to P_1?

I am working in MATLAB.

+4

2 Answers

The upsampling step is the part that usually causes confusion, so let me spell it out.

With P x P average pooling, each pooled output is the mean of a P x P block of inputs, and "upsampling" simply reverses that mapping: the error of a pooled unit is spread back over the block it came from. "Upsample" is just the name the tutorial gives to this backward pass.

There are two equivalent ways to look at it:

  • Average pooling is just a convolution with a fixed (non-learned) uniform filter, so you can treat the pooling layer as an ordinary convolutional layer and backpropagate through it as usual.

  • Each input unit contributes with weight 1/(P x P) to the pooled output, so by the chain rule each input unit receives 1/(P x P) of the pooled unit's error (replicate the delta over the block and scale it).

Either way it is just standard backpropagation.
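The first bullet above (pooling as a fixed convolution) can be checked numerically; this is my own sketch, not code from the answer:

```python
import numpy as np

def avg_pool(x, p=2):
    # Non-overlapping p x p average pooling via reshaping.
    h, w = x.shape
    return x.reshape(h // p, p, w // p, p).mean(axis=(1, 3))

def avg_pool_as_conv(x, p=2):
    # The same result as a stride-p convolution with a fixed uniform
    # p x p filter whose weights are all 1/(p*p).
    k = np.full((p, p), 1.0 / (p * p))
    h, w = x.shape
    out = np.empty((h // p, w // p))
    for i in range(0, h, p):
        for j in range(0, w, p):
            out[i // p, j // p] = (x[i:i+p, j:j+p] * k).sum()
    return out

x = np.arange(16.0).reshape(4, 4)
print(np.allclose(avg_pool(x), avg_pool_as_conv(x)))  # True
```

Since the uniform filter is fixed, its "gradient" step is skipped, but the error flows back through it exactly as through any other convolution.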

+2

The key point is what W actually is. When you say "P_1 has 64 channels and C_2 has 96 channels" and the filters are 2x2, W is not a single 2x2 matrix: it is a 96x64x2x2 array (a rank-4 tensor: output channels / input channels / filter height / filter width). The "(W)^T" in the equation glosses over this. To get the error at P_1 you take W, flip each 2x2 filter by 180 degrees, convolve the 96x6x6 delta array with the flipped filters in "full" mode, and sum over the 96 output channels; this yields a 64x7x7 array (the "full" part matters, since a full convolution of a 2x2 filter with a 6x6 map gives a 7x7 map). In other words, each element of W connects many pairs of units, and backpropagation sums the contributions over all of them.

After that, multiply elementwise by the gradient of the activation, as in any other layer.

0
