What does the tf.nn.lrn() method do?

Here is a code snippet from the CIFAR-10 tutorial, taken from cifar10.py:

# conv1
with tf.variable_scope('conv1') as scope:
  kernel = _variable_with_weight_decay('weights', shape=[5, 5, 3, 64],
                                       stddev=1e-4, wd=0.0)
  conv = tf.nn.conv2d(images, kernel, [1, 1, 1, 1], padding='SAME')
  biases = _variable_on_cpu('biases', [64], tf.constant_initializer(0.0))
  bias = tf.nn.bias_add(conv, biases)
  conv1 = tf.nn.relu(bias, name=scope.name)
  _activation_summary(conv1)

# pool1
pool1 = tf.nn.max_pool(conv1, ksize=[1, 3, 3, 1], strides=[1, 2, 2, 1],
                       padding='SAME', name='pool1')

# norm1
norm1 = tf.nn.lrn(pool1, 4, bias=1.0, alpha=0.001 / 9.0, beta=0.75,
                  name='norm1')

What does the tf.nn.lrn method do? I can't find its definition in the API documentation at https://www.tensorflow.org/versions/r0.8/api_docs/python/index.html


tf.nn.lrn is short for tf.nn.local_response_normalization, so the documentation you are looking for is here: https://www.tensorflow.org/api_docs/python/tf/nn/local_response_normalization


As nessuno pointed out, tf.nn.lrn is an abbreviation for tf.nn.local_response_normalization (documentation).

In addition, this question provides good resources for further reading on local response normalization layers.

From: http://caffe.berkeleyvision.org/tutorial/layers.html#data-layers

"The local response normalization layer performs a kind of 'lateral inhibition' by normalizing over local input regions. In ACROSS_CHANNELS mode, the local regions extend across nearby channels, but have no spatial extent (i.e., they have shape local_size x 1 x 1). In WITHIN_CHANNEL mode, the local regions extend spatially, but are in separate channels (i.e., they have shape 1 x local_size x local_size). Each input value is divided by (1 + (α/n) Σᵢ xᵢ²)^β, where n is the size of each local region, and the sum is taken over the region centered at that value (zero padding is added where necessary)."
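To make the formula concrete, here is a minimal NumPy sketch (not from the original post) of the cross-channel variant. It follows tf.nn.lrn's parameterization, where alpha multiplies the raw squared sum directly; the Caffe form quoted above folds the 1/n factor into alpha, which is why the tutorial passes alpha=0.001 / 9.0.

```python
import numpy as np

def lrn(x, depth_radius=5, bias=1.0, alpha=1.0, beta=0.5):
    """Local response normalization across channels for an NHWC array.

    Mirrors the formula documented for tf.nn.lrn:
      sqr_sum[a, b, c, d] = sum(x[a, b, c, d-r : d+r+1] ** 2)
      output = x / (bias + alpha * sqr_sum) ** beta
    """
    out = np.empty_like(x, dtype=np.float64)
    depth = x.shape[-1]
    for d in range(depth):
        # Window of channels centered on d, clipped at the edges
        # (equivalent to zero padding, since missing terms add 0).
        lo = max(0, d - depth_radius)
        hi = min(depth, d + depth_radius + 1)
        sqr_sum = np.sum(x[..., lo:hi] ** 2, axis=-1)
        out[..., d] = x[..., d] / (bias + alpha * sqr_sum) ** beta
    return out
```

For example, on an all-ones input with three channels and depth_radius=1, the middle channel sees a squared sum of 3 and the edge channels see 2, so the output is [1/√3, 1/2, 1/√3] per pixel.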

These layers have fallen out of fashion because in practice they turned out to have very little effect on the results, while other techniques proved more useful.

