local-response-normalization


A form of normalization used in AlexNet. It is applied per pixel across the results from different filters: if a convolution has $N$ output channels, the normalized value at a given pixel depends only on the values at that same pixel across the $N$ channels. Specifically, writing the activity of the pixel at position $(x, y)$ in channel $i$ as $a_{x,y}^i$, the normalized output is:

\begin{equation}
b_{x,y}^i = a_{x,y}^i \Big/ \left(k + \alpha \sum_{j=\max(0,\, i-n/2)}^{\min(N-1,\, i+n/2)} \left(a_{x,y}^j\right)^2\right)^{\beta}
\end{equation}

which is taken directly from the original AlexNet paper, p. 4. Here $N$ is the total number of filters, $n$ is the number of adjacent filter maps summed over, and $k$, $\alpha$, and $\beta$ are hyperparameters; the paper uses $k = 2$, $n = 5$, $\alpha = 10^{-4}$, and $\beta = 0.75$.
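
As a concrete reference, here is a minimal NumPy sketch of the formula, looping over channels. The function name `local_response_norm` and the (batch, channel, height, width) array layout are assumptions for illustration; the default hyperparameters are the values reported in the paper.

```python
import numpy as np

def local_response_norm(a, n=5, k=2.0, alpha=1e-4, beta=0.75):
    """Local response normalization over the channel axis.

    A minimal sketch assuming `a` has shape (batch, channels, height, width);
    the defaults (n, k, alpha, beta) are the hyperparameters from the paper.
    """
    N = a.shape[1]  # total number of channels / filters
    b = np.empty_like(a)
    for i in range(N):
        # Sum of squares over the n neighbouring channels centred on i,
        # clipped to the valid channel range [0, N-1].
        lo = max(0, i - n // 2)
        hi = min(N - 1, i + n // 2)
        denom = (k + alpha * np.sum(a[:, lo:hi + 1] ** 2, axis=1)) ** beta
        b[:, i] = a[:, i] / denom
    return b
```

Because the sum runs only over nearby channels at the same spatial location, each output pixel is normalized by the activity of "neighbouring" filter maps, not by other spatial positions.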