Experiments of Residual Network with Squeeze and Excitation

With the rapid development of deep learning, networks' feature extraction and recognition abilities have advanced enormously. Networks were able to go truly deep thanks to the appearance of residual connections in deep convolutional neural networks. A residual link connects the input of a convolutional block directly to its output, mitigating the vanishing gradient problem: during backpropagation, gradients can flow through the shortcut without attenuation instead of being reduced as they pass through the convolutional layers. For more details, see the original paper (Link).
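To make the shortcut concrete, here is a minimal sketch of a residual block, assuming PyTorch; the layer sizes and the `ResidualBlock` name are illustrative, not taken from the paper:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Minimal residual block: output = F(x) + x."""

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Shortcut: add the block's input to its output, giving
        # gradients an unattenuated path past the conv layers.
        return self.relu(out + x)

x = torch.randn(1, 64, 8, 8)
y = ResidualBlock(64)(x)
```

Because the shortcut is a plain addition, the block's input and output must have matching shapes (here `[1, 64, 8, 8]`); ResNet handles dimension changes with a projection shortcut, which this sketch omits.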


(Figure from the Squeeze-and-Excitation Networks paper.)
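The SE block shown in the figure recalibrates channels in three steps: squeeze (global average pooling), excitation (a bottleneck MLP ending in a sigmoid), and channel-wise scaling. A minimal sketch, again assuming PyTorch, with the `reduction` ratio of 16 taken from the SE paper's default:

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation: reweight channels by learned importance."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        # Squeeze: global average pool each channel to a single value.
        s = x.mean(dim=(2, 3))
        # Excitation: bottleneck MLP yields per-channel weights in (0, 1).
        w = self.fc(s).view(b, c, 1, 1)
        # Scale: recalibrate the feature maps channel-wise.
        return x * w

x = torch.randn(2, 64, 8, 8)
y = SEBlock(64)(x)
```

In an SE-ResNet, this block is inserted inside each residual block, scaling the branch output just before the shortcut addition.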
