With the rapid development of deep learning, networks have advanced enormously in their ability to extract and recognize features. Networks became truly deep with the appearance of the residual connection in deep convolutional neural networks. A residual link connects the beginning and the end of a convolutional block, mitigating the vanishing-gradient problem: during backpropagation, gradients flow directly through the identity path without being attenuated as they would be when passing through the convolutional layers. For more details, see the original paper (Link).
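A minimal sketch of the idea, using a plain linear map as a hypothetical stand-in for the convolutional block (names `conv_block` and `residual_block` are illustrative, not from any specific library): the output is `y = x + F(x)`, so even if the block contributes nothing, the identity path still carries the input, and in backpropagation the gradient, straight through.

```python
import numpy as np

def conv_block(x, w):
    # Hypothetical stand-in for a convolutional block: a simple linear map F(x) = Wx.
    return w @ x

def residual_block(x, w):
    # y = x + F(x): the identity path adds the input to the block's output,
    # so the Jacobian dy/dx = I + W never collapses to zero even when W -> 0.
    return x + conv_block(x, w)

x = np.array([1.0, 2.0])
w = np.zeros((2, 2))          # a block that has learned nothing...
y = residual_block(x, w)      # ...still passes the input through unchanged
```

Because the Jacobian of the block is `I + W` rather than `W` alone, stacking many such blocks multiplies factors close to the identity instead of factors that can shrink the gradient toward zero, which is what lets very deep networks train.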
TO BE CONTINUED
Picture from the Squeeze-and-Excitation Networks paper