The ResNet residual component has two types of blocks: the Identity Block and the Convolutional Block.
H(x) = F(x, {Wi}) + x, where x and H(x) are the input and output vectors of the layers, F(x, {Wi}) is the residual mapping function, and {Wi} represents the convolutional layer weights. The input x is added to the result of F through the shortcut connection path.
Figure 1 illustrates the structure and detailed process of the Identity Block. The input vector x follows two paths. The first is the main path, where x is transformed by the residual mapping function F, that is, F(x, {Wi}) is computed. The main path usually consists of two typical weight layers, each composed of a 2D convolutional layer and a Batch-Norm layer that normalizes along the channel axis. After the input x is multiplied by the weights {Wi} in the first layer, the nonlinear activation function ReLU is applied, and the result is then fed into the next weight layer. At the same time, the input x is fed into the other path, called the shortcut path. In the shortcut connection, x is added directly to the output of the main path. Finally, the combined result H(x) is passed through the ReLU activation function.
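To make this data flow concrete, the following is a minimal sketch of an Identity Block in PyTorch. The framework choice and the names and parameters (IdentityBlock, channels, kernel_size) are illustrative assumptions, not taken from the original article.

```python
import torch
import torch.nn as nn

class IdentityBlock(nn.Module):
    """Identity Block sketch: two Conv2d + BatchNorm weight layers on the main
    path, with the input x added back unchanged through the shortcut path."""
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        padding = kernel_size // 2  # keep the spatial size so x can be added directly
        # First weight layer: 2D convolution + Batch-Norm over the channel axis
        self.conv1 = nn.Conv2d(channels, channels, kernel_size, padding=padding, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        # Second weight layer
        self.conv2 = nn.Conv2d(channels, channels, kernel_size, padding=padding, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # Main path: F(x, {Wi}), with ReLU between the two weight layers
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Shortcut path: add x directly, then apply ReLU, i.e. H(x) = ReLU(F(x) + x)
        return self.relu(out + x)
```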
H(x) = F(x, {Wi}) + Ws x, where x and H(x) are the input and output vectors of the layers and F(x, {Wi}) is the residual mapping function. Different from the Identity Block, in the Convolutional Block the input x in the shortcut path is also transformed by a 2D convolutional layer; Ws denotes the weights of this convolutional layer in the shortcut path.
Figure 1. Identity Block Structure
Figure 2 shows the structure and process of the Convolutional Block. The only difference from Fig. 1 is the shortcut path. In the Identity Block, the input x is added directly to the output of the main path; in the Convolutional Block, however, the input x is first fed into a weight layer in the shortcut path, and the result is then combined with the output of the main path.
Figure 2. Convolutional Block Structure
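A corresponding sketch of the Convolutional Block is shown below, again assuming PyTorch; the class and parameter names (ConvolutionalBlock, in_channels, out_channels, stride) are illustrative, and placing a Batch-Norm layer after the shortcut convolution is a common choice assumed here rather than stated in the article.

```python
import torch
import torch.nn as nn

class ConvolutionalBlock(nn.Module):
    """Convolutional Block sketch: same two-weight-layer main path as the
    Identity Block, but the shortcut path applies a convolution Ws to x,
    so H(x) = ReLU(F(x, {Wi}) + Ws x)."""
    def __init__(self, in_channels, out_channels, stride=1, kernel_size=3):
        super().__init__()
        padding = kernel_size // 2
        # Main path: two Conv2d + BatchNorm weight layers
        self.conv1 = nn.Conv2d(in_channels, out_channels, kernel_size,
                               stride=stride, padding=padding, bias=False)
        self.bn1 = nn.BatchNorm2d(out_channels)
        self.conv2 = nn.Conv2d(out_channels, out_channels, kernel_size,
                               padding=padding, bias=False)
        self.bn2 = nn.BatchNorm2d(out_channels)
        # Shortcut path: 1x1 convolution (Ws) + BatchNorm so the shapes match
        self.shortcut = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, kernel_size=1, stride=stride, bias=False),
            nn.BatchNorm2d(out_channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # Main path: F(x, {Wi})
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Shortcut path transforms x before the addition
        return self.relu(out + self.shortcut(x))
```

As an illustrative usage, ConvolutionalBlock(64, 128, stride=2) maps an input of shape (N, 64, 56, 56) to an output of shape (N, 128, 28, 28); this is why the Convolutional Block, rather than the Identity Block, is used where the channel count or spatial size changes, since a plain additive shortcut would not match shapes there.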