The "18" in ResNet18 counts the 18 layers that carry learned weights, i.e. the convolutional layers and the fully connected layer; pooling and BN layers are not included in the count.
(Structure diagram from the ResNet paper.)
Structure walkthrough:
- The first layer is a 7×7 convolution with stride 2 and padding 3, followed by BN, ReLU, and max pooling. Together these make up the first convolutional stage, conv1.
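A minimal sketch of this conv1 stem (the `stem` name and the standalone `nn.Sequential` packaging are my own; torchvision defines these layers directly on the `ResNet` module):

```python
import torch
import torch.nn as nn

# conv1 stage: 7x7 conv with stride 2 and padding 3, then BN, ReLU,
# and a 3x3 max pool with stride 2.
stem = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3, bias=False),
    nn.BatchNorm2d(64),
    nn.ReLU(inplace=True),
    nn.MaxPool2d(kernel_size=3, stride=2, padding=1),
)

x = torch.randn(1, 3, 224, 224)   # standard ImageNet-sized input
y = stem(x)
print(y.shape)                    # conv halves 224 -> 112, max pool halves 112 -> 56
```

With a 224×224 input, the stem produces a 64×56×56 feature map, which is what the first stage of building blocks receives.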
- Then come four stages; in the code each stage is generated by make_layer(). Every stage stacks several modules called building blocks; for resnet18 the per-stage block counts are [2, 2, 2, 2], giving 8 building blocks in total. Note that there are two block types, BasicBlock and Bottleneck: resnet18 and resnet34 use BasicBlock, while resnet50 and deeper use Bottleneck. Both block types rely on the shortcut connection:
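As a quick sanity check (my own arithmetic, not from the source), the [2, 2, 2, 2] configuration recovers the "18 weighted layers" from the opening line:

```python
# ResNet18: 1 stem conv + (2 convs per BasicBlock) * 8 blocks + 1 fc = 18
blocks_per_stage = [2, 2, 2, 2]      # make_layer() counts for the four stages
num_blocks = sum(blocks_per_stage)   # 8 building blocks
weighted_layers = 1 + 2 * num_blocks + 1
print(num_blocks, weighted_layers)   # 8 18
```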
- The BasicBlock architecture runs the input through two 3×3 convolutions, each followed by BN. The line `out += residual` then adds the input x back onto the output (note that `residual = x` is saved at the start of `forward`):

```python
import torch.nn as nn

def conv3x3(in_planes, out_planes, stride=1):
    """3x3 convolution with padding (helper defined in torchvision's resnet)."""
    return nn.Conv2d(in_planes, out_planes, kernel_size=3, stride=stride,
                     padding=1, bias=False)

class BasicBlock(nn.Module):
    expansion = 1

    def __init__(self, inplanes, planes, stride=1, downsample=None):
        super(BasicBlock, self).__init__()
        self.conv1 = conv3x3(inplanes, planes, stride)
        self.bn1 = nn.BatchNorm2d(planes)
        self.relu = nn.ReLU(inplace=True)
        self.conv2 = conv3x3(planes, planes)
        self.bn2 = nn.BatchNorm2d(planes)
        self.downsample = downsample
        self.stride = stride

    def forward(self, x):
        residual = x

        out = self.conv1(x)
        out = self.bn1(out)
        out = self.relu(out)

        out = self.conv2(out)
        out = self.bn2(out)

        if self.downsample is not None:
            residual = self.downsample(x)

        out += residual
        out = self.relu(out)

        return out
```
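The `downsample` argument deserves a note: when `stride != 1` or the channel count changes, the identity `x` no longer matches `out` in shape, so the addition would fail. torchvision's `_make_layer` therefore builds a projection shortcut (1×1 conv + BN) and passes it in as `downsample`. A minimal sketch, with concrete sizes of my own choosing that match the stage-1 → stage-2 transition of resnet18:

```python
import torch
import torch.nn as nn

x = torch.randn(1, 64, 56, 56)       # input to the first block of stage 2
downsample = nn.Sequential(          # projection shortcut: 1x1 conv + BN
    nn.Conv2d(64, 128, kernel_size=1, stride=2, bias=False),
    nn.BatchNorm2d(128),
)
residual = downsample(x)             # reshaped to (1, 128, 28, 28)
out = torch.randn(1, 128, 28, 28)    # stand-in for the block's main-path output
out = out + residual                 # shapes now match, so `out += residual` works
print(out.shape)
```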
- The Bottleneck block uses a 1×1, 3×3, 1×1 sequence of convolutions, which substantially reduces the parameter count. The first 1×1 conv shrinks the channel count (to half of the input in the first block of stages 2–4, to a quarter in the remaining blocks), the 3×3 conv keeps that reduced width, and the final 1×1 conv expands it to 4× the bottleneck width. So the first block of a stage doubles the channel count overall, while the later blocks of a stage leave it unchanged:

```python
import torch.nn as nn

class Bottleneck(nn.Module):
    expansion = 4

    def __init__(self, inplanes, planes, stride=1, downsample=None):
        super(Bottleneck, self).__init__()
        self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=1, bias=False)
        self.bn1 = nn.BatchNorm2d(planes)
        self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, stride=stride,
                               padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(planes)
        self.conv3 = nn.Conv2d(planes, planes * 4, kernel_size=1, bias=False)
        self.bn3 = nn.BatchNorm2d(planes * 4)
        self.relu = nn.ReLU(inplace=True)
        self.downsample = downsample
        self.stride = stride

    def forward(self, x):
        residual = x

        out = self.conv1(x)
        out = self.bn1(out)
        out = self.relu(out)

        out = self.conv2(out)
        out = self.bn2(out)
        out = self.relu(out)

        out = self.conv3(out)
        out = self.bn3(out)

        if self.downsample is not None:
            residual = self.downsample(x)

        out += residual
        out = self.relu(out)

        return out
```
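To make the parameter saving concrete, here is a back-of-the-envelope weight count (my own arithmetic) comparing one Bottleneck at ResNet50's conv2_x width against a hypothetical block built from two full-width 3×3 convs:

```python
# Weight counts only; all convs have bias=False. ResNet50 conv2_x: 256 in, 64 bottleneck.
inplanes, planes = 256, 64

# Bottleneck: 1x1 reduce, 3x3 at reduced width, 1x1 expand (expansion = 4)
bottleneck = (inplanes * planes * 1 * 1          # 1x1: 256 -> 64
              + planes * planes * 3 * 3          # 3x3: 64 -> 64
              + planes * (planes * 4) * 1 * 1)   # 1x1: 64 -> 256

# Hypothetical alternative: two 3x3 convs at the full 256-channel width
plain = 2 * inplanes * inplanes * 3 * 3

print(bottleneck, plain)   # 69632 vs 1179648: roughly 17x fewer weights
```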
- Finally, an avgpool and a single fully connected layer map the features to 1000 dimensions (because ImageNet has 1000 classes).
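A sketch of this classification head (the original torchvision code used `nn.AvgPool2d(7)`; `nn.AdaptiveAvgPool2d` is the current equivalent, and the variable names here are mine):

```python
import torch
import torch.nn as nn

# Head: global average pooling, then a linear map to the 1000 ImageNet classes.
# The fc input width is 512 * expansion: 512 for BasicBlock-based resnet18/34,
# 2048 for Bottleneck-based resnet50 and deeper.
avgpool = nn.AdaptiveAvgPool2d((1, 1))
fc = nn.Linear(512, 1000)              # resnet18: expansion = 1

feat = torch.randn(1, 512, 7, 7)       # last-stage feature map for a 224x224 input
out = fc(torch.flatten(avgpool(feat), 1))
print(out.shape)                       # torch.Size([1, 1000])
```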
Structure diagrams and more details:
See https://www.jianshu.com/p/085f4c8256f1
and https://blog.csdn.net/weixin_40548136/article/details/88820996
(ResNet18 structure diagram)
(ResNet50 structure diagram)