Why doesn't my AlexNet trained on CIFAR10 work?

Problem Description

I'm trying to implement AlexNet and train it on CIFAR10, but the loss does not decrease. Can you tell me what might be wrong?

import torch.nn as nn

class Alexnet(nn.Module):
  def __init__(self):
    super().__init__()
    self.conv1=nn.Sequential(nn.Conv2d(3,64,3,2,1),nn.ReLU(),nn.MaxPool2d(2))            # 3x32x32 -> 64x8x8
    self.conv2=nn.Sequential(nn.Conv2d(64,192,3,1,padding=2),nn.MaxPool2d(2))            # 64x8x8 -> 192x5x5
    self.conv3=nn.Sequential(nn.Conv2d(192,384,3,1,padding=1),nn.ReLU())                 # 192x5x5 -> 384x5x5
    self.conv4=nn.Sequential(nn.Conv2d(384,256,3,1,padding=1),nn.ReLU())                 # 384x5x5 -> 256x5x5
    self.conv5=nn.Sequential(nn.Conv2d(256,256,3,1,padding=1),nn.ReLU(),nn.MaxPool2d(2)) # 256x5x5 -> 256x2x2
    self.dropout=nn.Dropout(0.3)
    self.fc1=nn.Sequential(nn.Flatten(),nn.Linear(256*2*2,4096),nn.ReLU())
    self.fc2=nn.Sequential(nn.Linear(4096,4096),nn.ReLU())
    self.fc3=nn.Sequential(nn.Linear(4096,10),nn.ReLU())
  def forward(self,x):
    x=self.conv1(x)
    x=self.conv2(x)
    x=self.conv3(x)
    x=self.conv4(x)
    x=self.conv5(x)
    x=self.dropout(x)
    x=self.fc1(x)
    x=self.dropout(x)
    x=self.fc2(x)
    x=self.fc3(x)
    return x

Tags: pytorch

Solution
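
The training loop is not shown, so the notes below only cover the model definition. The most likely culprit is the nn.ReLU() at the end of fc3: it clamps the logits to be nonnegative, so nn.CrossEntropyLoss cannot push the scores of wrong classes down and the loss tends to plateau. A secondary issue is that conv2 has no activation function, so conv2 and conv3 are stacked with no nonlinearity between them. Below is a sketch of the model with only those two changes; everything else is kept as posted, and the training setup is assumed rather than taken from the question.

import torch.nn as nn

class Alexnet(nn.Module):
  def __init__(self):
    super().__init__()
    self.conv1=nn.Sequential(nn.Conv2d(3,64,3,2,1),nn.ReLU(),nn.MaxPool2d(2))
    # add the missing ReLU between the convolution and the pooling
    self.conv2=nn.Sequential(nn.Conv2d(64,192,3,1,padding=2),nn.ReLU(),nn.MaxPool2d(2))
    self.conv3=nn.Sequential(nn.Conv2d(192,384,3,1,padding=1),nn.ReLU())
    self.conv4=nn.Sequential(nn.Conv2d(384,256,3,1,padding=1),nn.ReLU())
    self.conv5=nn.Sequential(nn.Conv2d(256,256,3,1,padding=1),nn.ReLU(),nn.MaxPool2d(2))
    self.dropout=nn.Dropout(0.3)
    self.fc1=nn.Sequential(nn.Flatten(),nn.Linear(256*2*2,4096),nn.ReLU())
    self.fc2=nn.Sequential(nn.Linear(4096,4096),nn.ReLU())
    # return raw logits: no ReLU on the output layer when using nn.CrossEntropyLoss
    self.fc3=nn.Linear(4096,10)
  def forward(self,x):
    x=self.conv1(x)
    x=self.conv2(x)
    x=self.conv3(x)
    x=self.conv4(x)
    x=self.conv5(x)
    x=self.dropout(x)
    x=self.fc1(x)
    x=self.dropout(x)
    x=self.fc2(x)
    x=self.fc3(x)
    return x

If the loss still does not move after these changes, the learning rate and the optimizer setup in the (unshown) training loop are the next things to check.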


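For a sanity check, here is a hypothetical minimal training loop, since the original one is not included in the question. The data pipeline, optimizer, learning rate, and batch size below are assumptions, not taken from the question; the Alexnet class is the corrected one above.

import torch
import torch.nn as nn
import torchvision
import torchvision.transforms as transforms

# hypothetical setup: standard CIFAR10 pipeline with commonly used normalization statistics
transform=transforms.Compose([transforms.ToTensor(),
                              transforms.Normalize((0.4914,0.4822,0.4465),(0.2470,0.2435,0.2616))])
trainset=torchvision.datasets.CIFAR10(root='./data',train=True,download=True,transform=transform)
trainloader=torch.utils.data.DataLoader(trainset,batch_size=128,shuffle=True)

device=torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model=Alexnet().to(device)
criterion=nn.CrossEntropyLoss()   # expects raw logits, so fc3 must not end in ReLU
optimizer=torch.optim.SGD(model.parameters(),lr=0.01,momentum=0.9)

for epoch in range(10):
  running_loss=0.0
  for images,labels in trainloader:
    images,labels=images.to(device),labels.to(device)
    optimizer.zero_grad()         # clear gradients accumulated in the previous step
    loss=criterion(model(images),labels)
    loss.backward()
    optimizer.step()
    running_loss+=loss.item()
  print(f'epoch {epoch}: loss {running_loss/len(trainloader):.4f}')

If the loss decreases with a loop like this but not with the original training code, compare how the loss function, optimizer.zero_grad(), and optimizer.step() are wired there.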