model - Can I use a pretrained model with a different number of input channels?
Problem Description
The pretrained model expects input of this shape:
[batch_size, channels, depth, height, width]
[32, 3, 16, 224, 224]
I want to feed it this instead:
[batch_size, channels, depth, height, width]
[32, 2, 16, 224, 224]
which produces the following error:
It says the pretrained model's first-layer weight has shape [16, 3, 1, 3, 3]: the layer expects 3 input channels but received 2. I also passed strict=False to load_state_dict, but it still raises this error. Do I have to go through every weight and shrink the channel dimension myself, or should I train the model from scratch?
Here are the pretrained model's layers and their weight shapes:
conv1.conv_1.conv3d.weight : torch.Size([16, 3, 1, 3, 3])
conv1.conv_1.norm.weight : torch.Size([16])
conv1.conv_1.norm.bias : torch.Size([16])
blocks.b0_l0.alpha : torch.Size([])
blocks.b0_l0.expand.conv_1.conv3d.weight : torch.Size([40, 16, 1, 1, 1])
blocks.b0_l0.expand.conv_1.norm.weight : torch.Size([40])
blocks.b0_l0.expand.conv_1.norm.bias : torch.Size([40])
blocks.b0_l0.deep.conv_1.conv3d.weight : torch.Size([40, 1, 1, 5, 5])
blocks.b0_l0.deep.conv_1.norm.weight : torch.Size([40])
blocks.b0_l0.deep.conv_1.norm.bias : torch.Size([40])
blocks.b0_l0.se.fc1.conv_1.conv3d.weight : torch.Size([16, 40, 1, 1, 1])
blocks.b0_l0.se.fc1.conv_1.conv3d.bias : torch.Size([16])
blocks.b0_l0.se.fc2.conv_1.conv3d.weight : torch.Size([40, 16, 1, 1, 1])
blocks.b0_l0.se.fc2.conv_1.conv3d.bias : torch.Size([40])
blocks.b0_l0.project.conv_1.conv3d.weight : torch.Size([16, 40, 1, 1, 1])
blocks.b0_l0.project.conv_1.norm.weight : torch.Size([16])
blocks.b0_l0.project.conv_1.norm.bias : torch.Size([16])
blocks.b0_l0.res.1.conv_1.conv3d.weight : torch.Size([16, 16, 1, 1, 1])
blocks.b0_l0.res.1.conv_1.norm.weight : torch.Size([16])
blocks.b0_l0.res.1.conv_1.norm.bias : torch.Size([16])
blocks.b0_l1.alpha : torch.Size([])
blocks.b0_l1.expand.conv_1.conv3d.weight : torch.Size([40, 16, 1, 1, 1])
blocks.b0_l1.expand.conv_1.norm.weight : torch.Size([40])
blocks.b0_l1.expand.conv_1.norm.bias : torch.Size([40])
blocks.b0_l1.deep.conv_1.conv3d.weight : torch.Size([40, 1, 3, 3, 3])
blocks.b0_l1.deep.conv_1.norm.weight : torch.Size([40])
blocks.b0_l1.deep.conv_1.norm.bias : torch.Size([40])
blocks.b0_l1.se.fc1.conv_1.conv3d.weight : torch.Size([16, 40, 1, 1, 1])
blocks.b0_l1.se.fc1.conv_1.conv3d.bias : torch.Size([16])
blocks.b0_l1.se.fc2.conv_1.conv3d.weight : torch.Size([40, 16, 1, 1, 1])
blocks.b0_l1.se.fc2.conv_1.conv3d.bias : torch.Size([40])
blocks.b0_l1.project.conv_1.conv3d.weight : torch.Size([16, 40, 1, 1, 1])
blocks.b0_l1.project.conv_1.norm.weight : torch.Size([16])
blocks.b0_l1.project.conv_1.norm.bias : torch.Size([16])
blocks.b0_l2.alpha : torch.Size([])
blocks.b0_l2.expand.conv_1.conv3d.weight : torch.Size([64, 16, 1, 1, 1])
blocks.b0_l2.expand.conv_1.norm.weight : torch.Size([64])
blocks.b0_l2.expand.conv_1.norm.bias : torch.Size([64])
blocks.b0_l2.deep.conv_1.conv3d.weight : torch.Size([64, 1, 3, 3, 3])
blocks.b0_l2.deep.conv_1.norm.weight : torch.Size([64])
blocks.b0_l2.deep.conv_1.norm.bias : torch.Size([64])
blocks.b0_l2.se.fc1.conv_1.conv3d.weight : torch.Size([16, 64, 1, 1, 1])
blocks.b0_l2.se.fc1.conv_1.conv3d.bias : torch.Size([16])
blocks.b0_l2.se.fc2.conv_1.conv3d.weight : torch.Size([64, 16, 1, 1, 1])
blocks.b0_l2.se.fc2.conv_1.conv3d.bias : torch.Size([64])
blocks.b0_l2.project.conv_1.conv3d.weight : torch.Size([16, 64, 1, 1, 1])
blocks.b0_l2.project.conv_1.norm.weight : torch.Size([16])
blocks.b0_l2.project.conv_1.norm.bias : torch.Size([16])
blocks.b1_l0.alpha : torch.Size([])
blocks.b1_l0.expand.conv_1.conv3d.weight : torch.Size([96, 16, 1, 1, 1])
blocks.b1_l0.expand.conv_1.norm.weight : torch.Size([96])
blocks.b1_l0.expand.conv_1.norm.bias : torch.Size([96])
blocks.b1_l0.deep.conv_1.conv3d.weight : torch.Size([96, 1, 3, 3, 3])
blocks.b1_l0.deep.conv_1.norm.weight : torch.Size([96])
blocks.b1_l0.deep.conv_1.norm.bias : torch.Size([96])
blocks.b1_l0.se.fc1.conv_1.conv3d.weight : torch.Size([24, 96, 1, 1, 1])
blocks.b1_l0.se.fc1.conv_1.conv3d.bias : torch.Size([24])
blocks.b1_l0.se.fc2.conv_1.conv3d.weight : torch.Size([96, 24, 1, 1, 1])
blocks.b1_l0.se.fc2.conv_1.conv3d.bias : torch.Size([96])
blocks.b1_l0.project.conv_1.conv3d.weight : torch.Size([40, 96, 1, 1, 1])
blocks.b1_l0.project.conv_1.norm.weight : torch.Size([40])
blocks.b1_l0.project.conv_1.norm.bias : torch.Size([40])
blocks.b1_l0.res.1.conv_1.conv3d.weight : torch.Size([40, 16, 1, 1, 1])
blocks.b1_l0.res.1.conv_1.norm.weight : torch.Size([40])
blocks.b1_l0.res.1.conv_1.norm.bias : torch.Size([40])
blocks.b1_l1.alpha : torch.Size([])
blocks.b1_l1.expand.conv_1.conv3d.weight : torch.Size([120, 40, 1, 1, 1])
blocks.b1_l1.expand.conv_1.norm.weight : torch.Size([120])
blocks.b1_l1.expand.conv_1.norm.bias : torch.Size([120])
blocks.b1_l1.deep.conv_1.conv3d.weight : torch.Size([120, 1, 3, 3, 3])
blocks.b1_l1.deep.conv_1.norm.weight : torch.Size([120])
blocks.b1_l1.deep.conv_1.norm.bias : torch.Size([120])
blocks.b1_l1.se.fc1.conv_1.conv3d.weight : torch.Size([32, 120, 1, 1, 1])
blocks.b1_l1.se.fc1.conv_1.conv3d.bias : torch.Size([32])
blocks.b1_l1.se.fc2.conv_1.conv3d.weight : torch.Size([120, 32, 1, 1, 1])
blocks.b1_l1.se.fc2.conv_1.conv3d.bias : torch.Size([120])
blocks.b1_l1.project.conv_1.conv3d.weight : torch.Size([40, 120, 1, 1, 1])
blocks.b1_l1.project.conv_1.norm.weight : torch.Size([40])
blocks.b1_l1.project.conv_1.norm.bias : torch.Size([40])
blocks.b1_l2.alpha : torch.Size([])
blocks.b1_l2.expand.conv_1.conv3d.weight : torch.Size([96, 40, 1, 1, 1])
blocks.b1_l2.expand.conv_1.norm.weight : torch.Size([96])
blocks.b1_l2.expand.conv_1.norm.bias : torch.Size([96])
blocks.b1_l2.deep.conv_1.conv3d.weight : torch.Size([96, 1, 3, 3, 3])
blocks.b1_l2.deep.conv_1.norm.weight : torch.Size([96])
blocks.b1_l2.deep.conv_1.norm.bias : torch.Size([96])
blocks.b1_l2.se.fc1.conv_1.conv3d.weight : torch.Size([24, 96, 1, 1, 1])
blocks.b1_l2.se.fc1.conv_1.conv3d.bias : torch.Size([24])
blocks.b1_l2.se.fc2.conv_1.conv3d.weight : torch.Size([96, 24, 1, 1, 1])
blocks.b1_l2.se.fc2.conv_1.conv3d.bias : torch.Size([96])
blocks.b1_l2.project.conv_1.conv3d.weight : torch.Size([40, 96, 1, 1, 1])
blocks.b1_l2.project.conv_1.norm.weight : torch.Size([40])
blocks.b1_l2.project.conv_1.norm.bias : torch.Size([40])
blocks.b1_l3.alpha : torch.Size([])
blocks.b1_l3.expand.conv_1.conv3d.weight : torch.Size([96, 40, 1, 1, 1])
blocks.b1_l3.expand.conv_1.norm.weight : torch.Size([96])
blocks.b1_l3.expand.conv_1.norm.bias : torch.Size([96])
blocks.b1_l3.deep.conv_1.conv3d.weight : torch.Size([96, 1, 3, 3, 3])
blocks.b1_l3.deep.conv_1.norm.weight : torch.Size([96])
blocks.b1_l3.deep.conv_1.norm.bias : torch.Size([96])
blocks.b1_l3.se.fc1.conv_1.conv3d.weight : torch.Size([24, 96, 1, 1, 1])
blocks.b1_l3.se.fc1.conv_1.conv3d.bias : torch.Size([24])
blocks.b1_l3.se.fc2.conv_1.conv3d.weight : torch.Size([96, 24, 1, 1, 1])
blocks.b1_l3.se.fc2.conv_1.conv3d.bias : torch.Size([96])
blocks.b1_l3.project.conv_1.conv3d.weight : torch.Size([40, 96, 1, 1, 1])
blocks.b1_l3.project.conv_1.norm.weight : torch.Size([40])
blocks.b1_l3.project.conv_1.norm.bias : torch.Size([40])
blocks.b1_l4.alpha : torch.Size([])
blocks.b1_l4.expand.conv_1.conv3d.weight : torch.Size([120, 40, 1, 1, 1])
blocks.b1_l4.expand.conv_1.norm.weight : torch.Size([120])
blocks.b1_l4.expand.conv_1.norm.bias : torch.Size([120])
blocks.b1_l4.deep.conv_1.conv3d.weight : torch.Size([120, 1, 3, 3, 3])
blocks.b1_l4.deep.conv_1.norm.weight : torch.Size([120])
blocks.b1_l4.deep.conv_1.norm.bias : torch.Size([120])
blocks.b1_l4.se.fc1.conv_1.conv3d.weight : torch.Size([32, 120, 1, 1, 1])
blocks.b1_l4.se.fc1.conv_1.conv3d.bias : torch.Size([32])
blocks.b1_l4.se.fc2.conv_1.conv3d.weight : torch.Size([120, 32, 1, 1, 1])
blocks.b1_l4.se.fc2.conv_1.conv3d.bias : torch.Size([120])
blocks.b1_l4.project.conv_1.conv3d.weight : torch.Size([40, 120, 1, 1, 1])
blocks.b1_l4.project.conv_1.norm.weight : torch.Size([40])
blocks.b1_l4.project.conv_1.norm.bias : torch.Size([40])
blocks.b2_l0.alpha : torch.Size([])
blocks.b2_l0.expand.conv_1.conv3d.weight : torch.Size([240, 40, 1, 1, 1])
blocks.b2_l0.expand.conv_1.norm.weight : torch.Size([240])
blocks.b2_l0.expand.conv_1.norm.bias : torch.Size([240])
blocks.b2_l0.deep.conv_1.conv3d.weight : torch.Size([240, 1, 5, 3, 3])
blocks.b2_l0.deep.conv_1.norm.weight : torch.Size([240])
blocks.b2_l0.deep.conv_1.norm.bias : torch.Size([240])
blocks.b2_l0.se.fc1.conv_1.conv3d.weight : torch.Size([64, 240, 1, 1, 1])
blocks.b2_l0.se.fc1.conv_1.conv3d.bias : torch.Size([64])
blocks.b2_l0.se.fc2.conv_1.conv3d.weight : torch.Size([240, 64, 1, 1, 1])
blocks.b2_l0.se.fc2.conv_1.conv3d.bias : torch.Size([240])
blocks.b2_l0.project.conv_1.conv3d.weight : torch.Size([72, 240, 1, 1, 1])
blocks.b2_l0.project.conv_1.norm.weight : torch.Size([72])
blocks.b2_l0.project.conv_1.norm.bias : torch.Size([72])
blocks.b2_l0.res.1.conv_1.conv3d.weight : torch.Size([72, 40, 1, 1, 1])
blocks.b2_l0.res.1.conv_1.norm.weight : torch.Size([72])
blocks.b2_l0.res.1.conv_1.norm.bias : torch.Size([72])
blocks.b2_l1.alpha : torch.Size([])
blocks.b2_l1.expand.conv_1.conv3d.weight : torch.Size([160, 72, 1, 1, 1])
blocks.b2_l1.expand.conv_1.norm.weight : torch.Size([160])
blocks.b2_l1.expand.conv_1.norm.bias : torch.Size([160])
blocks.b2_l1.deep.conv_1.conv3d.weight : torch.Size([160, 1, 3, 3, 3])
blocks.b2_l1.deep.conv_1.norm.weight : torch.Size([160])
blocks.b2_l1.deep.conv_1.norm.bias : torch.Size([160])
blocks.b2_l1.se.fc1.conv_1.conv3d.weight : torch.Size([40, 160, 1, 1, 1])
blocks.b2_l1.se.fc1.conv_1.conv3d.bias : torch.Size([40])
blocks.b2_l1.se.fc2.conv_1.conv3d.weight : torch.Size([160, 40, 1, 1, 1])
blocks.b2_l1.se.fc2.conv_1.conv3d.bias : torch.Size([160])
blocks.b2_l1.project.conv_1.conv3d.weight : torch.Size([72, 160, 1, 1, 1])
blocks.b2_l1.project.conv_1.norm.weight : torch.Size([72])
blocks.b2_l1.project.conv_1.norm.bias : torch.Size([72])
blocks.b2_l2.alpha : torch.Size([])
blocks.b2_l2.expand.conv_1.conv3d.weight : torch.Size([240, 72, 1, 1, 1])
blocks.b2_l2.expand.conv_1.norm.weight : torch.Size([240])
blocks.b2_l2.expand.conv_1.norm.bias : torch.Size([240])
blocks.b2_l2.deep.conv_1.conv3d.weight : torch.Size([240, 1, 3, 3, 3])
blocks.b2_l2.deep.conv_1.norm.weight : torch.Size([240])
blocks.b2_l2.deep.conv_1.norm.bias : torch.Size([240])
blocks.b2_l2.se.fc1.conv_1.conv3d.weight : torch.Size([64, 240, 1, 1, 1])
blocks.b2_l2.se.fc1.conv_1.conv3d.bias : torch.Size([64])
blocks.b2_l2.se.fc2.conv_1.conv3d.weight : torch.Size([240, 64, 1, 1, 1])
blocks.b2_l2.se.fc2.conv_1.conv3d.bias : torch.Size([240])
blocks.b2_l2.project.conv_1.conv3d.weight : torch.Size([72, 240, 1, 1, 1])
blocks.b2_l2.project.conv_1.norm.weight : torch.Size([72])
blocks.b2_l2.project.conv_1.norm.bias : torch.Size([72])
blocks.b2_l3.alpha : torch.Size([])
blocks.b2_l3.expand.conv_1.conv3d.weight : torch.Size([192, 72, 1, 1, 1])
blocks.b2_l3.expand.conv_1.norm.weight : torch.Size([192])
blocks.b2_l3.expand.conv_1.norm.bias : torch.Size([192])
blocks.b2_l3.deep.conv_1.conv3d.weight : torch.Size([192, 1, 3, 3, 3])
blocks.b2_l3.deep.conv_1.norm.weight : torch.Size([192])
blocks.b2_l3.deep.conv_1.norm.bias : torch.Size([192])
blocks.b2_l3.se.fc1.conv_1.conv3d.weight : torch.Size([48, 192, 1, 1, 1])
blocks.b2_l3.se.fc1.conv_1.conv3d.bias : torch.Size([48])
blocks.b2_l3.se.fc2.conv_1.conv3d.weight : torch.Size([192, 48, 1, 1, 1])
blocks.b2_l3.se.fc2.conv_1.conv3d.bias : torch.Size([192])
blocks.b2_l3.project.conv_1.conv3d.weight : torch.Size([72, 192, 1, 1, 1])
blocks.b2_l3.project.conv_1.norm.weight : torch.Size([72])
blocks.b2_l3.project.conv_1.norm.bias : torch.Size([72])
blocks.b2_l4.alpha : torch.Size([])
blocks.b2_l4.expand.conv_1.conv3d.weight : torch.Size([240, 72, 1, 1, 1])
blocks.b2_l4.expand.conv_1.norm.weight : torch.Size([240])
blocks.b2_l4.expand.conv_1.norm.bias : torch.Size([240])
blocks.b2_l4.deep.conv_1.conv3d.weight : torch.Size([240, 1, 3, 3, 3])
blocks.b2_l4.deep.conv_1.norm.weight : torch.Size([240])
blocks.b2_l4.deep.conv_1.norm.bias : torch.Size([240])
blocks.b2_l4.se.fc1.conv_1.conv3d.weight : torch.Size([64, 240, 1, 1, 1])
blocks.b2_l4.se.fc1.conv_1.conv3d.bias : torch.Size([64])
blocks.b2_l4.se.fc2.conv_1.conv3d.weight : torch.Size([240, 64, 1, 1, 1])
blocks.b2_l4.se.fc2.conv_1.conv3d.bias : torch.Size([240])
blocks.b2_l4.project.conv_1.conv3d.weight : torch.Size([72, 240, 1, 1, 1])
blocks.b2_l4.project.conv_1.norm.weight : torch.Size([72])
blocks.b2_l4.project.conv_1.norm.bias : torch.Size([72])
blocks.b3_l0.alpha : torch.Size([])
blocks.b3_l0.expand.conv_1.conv3d.weight : torch.Size([240, 72, 1, 1, 1])
blocks.b3_l0.expand.conv_1.norm.weight : torch.Size([240])
blocks.b3_l0.expand.conv_1.norm.bias : torch.Size([240])
blocks.b3_l0.deep.conv_1.conv3d.weight : torch.Size([240, 1, 5, 3, 3])
blocks.b3_l0.deep.conv_1.norm.weight : torch.Size([240])
blocks.b3_l0.deep.conv_1.norm.bias : torch.Size([240])
blocks.b3_l0.se.fc1.conv_1.conv3d.weight : torch.Size([64, 240, 1, 1, 1])
blocks.b3_l0.se.fc1.conv_1.conv3d.bias : torch.Size([64])
blocks.b3_l0.se.fc2.conv_1.conv3d.weight : torch.Size([240, 64, 1, 1, 1])
blocks.b3_l0.se.fc2.conv_1.conv3d.bias : torch.Size([240])
blocks.b3_l0.project.conv_1.conv3d.weight : torch.Size([72, 240, 1, 1, 1])
blocks.b3_l0.project.conv_1.norm.weight : torch.Size([72])
blocks.b3_l0.project.conv_1.norm.bias : torch.Size([72])
blocks.b3_l1.alpha : torch.Size([])
blocks.b3_l1.expand.conv_1.conv3d.weight : torch.Size([240, 72, 1, 1, 1])
blocks.b3_l1.expand.conv_1.norm.weight : torch.Size([240])
blocks.b3_l1.expand.conv_1.norm.bias : torch.Size([240])
blocks.b3_l1.deep.conv_1.conv3d.weight : torch.Size([240, 1, 3, 3, 3])
blocks.b3_l1.deep.conv_1.norm.weight : torch.Size([240])
blocks.b3_l1.deep.conv_1.norm.bias : torch.Size([240])
blocks.b3_l1.se.fc1.conv_1.conv3d.weight : torch.Size([64, 240, 1, 1, 1])
blocks.b3_l1.se.fc1.conv_1.conv3d.bias : torch.Size([64])
blocks.b3_l1.se.fc2.conv_1.conv3d.weight : torch.Size([240, 64, 1, 1, 1])
blocks.b3_l1.se.fc2.conv_1.conv3d.bias : torch.Size([240])
blocks.b3_l1.project.conv_1.conv3d.weight : torch.Size([72, 240, 1, 1, 1])
blocks.b3_l1.project.conv_1.norm.weight : torch.Size([72])
blocks.b3_l1.project.conv_1.norm.bias : torch.Size([72])
blocks.b3_l2.alpha : torch.Size([])
blocks.b3_l2.expand.conv_1.conv3d.weight : torch.Size([240, 72, 1, 1, 1])
blocks.b3_l2.expand.conv_1.norm.weight : torch.Size([240])
blocks.b3_l2.expand.conv_1.norm.bias : torch.Size([240])
blocks.b3_l2.deep.conv_1.conv3d.weight : torch.Size([240, 1, 3, 3, 3])
blocks.b3_l2.deep.conv_1.norm.weight : torch.Size([240])
blocks.b3_l2.deep.conv_1.norm.bias : torch.Size([240])
blocks.b3_l2.se.fc1.conv_1.conv3d.weight : torch.Size([64, 240, 1, 1, 1])
blocks.b3_l2.se.fc1.conv_1.conv3d.bias : torch.Size([64])
blocks.b3_l2.se.fc2.conv_1.conv3d.weight : torch.Size([240, 64, 1, 1, 1])
blocks.b3_l2.se.fc2.conv_1.conv3d.bias : torch.Size([240])
blocks.b3_l2.project.conv_1.conv3d.weight : torch.Size([72, 240, 1, 1, 1])
blocks.b3_l2.project.conv_1.norm.weight : torch.Size([72])
blocks.b3_l2.project.conv_1.norm.bias : torch.Size([72])
blocks.b3_l3.alpha : torch.Size([])
blocks.b3_l3.expand.conv_1.conv3d.weight : torch.Size([240, 72, 1, 1, 1])
blocks.b3_l3.expand.conv_1.norm.weight : torch.Size([240])
blocks.b3_l3.expand.conv_1.norm.bias : torch.Size([240])
blocks.b3_l3.deep.conv_1.conv3d.weight : torch.Size([240, 1, 3, 3, 3])
blocks.b3_l3.deep.conv_1.norm.weight : torch.Size([240])
blocks.b3_l3.deep.conv_1.norm.bias : torch.Size([240])
blocks.b3_l3.se.fc1.conv_1.conv3d.weight : torch.Size([64, 240, 1, 1, 1])
blocks.b3_l3.se.fc1.conv_1.conv3d.bias : torch.Size([64])
blocks.b3_l3.se.fc2.conv_1.conv3d.weight : torch.Size([240, 64, 1, 1, 1])
blocks.b3_l3.se.fc2.conv_1.conv3d.bias : torch.Size([240])
blocks.b3_l3.project.conv_1.conv3d.weight : torch.Size([72, 240, 1, 1, 1])
blocks.b3_l3.project.conv_1.norm.weight : torch.Size([72])
blocks.b3_l3.project.conv_1.norm.bias : torch.Size([72])
blocks.b3_l4.alpha : torch.Size([])
blocks.b3_l4.expand.conv_1.conv3d.weight : torch.Size([144, 72, 1, 1, 1])
blocks.b3_l4.expand.conv_1.norm.weight : torch.Size([144])
blocks.b3_l4.expand.conv_1.norm.bias : torch.Size([144])
blocks.b3_l4.deep.conv_1.conv3d.weight : torch.Size([144, 1, 1, 5, 5])
blocks.b3_l4.deep.conv_1.norm.weight : torch.Size([144])
blocks.b3_l4.deep.conv_1.norm.bias : torch.Size([144])
blocks.b3_l4.se.fc1.conv_1.conv3d.weight : torch.Size([40, 144, 1, 1, 1])
blocks.b3_l4.se.fc1.conv_1.conv3d.bias : torch.Size([40])
blocks.b3_l4.se.fc2.conv_1.conv3d.weight : torch.Size([144, 40, 1, 1, 1])
blocks.b3_l4.se.fc2.conv_1.conv3d.bias : torch.Size([144])
blocks.b3_l4.project.conv_1.conv3d.weight : torch.Size([72, 144, 1, 1, 1])
blocks.b3_l4.project.conv_1.norm.weight : torch.Size([72])
blocks.b3_l4.project.conv_1.norm.bias : torch.Size([72])
blocks.b3_l5.alpha : torch.Size([])
blocks.b3_l5.expand.conv_1.conv3d.weight : torch.Size([240, 72, 1, 1, 1])
blocks.b3_l5.expand.conv_1.norm.weight : torch.Size([240])
blocks.b3_l5.expand.conv_1.norm.bias : torch.Size([240])
blocks.b3_l5.deep.conv_1.conv3d.weight : torch.Size([240, 1, 3, 3, 3])
blocks.b3_l5.deep.conv_1.norm.weight : torch.Size([240])
blocks.b3_l5.deep.conv_1.norm.bias : torch.Size([240])
blocks.b3_l5.se.fc1.conv_1.conv3d.weight : torch.Size([64, 240, 1, 1, 1])
blocks.b3_l5.se.fc1.conv_1.conv3d.bias : torch.Size([64])
blocks.b3_l5.se.fc2.conv_1.conv3d.weight : torch.Size([240, 64, 1, 1, 1])
blocks.b3_l5.se.fc2.conv_1.conv3d.bias : torch.Size([240])
blocks.b3_l5.project.conv_1.conv3d.weight : torch.Size([72, 240, 1, 1, 1])
blocks.b3_l5.project.conv_1.norm.weight : torch.Size([72])
blocks.b3_l5.project.conv_1.norm.bias : torch.Size([72])
blocks.b4_l0.alpha : torch.Size([])
blocks.b4_l0.expand.conv_1.conv3d.weight : torch.Size([480, 72, 1, 1, 1])
blocks.b4_l0.expand.conv_1.norm.weight : torch.Size([480])
blocks.b4_l0.expand.conv_1.norm.bias : torch.Size([480])
blocks.b4_l0.deep.conv_1.conv3d.weight : torch.Size([480, 1, 5, 3, 3])
blocks.b4_l0.deep.conv_1.norm.weight : torch.Size([480])
blocks.b4_l0.deep.conv_1.norm.bias : torch.Size([480])
blocks.b4_l0.se.fc1.conv_1.conv3d.weight : torch.Size([120, 480, 1, 1, 1])
blocks.b4_l0.se.fc1.conv_1.conv3d.bias : torch.Size([120])
blocks.b4_l0.se.fc2.conv_1.conv3d.weight : torch.Size([480, 120, 1, 1, 1])
blocks.b4_l0.se.fc2.conv_1.conv3d.bias : torch.Size([480])
blocks.b4_l0.project.conv_1.conv3d.weight : torch.Size([144, 480, 1, 1, 1])
blocks.b4_l0.project.conv_1.norm.weight : torch.Size([144])
blocks.b4_l0.project.conv_1.norm.bias : torch.Size([144])
blocks.b4_l0.res.1.conv_1.conv3d.weight : torch.Size([144, 72, 1, 1, 1])
blocks.b4_l0.res.1.conv_1.norm.weight : torch.Size([144])
blocks.b4_l0.res.1.conv_1.norm.bias : torch.Size([144])
blocks.b4_l1.alpha : torch.Size([])
blocks.b4_l1.expand.conv_1.conv3d.weight : torch.Size([384, 144, 1, 1, 1])
blocks.b4_l1.expand.conv_1.norm.weight : torch.Size([384])
blocks.b4_l1.expand.conv_1.norm.bias : torch.Size([384])
blocks.b4_l1.deep.conv_1.conv3d.weight : torch.Size([384, 1, 1, 5, 5])
blocks.b4_l1.deep.conv_1.norm.weight : torch.Size([384])
blocks.b4_l1.deep.conv_1.norm.bias : torch.Size([384])
blocks.b4_l1.se.fc1.conv_1.conv3d.weight : torch.Size([96, 384, 1, 1, 1])
blocks.b4_l1.se.fc1.conv_1.conv3d.bias : torch.Size([96])
blocks.b4_l1.se.fc2.conv_1.conv3d.weight : torch.Size([384, 96, 1, 1, 1])
blocks.b4_l1.se.fc2.conv_1.conv3d.bias : torch.Size([384])
blocks.b4_l1.project.conv_1.conv3d.weight : torch.Size([144, 384, 1, 1, 1])
blocks.b4_l1.project.conv_1.norm.weight : torch.Size([144])
blocks.b4_l1.project.conv_1.norm.bias : torch.Size([144])
blocks.b4_l2.alpha : torch.Size([])
blocks.b4_l2.expand.conv_1.conv3d.weight : torch.Size([384, 144, 1, 1, 1])
blocks.b4_l2.expand.conv_1.norm.weight : torch.Size([384])
blocks.b4_l2.expand.conv_1.norm.bias : torch.Size([384])
blocks.b4_l2.deep.conv_1.conv3d.weight : torch.Size([384, 1, 1, 5, 5])
blocks.b4_l2.deep.conv_1.norm.weight : torch.Size([384])
blocks.b4_l2.deep.conv_1.norm.bias : torch.Size([384])
blocks.b4_l2.se.fc1.conv_1.conv3d.weight : torch.Size([96, 384, 1, 1, 1])
blocks.b4_l2.se.fc1.conv_1.conv3d.bias : torch.Size([96])
blocks.b4_l2.se.fc2.conv_1.conv3d.weight : torch.Size([384, 96, 1, 1, 1])
blocks.b4_l2.se.fc2.conv_1.conv3d.bias : torch.Size([384])
blocks.b4_l2.project.conv_1.conv3d.weight : torch.Size([144, 384, 1, 1, 1])
blocks.b4_l2.project.conv_1.norm.weight : torch.Size([144])
blocks.b4_l2.project.conv_1.norm.bias : torch.Size([144])
blocks.b4_l3.alpha : torch.Size([])
blocks.b4_l3.expand.conv_1.conv3d.weight : torch.Size([480, 144, 1, 1, 1])
blocks.b4_l3.expand.conv_1.norm.weight : torch.Size([480])
blocks.b4_l3.expand.conv_1.norm.bias : torch.Size([480])
blocks.b4_l3.deep.conv_1.conv3d.weight : torch.Size([480, 1, 1, 5, 5])
blocks.b4_l3.deep.conv_1.norm.weight : torch.Size([480])
blocks.b4_l3.deep.conv_1.norm.bias : torch.Size([480])
blocks.b4_l3.se.fc1.conv_1.conv3d.weight : torch.Size([120, 480, 1, 1, 1])
blocks.b4_l3.se.fc1.conv_1.conv3d.bias : torch.Size([120])
blocks.b4_l3.se.fc2.conv_1.conv3d.weight : torch.Size([480, 120, 1, 1, 1])
blocks.b4_l3.se.fc2.conv_1.conv3d.bias : torch.Size([480])
blocks.b4_l3.project.conv_1.conv3d.weight : torch.Size([144, 480, 1, 1, 1])
blocks.b4_l3.project.conv_1.norm.weight : torch.Size([144])
blocks.b4_l3.project.conv_1.norm.bias : torch.Size([144])
blocks.b4_l4.alpha : torch.Size([])
blocks.b4_l4.expand.conv_1.conv3d.weight : torch.Size([480, 144, 1, 1, 1])
blocks.b4_l4.expand.conv_1.norm.weight : torch.Size([480])
blocks.b4_l4.expand.conv_1.norm.bias : torch.Size([480])
blocks.b4_l4.deep.conv_1.conv3d.weight : torch.Size([480, 1, 1, 5, 5])
blocks.b4_l4.deep.conv_1.norm.weight : torch.Size([480])
blocks.b4_l4.deep.conv_1.norm.bias : torch.Size([480])
blocks.b4_l4.se.fc1.conv_1.conv3d.weight : torch.Size([120, 480, 1, 1, 1])
blocks.b4_l4.se.fc1.conv_1.conv3d.bias : torch.Size([120])
blocks.b4_l4.se.fc2.conv_1.conv3d.weight : torch.Size([480, 120, 1, 1, 1])
blocks.b4_l4.se.fc2.conv_1.conv3d.bias : torch.Size([480])
blocks.b4_l4.project.conv_1.conv3d.weight : torch.Size([144, 480, 1, 1, 1])
blocks.b4_l4.project.conv_1.norm.weight : torch.Size([144])
blocks.b4_l4.project.conv_1.norm.bias : torch.Size([144])
blocks.b4_l5.alpha : torch.Size([])
blocks.b4_l5.expand.conv_1.conv3d.weight : torch.Size([480, 144, 1, 1, 1])
blocks.b4_l5.expand.conv_1.norm.weight : torch.Size([480])
blocks.b4_l5.expand.conv_1.norm.bias : torch.Size([480])
blocks.b4_l5.deep.conv_1.conv3d.weight : torch.Size([480, 1, 3, 3, 3])
blocks.b4_l5.deep.conv_1.norm.weight : torch.Size([480])
blocks.b4_l5.deep.conv_1.norm.bias : torch.Size([480])
blocks.b4_l5.se.fc1.conv_1.conv3d.weight : torch.Size([120, 480, 1, 1, 1])
blocks.b4_l5.se.fc1.conv_1.conv3d.bias : torch.Size([120])
blocks.b4_l5.se.fc2.conv_1.conv3d.weight : torch.Size([480, 120, 1, 1, 1])
blocks.b4_l5.se.fc2.conv_1.conv3d.bias : torch.Size([480])
blocks.b4_l5.project.conv_1.conv3d.weight : torch.Size([144, 480, 1, 1, 1])
blocks.b4_l5.project.conv_1.norm.weight : torch.Size([144])
blocks.b4_l5.project.conv_1.norm.bias : torch.Size([144])
blocks.b4_l6.alpha : torch.Size([])
blocks.b4_l6.expand.conv_1.conv3d.weight : torch.Size([576, 144, 1, 1, 1])
blocks.b4_l6.expand.conv_1.norm.weight : torch.Size([576])
blocks.b4_l6.expand.conv_1.norm.bias : torch.Size([576])
blocks.b4_l6.deep.conv_1.conv3d.weight : torch.Size([576, 1, 1, 3, 3])
blocks.b4_l6.deep.conv_1.norm.weight : torch.Size([576])
blocks.b4_l6.deep.conv_1.norm.bias : torch.Size([576])
blocks.b4_l6.se.fc1.conv_1.conv3d.weight : torch.Size([144, 576, 1, 1, 1])
blocks.b4_l6.se.fc1.conv_1.conv3d.bias : torch.Size([144])
blocks.b4_l6.se.fc2.conv_1.conv3d.weight : torch.Size([576, 144, 1, 1, 1])
blocks.b4_l6.se.fc2.conv_1.conv3d.bias : torch.Size([576])
blocks.b4_l6.project.conv_1.conv3d.weight : torch.Size([144, 576, 1, 1, 1])
blocks.b4_l6.project.conv_1.norm.weight : torch.Size([144])
blocks.b4_l6.project.conv_1.norm.bias : torch.Size([144])
conv7.conv_1.conv3d.weight : torch.Size([640, 144, 1, 1, 1])
conv7.conv_1.norm.weight : torch.Size([640])
conv7.conv_1.norm.bias : torch.Size([640])
classifier.0.conv_1.conv3d.weight : torch.Size([2048, 640, 1, 1, 1])
classifier.0.conv_1.conv3d.bias : torch.Size([2048])
classifier.3.conv_1.conv3d.weight : torch.Size([600, 2048, 1, 1, 1])
classifier.3.conv_1.conv3d.bias : torch.Size([600])
Solution
If the layer expects three channels, you cannot pass it a two-channel tensor as-is; you would have to supply three channels. The alternative is to replace the first layer with a new nn.Conv3d that takes only two input channels. Note that nn.Conv3d requires a kernel_size argument; judging from the pretrained weight shape [16, 3, 1, 3, 3], the stem kernel is (1, 3, 3), and bias=False matches the dump, which lists no bias for conv1:

model.conv1.conv_1.conv3d = nn.Conv3d(2, 16, kernel_size=(1, 3, 3), bias=False)