Flattening a layer in a PyTorch model built with a sequential container
Question
I am trying to build a CNN using PyTorch's sequential container, and my problem is that I cannot figure out how to flatten the layer.
main = nn.Sequential()
self._conv_block(main, 'conv_0', 3, 6, 5)
main.add_module('max_pool_0_2_2', nn.MaxPool2d(2,2))
self._conv_block(main, 'conv_1', 6, 16, 3)
main.add_module('max_pool_1_2_2', nn.MaxPool2d(2,2))
main.add_module('flatten', make_it_flatten)
What should I put in `make_it_flatten`? I tried to flatten `main`, but it does not work: `main` has no attribute called `view`.
main = main.view(-1, 16*3*3)
Answer
This might not be exactly what you are looking for, but you can simply create your own `nn.Module` that flattens any input, which you can then add to the `nn.Sequential()` object:
class Flatten(nn.Module):
    def forward(self, x):
        return x.view(x.size()[0], -1)
The `x.size()[0]` selects the batch dim, and `-1` computes all remaining dims to fit the number of elements, thereby flattening any tensor/Variable.
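As a quick sanity check on that arithmetic, the flattened shape can be computed by hand; this is a plain-Python sketch (no PyTorch required), and `flattened_shape` is a hypothetical helper name, not a library function:

```python
def flattened_shape(shape):
    # Mimics x.view(x.size()[0], -1): keep the batch dim,
    # multiply every remaining dim into one.
    batch, *rest = shape
    n = 1
    for d in rest:
        n *= d
    return (batch, n)

# A batch of 4 feature maps with 16 channels of 3x3 each
# flattens to (4, 16*3*3) = (4, 144).
print(flattened_shape((4, 16, 3, 3)))
```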
And use it in `nn.Sequential`:
main = nn.Sequential()
self._conv_block(main, 'conv_0', 3, 6, 5)
main.add_module('max_pool_0_2_2', nn.MaxPool2d(2,2))
self._conv_block(main, 'conv_1', 6, 16, 3)
main.add_module('max_pool_1_2_2', nn.MaxPool2d(2,2))
main.add_module('flatten', Flatten())
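Putting it together, here is a minimal self-contained sketch. Since `_conv_block` is not shown in the question, plain `nn.Conv2d` layers with the same channel/kernel arguments stand in for it, and the 20×20 input size is an assumption chosen so the output matches the question's `16*3*3`. (In recent PyTorch versions the built-in `nn.Flatten` module does the same job.)

```python
import torch
import torch.nn as nn

class Flatten(nn.Module):
    def forward(self, x):
        return x.view(x.size()[0], -1)

# Stand-ins for the question's _conv_block calls (assumed plain convs).
model = nn.Sequential(
    nn.Conv2d(3, 6, 5),    # 20x20 -> 16x16
    nn.MaxPool2d(2, 2),    # 16x16 -> 8x8
    nn.Conv2d(6, 16, 3),   # 8x8   -> 6x6
    nn.MaxPool2d(2, 2),    # 6x6   -> 3x3
    Flatten(),             # (N, 16, 3, 3) -> (N, 16*3*3) = (N, 144)
)

x = torch.randn(2, 3, 20, 20)  # hypothetical batch of 2 RGB 20x20 images
out = model(x)
print(out.shape)  # torch.Size([2, 144])
```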