"""
Squeeze-and-Excitation Networks---CVPR2018
Paper: https://arxiv.org/abs/1709.01507
This CVPR 2018 paper is also highly influential, with 7k+ citations to date. It addresses channel attention; thanks to its simple structure and effectiveness, it set off a small wave of channel-attention work. In the spirit of "less is more", the idea is very simple: first apply AdaptiveAvgPool over the spatial dimensions, then learn the channel attention through two FC layers and normalize it with a Sigmoid to obtain the Channel Attention Map; finally, multiply the Channel Attention Map with the original features to get the re-weighted features.
"""
from attention.SEAttention import SEAttention
import torch
input = torch.randn(50, 512, 7, 7)  # batch of 50, 512 channels, 7x7 spatial feature maps
se = SEAttention(channel=512, reduction=8)  # excitation MLP bottleneck: 512 -> 64 -> 512
output = se(input)
print(output.shape)  # torch.Size([50, 512, 7, 7]): same shape as the input, channels re-weighted
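

# Below is a minimal sketch of the squeeze-and-excitation block described in the docstring above,
# assuming the standard SE design (global average pool, two FC layers with a reduction bottleneck,
# Sigmoid gating). The repo's attention.SEAttention may differ in details such as weight
# initialization or layer naming; the class name SESketch is used here to avoid confusion.
import torch.nn as nn


class SESketch(nn.Module):
    def __init__(self, channel=512, reduction=8):
        super().__init__()
        # Squeeze: global average pool over the spatial dimensions -> (B, C, 1, 1)
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        # Excitation: two FC layers with a channel // reduction bottleneck,
        # Sigmoid normalizes the result into per-channel weights in (0, 1)
        self.fc = nn.Sequential(
            nn.Linear(channel, channel // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channel // reduction, channel, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.size()
        y = self.avg_pool(x).view(b, c)   # (B, C)
        y = self.fc(y).view(b, c, 1, 1)   # Channel Attention Map, (B, C, 1, 1)
        return x * y.expand_as(x)         # re-weight the original features channel-wise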