

"""
Squeeze-and-Excitation Networks---CVPR2018
论文地址https://arxiv.org/abs/1709.01507
这是CVPR2018的一篇文章同样非常具有影响力
目前引用量7k+。本文是做通道注意力的,因其简单的结构和有效性,
将通道注意力掀起了一波小高潮。大道至简,这篇文章的思想可以说非常简单,
首先将spatial维度进行AdaptiveAvgPool然后通过两个FC学习到通道注意力
并用Sigmoid进行归一化得到Channel Attention Map,最后将Channel Attention Map与原特征相乘就得到了加权后的特征。
"""
from attention.SEAttention import SEAttention
import torch

# Batch of 50 feature maps with 512 channels and 7x7 spatial size
input = torch.randn(50, 512, 7, 7)
# reduction=8 shrinks the bottleneck FC layer to 512 // 8 = 64 units
se = SEAttention(channel=512, reduction=8)
output = se(input)
print(output.shape)  # same shape as the input: torch.Size([50, 512, 7, 7])
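
The source of `attention.SEAttention` is not shown here, but the squeeze-and-excitation mechanism described in the docstring can be sketched as a small standalone module (a minimal illustration, not necessarily identical to the repo's implementation):

```python
import torch
import torch.nn as nn


class SE(nn.Module):
    """Minimal Squeeze-and-Excitation block (hypothetical sketch,
    not the repo's attention.SEAttention class)."""

    def __init__(self, channel: int, reduction: int = 8):
        super().__init__()
        # Squeeze: global average pool collapses each channel's HxW map to 1x1
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        # Excitation: bottleneck of two FC layers, then Sigmoid -> weights in [0, 1]
        self.fc = nn.Sequential(
            nn.Linear(channel, channel // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channel // reduction, channel, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.avg_pool(x).view(b, c)   # (B, C): per-channel descriptors
        w = self.fc(w).view(b, c, 1, 1)   # (B, C, 1, 1): channel attention map
        return x * w                      # reweight the original features


x = torch.randn(2, 512, 7, 7)
out = SE(channel=512, reduction=8)(x)
print(out.shape)  # torch.Size([2, 512, 7, 7])
```

Because the block only rescales channels, the output shape always matches the input shape, which is what the `print(output.shape)` check above confirms.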