Tan_pytorch_segmentation/pytorch_segmentation/PV_Attention/BAM Attention.py


"""
BAM: Bottleneck Attention Module---BMCV2018
论文地址https://arxiv.org/pdf/1807.06514.pdf
这是CBAM同作者同时期的工作工作与CBAM非常相似也是双重Attention不同的是CBAM是将两个attention的结果串联而BAM是直接将两个attention矩阵进行相加。
Channel Attention方面与SE的结构基本一样。Spatial Attention方面还是在通道维度进行pool然后用了两次3x3的空洞卷积最后将用一次1x1的卷积得到Spatial Attention的矩阵。
最后Channel Attention和Spatial Attention矩阵进行相加这里用到了广播机制并进行归一化这样一来就得到了空间和通道结合的attention矩阵。
"""
import torch

from attention.BAM import BAMBlock

# Batch of 50 feature maps: 512 channels at 7x7 spatial resolution.
x = torch.randn(50, 512, 7, 7)
# dia_val is the dilation rate of the 3x3 convolutions in the spatial branch.
bam = BAMBlock(channel=512, reduction=16, dia_val=2)
out = bam(x)
print(out.shape)  # torch.Size([50, 512, 7, 7])
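
# The import above assumes the companion attention package providing BAMBlock
# is on the path. As a fallback, the class below is a minimal sketch of the
# module as described in the docstring: an SE-style channel branch, a spatial
# branch with two dilated 3x3 convolutions, and a sigmoid over the broadcast
# sum of the two maps. The name MiniBAM and the omission of batch norm are
# illustrative simplifications, not the exact code of attention.BAM.
import torch.nn as nn


class MiniBAM(nn.Module):
    """Minimal BAM sketch: x + x * sigmoid(channel_att(x) + spatial_att(x))."""

    def __init__(self, channel=512, reduction=16, dia_val=2):
        super().__init__()
        # Channel branch (SE-style): global pool -> bottleneck MLP -> (B, C, 1, 1).
        self.channel_att = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channel, channel // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channel // reduction, channel, kernel_size=1),
        )
        # Spatial branch: 1x1 reduce -> two dilated 3x3 convs -> 1x1 -> (B, 1, H, W).
        # padding=dia_val with dilation=dia_val keeps the spatial size unchanged.
        self.spatial_att = nn.Sequential(
            nn.Conv2d(channel, channel // reduction, kernel_size=1),
            nn.Conv2d(channel // reduction, channel // reduction, kernel_size=3,
                      padding=dia_val, dilation=dia_val),
            nn.ReLU(inplace=True),
            nn.Conv2d(channel // reduction, channel // reduction, kernel_size=3,
                      padding=dia_val, dilation=dia_val),
            nn.ReLU(inplace=True),
            nn.Conv2d(channel // reduction, 1, kernel_size=1),
        )

    def forward(self, x):
        # Broadcasting adds (B, C, 1, 1) and (B, 1, H, W) into (B, C, H, W).
        att = torch.sigmoid(self.channel_att(x) + self.spatial_att(x))
        # Residual gating as in the paper: F' = F + F * M(F).
        return x + x * att


# Usage mirrors the BAMBlock call above; the output shape matches the input.
mini = MiniBAM(channel=512, reduction=16, dia_val=2)
print(mini(x).shape)  # torch.Size([50, 512, 7, 7])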