Tan_pytorch_segmentation/pytorch_segmentation/PV_Attention/CBAM Attention.py


"""
CBAM: Convolutional Block Attention Module---ECCV2018
论文地址https://openaccess.thecvf.com/content_ECCV_2018/papers/Sanghyun_Woo_Convolutional_Block_Attention_ECCV_2018_paper.pdf
这是ECCV2018的一篇论文这篇文章同时使用了Channel Attention和Spatial Attention将两者进行了串联文章也做了并联和两种串联方式的消融实验
Channel Attention方面大致结构还是和SE相似不过作者提出AvgPool和MaxPool有不同的表示效果所以作者对原来的特征在Spatial维度分别进行了AvgPool和MaxPool
后用SE的结构提取channel attention注意这里是参数共享的然后将两个特征相加后做归一化就得到了注意力矩阵。
Spatial Attention和Channel Attention类似先在channel维度进行两种pool后将两个特征进行拼接然后用7x7的卷积来提取Spatial Attention
之所以用7x7是因为提取的是空间注意力所以用的卷积核必须足够大。然后做一次归一化就得到了空间的注意力矩阵。
"""
import torch

from attention.CBAM import CBAMBlock

# Batch of 50 feature maps: 512 channels, 7x7 spatial resolution.
input = torch.randn(50, 512, 7, 7)
# Use the spatial size (7) as the kernel size of the spatial-attention convolution.
kernel_size = input.shape[2]
cbam = CBAMBlock(channel=512, reduction=16, kernel_size=kernel_size)
output = cbam(input)
print(output.shape)  # torch.Size([50, 512, 7, 7]) -- CBAM preserves the input shape
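

# ---------------------------------------------------------------------------
# Minimal sketch of the two sub-modules described in the docstring above.
# This is an illustrative re-implementation for reference only, NOT the
# attention.CBAM code imported above; the class names (ChannelAttentionSketch,
# SpatialAttentionSketch, CBAMSketch) are made up for this example.
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttentionSketch(nn.Module):
    """AvgPool and MaxPool over the spatial dims, a shared SE-style
    bottleneck, element-wise sum, then sigmoid."""
    def __init__(self, channel, reduction=16):
        super().__init__()
        # The bottleneck is shared between the avg- and max-pooled branches.
        self.mlp = nn.Sequential(
            nn.Conv2d(channel, channel // reduction, kernel_size=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channel // reduction, channel, kernel_size=1, bias=False),
        )

    def forward(self, x):
        avg = self.mlp(F.adaptive_avg_pool2d(x, 1))
        mx = self.mlp(F.adaptive_max_pool2d(x, 1))
        return torch.sigmoid(avg + mx)  # (B, C, 1, 1) channel attention map


class SpatialAttentionSketch(nn.Module):
    """Channel-wise mean and max, concatenation, then a large (7x7)
    convolution followed by sigmoid."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)   # (B, 1, H, W)
        mx, _ = x.max(dim=1, keepdim=True)  # (B, 1, H, W)
        return torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))


class CBAMSketch(nn.Module):
    """Serial arrangement: channel attention first, then spatial attention."""
    def __init__(self, channel=512, reduction=16, kernel_size=7):
        super().__init__()
        self.ca = ChannelAttentionSketch(channel, reduction)
        self.sa = SpatialAttentionSketch(kernel_size)

    def forward(self, x):
        x = x * self.ca(x)  # re-weight channels
        x = x * self.sa(x)  # re-weight spatial positions
        return x


# The sketch reproduces the shapes used above:
# CBAMSketch(channel=512, reduction=16, kernel_size=7)(input).shape
# -> torch.Size([50, 512, 7, 7])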