Tan_pytorch_segmentation/pytorch_segmentation/PV_Attention/External Attention Usage.py

import torch

from attention.ExternalAttention import ExternalAttention

# Input: batch of 50 samples, each with 49 tokens of dimension 512.
input = torch.randn(50, 49, 512)
# External attention with model dimension 512 and memory-unit size S=8.
ea = ExternalAttention(d_model=512, S=8)
output = ea(input)
print(output.shape)  # torch.Size([50, 49, 512])
"""
1.1. 引用
Beyond Self-attention: External Attention using Two Linear Layers for Visual Tasks.---arXiv 2021.05.05
论文地址https://arxiv.org/abs/2105.02358
这是五月份在arXiv上的一篇文章主要解决的Self-Attention(SA)的两个痛点问题:
1O(n^2)的计算复杂度;(2)SA是在同一个样本上根据不同位置计算Attention忽略了不同样本之间的联系。
因此本文采用了两个串联的MLP结构作为memory units使得计算复杂度降低到了O(n)
此外这两个memory units是基于全部的训练数据学习的因此也隐式的考虑了不同样本之间的联系。
"""