Showing 1 - 20 results of 10,000 for search '((significant largest decrease) OR (significant attention based))', query time: 1.08s

    Layer attention. by Zheng Ye (15102)

    Published 2024

    Self-attention. by Shuang Hong (17558808)

    Published 2024

    Coordinate attention. by Chao Lv (335727)

    Published 2025
    “…To address these issues, we propose PoseNet++, a novel approach based on a 3-stacked hourglass architecture, incorporating three key innovations: the multi-scale spatial pyramid attention hourglass module (MSPAHM), coordinate-channel prior convolutional attention (C-CPCA), and the PinSK Bottleneck Residual Module (PBRM). …”
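
The record above names several attention modules by title only. For context, the sketch below shows a generic coordinate-attention block in PyTorch, in the spirit of Hou et al. (2021, "Coordinate Attention for Efficient Mobile Network Design"). It illustrates the general technique behind the term "coordinate attention" and is not the C-CPCA module proposed in PoseNet++; the channel and reduction sizes are hypothetical.

```python
# Generic coordinate-attention sketch (not the C-CPCA module from the record above).
import torch
import torch.nn as nn


class CoordinateAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 32):
        super().__init__()
        hidden = max(8, channels // reduction)
        # Pool along each spatial axis separately so positional information
        # along H and W is preserved in the resulting attention maps.
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # -> (B, C, H, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # -> (B, C, 1, W)
        self.conv1 = nn.Conv2d(channels, hidden, kernel_size=1)
        self.bn = nn.BatchNorm2d(hidden)
        self.act = nn.ReLU(inplace=True)
        self.conv_h = nn.Conv2d(hidden, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(hidden, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        x_h = self.pool_h(x)                       # (B, C, H, 1)
        x_w = self.pool_w(x).permute(0, 1, 3, 2)   # (B, C, W, 1)
        # Encode both directions jointly, then split back into H and W branches.
        y = self.act(self.bn(self.conv1(torch.cat([x_h, x_w], dim=2))))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                      # (B, C, H, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # (B, C, 1, W)
        # Reweight the input feature map along both spatial directions.
        return x * a_h * a_w


# Usage: apply to a feature map, e.g. inside one stage of an hourglass network.
feat = torch.randn(2, 64, 32, 32)
out = CoordinateAttention(64)(feat)
assert out.shape == feat.shape
```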