Showing 1 - 20 results of 7,986 for search '(( significant ((largest decrease) OR (marked decrease)) ) OR ( significant attention maps ))', query time: 0.56s
  15. Attentional visualization of different vehicles. by Rui Liu (54031)

    Published 2025
    “…The DWAN introduces a four-level discrete wavelet transform in the convolutional neural network architecture and combines it with Convolutional Block Attention Module (CBAM) to efficiently capture multiscale feature information. …”
  16. Comparison of different attention mechanisms. by Rui Liu (54031)

    Published 2025
    “…The DWAN introduces a four-level discrete wavelet transform in the convolutional neural network architecture and combines it with Convolutional Block Attention Module (CBAM) to efficiently capture multiscale feature information. …”
  17. Spatial attention module. by Mahmood Ashraf (18340154)

    Published 2024
    “…These methods also face difficulties in manipulating information from local intrinsic detailed patterns of feature maps and low-rank frequency feature tuning. To overcome these challenges and improve HSI classification performance, we propose an innovative approach called the Attention 3D Central Difference Convolutional Dense Network (3D-CDC Attention DenseNet). …”
  18. The structure of deformable attention. by Kaixin Deng (11945951)

    Published 2025
    “…Specifically, OS-DETR achieves a Precision of 95.0%, Recall of 94.2%, mAP@50 of 95.7%, and mAP@50:95 of 74.2%. The code implementation and experimental results are available at https://github.com/dkx2077/OS-DETR.git.…”
  19. Residual hybrid domain attention mechanism. by Rui Liu (54031)

    Published 2025
    “…The DWAN introduces a four-level discrete wavelet transform in the convolutional neural network architecture and combines it with Convolutional Block Attention Module (CBAM) to efficiently capture multiscale feature information. …”
  20.