Showing results 21 - 40 of 31,178 for search '(( significant ((changes decrease) OR (largest decrease)) ) OR ( significant attention module ))'

  26. Feature attention module. by Ning Zhang (23771)

    Published 2025
    “…The fused features, combined with those processed by the convolutional module, are fed into an attention layer. This attention layer assigns weights to the features, facilitating accurate final classification. …”
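
    A minimal illustrative sketch, assuming a setup like the one this excerpt describes (not the authors' implementation): fused features and convolution-module features are concatenated, an attention layer assigns per-feature weights, and the weighted features drive the final classifier. All names and dimensions below are assumptions.

    import torch
    import torch.nn as nn

    class FeatureAttentionHead(nn.Module):
        """Weights concatenated features before classification (illustrative only)."""
        def __init__(self, fused_dim, conv_dim, num_classes):
            super().__init__()
            in_dim = fused_dim + conv_dim
            # per-feature attention weights squashed to (0, 1)
            self.attention = nn.Sequential(nn.Linear(in_dim, in_dim), nn.Sigmoid())
            self.classifier = nn.Linear(in_dim, num_classes)

        def forward(self, fused_feats, conv_feats):
            x = torch.cat([fused_feats, conv_feats], dim=-1)   # (batch, in_dim)
            weights = self.attention(x)                        # attention layer assigns weights
            return self.classifier(x * weights)                # weighted features -> class logits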

  36. Self-attention module for the features learning. by Yasir Khan Jadoon (21433231)

    Published 2025
    “…After that, the second module is designed based on the self-attention mechanism. …”
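
    The excerpt only names a self-attention based module without further detail; as a hedged sketch, a generic single-head scaled dot-product self-attention block over a feature sequence (all sizes assumed) could look like this:

    import torch
    import torch.nn as nn

    class SelfAttentionBlock(nn.Module):
        """Single-head scaled dot-product self-attention over a feature sequence (illustrative)."""
        def __init__(self, dim):
            super().__init__()
            self.q = nn.Linear(dim, dim)
            self.k = nn.Linear(dim, dim)
            self.v = nn.Linear(dim, dim)
            self.scale = dim ** -0.5

        def forward(self, x):                                  # x: (batch, seq_len, dim)
            q, k, v = self.q(x), self.k(x), self.v(x)
            attn = torch.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
            return attn @ v                                    # attended features, same shape as x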

  40. FFCA attention module structure. by Ze Wei (556526)

    Published 2025
    “…For feature fusion, we propose the FFCA attention module, designed to handle PCB surface defect characteristics by fusing multi-scale local features. …”
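
    The internal design of FFCA is not given in the excerpt; the sketch below is a hypothetical stand-in that fuses two local multi-scale branches and then applies a squeeze-and-excitation style channel gate, purely to illustrate the general pattern. Layer choices, names, and the reduction factor are assumptions, not the FFCA module itself.

    import torch.nn as nn

    class MultiScaleFusionAttention(nn.Module):
        """Fuses multi-scale local features, then reweights channels (illustrative, not FFCA)."""
        def __init__(self, channels, reduction=4):
            super().__init__()
            # local features at two receptive-field sizes
            self.branch3 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
            self.branch5 = nn.Conv2d(channels, channels, kernel_size=5, padding=2)
            # channel-attention gate on the fused feature map
            self.gate = nn.Sequential(
                nn.AdaptiveAvgPool2d(1),
                nn.Conv2d(channels, channels // reduction, kernel_size=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels // reduction, channels, kernel_size=1),
                nn.Sigmoid(),
            )

        def forward(self, x):                                  # x: (batch, channels, H, W)
            fused = self.branch3(x) + self.branch5(x)          # multi-scale local feature fusion
            return fused * self.gate(fused)                    # channel-reweighted fused features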