Showing results 101 - 120 of 31,867 for search '(( significant changes decrease ) OR ( significant attention target ))', query time: 0.93s
  3. 103

    Blueprint of attention-based GRU model structure, by Yankun Jiang

    Published 2024
    “…Additionally, with its attention mechanism, the CIMA-AttGRU targets the issue of non-linear patterns by allowing dynamic adjustment to temporal dependencies, offering differential learning capabilities crucial for capturing the nuanced fluctuations in futures prices. …”
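    The CIMA-AttGRU itself is not specified in this excerpt, so the following is only a minimal sketch of the general idea the snippet names: additive attention pooled over GRU hidden states, with every layer size chosen arbitrarily for illustration (PyTorch assumed).

        import torch
        import torch.nn as nn

        class AttentiveGRU(nn.Module):
            """Generic GRU encoder with additive attention pooling over time.

            Illustrative sketch only; it does not reproduce the CIMA-AttGRU
            cited in the result above.
            """

            def __init__(self, n_features: int, hidden: int = 64):
                super().__init__()
                self.gru = nn.GRU(n_features, hidden, batch_first=True)
                self.score = nn.Sequential(          # additive attention scorer
                    nn.Linear(hidden, hidden),
                    nn.Tanh(),
                    nn.Linear(hidden, 1),
                )
                self.head = nn.Linear(hidden, 1)     # e.g. a one-step forecast

            def forward(self, x: torch.Tensor) -> torch.Tensor:
                # x: (batch, time, n_features)
                states, _ = self.gru(x)                              # (batch, time, hidden)
                weights = torch.softmax(self.score(states), dim=1)   # weights over time steps
                context = (weights * states).sum(dim=1)              # attention-weighted summary
                return self.head(context)

        # toy usage: 8 series, 30 time steps, 5 features each
        model = AttentiveGRU(n_features=5)
        print(model(torch.randn(8, 30, 5)).shape)  # torch.Size([8, 1])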
  5. 105

    FFCA attention module structure, by Ze Wei

    Published 2025
    “…Additionally, the WIPIoU loss function is developed to calculate IoU using auxiliary boundaries and address low-quality data, improving small-target recognition and accelerating convergence. Experimental results demonstrate significant improvements in PCB defect detection, with mAP50 increasing by 5.7%, and reductions of 13.3% and 14.8% in model parameters and computational complexity, respectively. …”
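    The WIPIoU loss is only named in this excerpt and is not reproduced here; as background for the "IoU using auxiliary boundaries" wording, the sketch below computes plain axis-aligned IoU between two boxes given as (x1, y1, x2, y2), which is the quantity such losses typically build on.

        def box_iou(a, b):
            """Standard IoU of two axis-aligned boxes (x1, y1, x2, y2).

            Background illustration only; the WIPIoU auxiliary-boundary
            terms mentioned above are not reproduced here.
            """
            ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
            ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
            inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
            area_a = (a[2] - a[0]) * (a[3] - a[1])
            area_b = (b[2] - b[0]) * (b[3] - b[1])
            union = area_a + area_b - inter
            return inter / union if union > 0 else 0.0

        # two partially overlapping boxes
        print(box_iou((0, 0, 10, 10), (5, 5, 15, 15)))  # ~0.143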
  13. 113

    Network attention of attractions, by Feng Yuxin

    Published 2024
    “…Research shows that: (1) Overall, the network attention to case-based destinations is relatively low, and there are significant differences in network attention among different attractions. …”
  14. 114

    Distribution of network attention, by Feng Yuxin

    Published 2024
    “…Research shows that: (1) Overall, the network attention to case-based destinations is relatively low, and there are significant differences in network attention among different attractions. …”
  20. 120

    Structure of Spatial Attention Block, by Guangjie Liu

    Published 2024
    “…Furthermore, we devised the Channel And Spatial Attention Block (CSAB) to enhance the target location information during the encoding and decoding stages. …”
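    The Channel And Spatial Attention Block (CSAB) is named but not specified in this excerpt; the sketch below shows one common way such a block is built, a CBAM-style channel-then-spatial attention in PyTorch, and should not be read as the authors' exact design.

        import torch
        import torch.nn as nn

        class ChannelSpatialAttention(nn.Module):
            """Generic channel-then-spatial attention block (CBAM-style).

            Assumed design for illustration; the CSAB referenced above is
            not specified in the excerpt.
            """

            def __init__(self, channels: int, reduction: int = 8):
                super().__init__()
                # channel attention from globally pooled descriptors
                self.channel_mlp = nn.Sequential(
                    nn.Linear(channels, channels // reduction),
                    nn.ReLU(inplace=True),
                    nn.Linear(channels // reduction, channels),
                )
                # spatial attention from per-pixel channel statistics
                self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

            def forward(self, x: torch.Tensor) -> torch.Tensor:
                b, c, _, _ = x.shape
                avg = x.mean(dim=(2, 3))                    # (b, c)
                mx = x.amax(dim=(2, 3))                     # (b, c)
                ca = torch.sigmoid(self.channel_mlp(avg) + self.channel_mlp(mx))
                x = x * ca.view(b, c, 1, 1)                 # re-weight channels
                sa_in = torch.cat([x.mean(dim=1, keepdim=True),
                                   x.amax(dim=1, keepdim=True)], dim=1)
                sa = torch.sigmoid(self.spatial_conv(sa_in))  # (b, 1, h, w) location weights
                return x * sa

        # toy usage on a 16-channel feature map
        block = ChannelSpatialAttention(channels=16)
        print(block(torch.randn(2, 16, 32, 32)).shape)  # torch.Size([2, 16, 32, 32])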