Showing 1 - 20 results of 820 for search '(( significant attention decrease ) OR ( significant ((inter decrease) OR (nn decrease)) ))', query time: 0.58s
  2. ECoG timescales decrease during spatial attention. by Isabel Raposo (21615517)

    Published 2025
    “…Bottom: timescales significantly decrease during covert attention relative to the attend-out condition (two locations: p = 0.0244; four locations: p < 0.0001; mean ± SEM; whiskers indicate maximum and minimum; dots correspond to individual electrodes). …”
  3. Intra- and inter-day precision and accuracy. by Ewa Paszkowska (21246702)

    Published 2025
    “…Purpose: Statins are the most commonly used drugs worldwide. Besides a significant decrease in cardiovascular disease (CVD) risk, the use of statins is also connected with a broad beneficial pleiotropic effect. …”
  7. Global Land Use Change Impacts on Soil Nitrogen Availability and Environmental Losses by Jing Wang (6206297)

    Published 2025
    “…In contrast, reverting managed ecosystems to natural ecosystems significantly increased NNM by 20% (9.7, 25.4%) and decreased NN by 89% (−125, −46%), indicating increased N availability alongside decreased potential N loss. …”
  8. EMA attention mechanism working principle. by Pingping Yan (462509)

    Published 2025
    “…Furthermore, parameters and GFLOPs were reduced by 10.0% and 23.2%, respectively, indicating a significant enhancement in detection accuracy along with a substantial decrease in both parameter count and computational cost. …”
  9. Sensitivity analysis for inter-shift subscale. by Rong Pi (21743379)

    Published 2025
    “…Studies have consistently linked occupational fatigue to decreased productivity, heightened error rates, and compromised decision-making abilities, posing significant risks to both individual nurses and healthcare organizations. …”
  17. The structure of attention gate block [31]. by Yingying Liu (360782)

    Published 2025
    “…However, after removing the integrated residual attention unit and depth-wise separable convolution, the accuracy decreased by 1.91% and the latency increased by 117 ms. …”
  19. PCA-CGAN model parameter settings. by Chao Tang (10925)

    Published 2025
    “…Using the Transformer’s global attention mechanism, the model precisely captures key diagnostic features of various arrhythmias, maximizing inter-class differences while maintaining intra-class consistency. …”
  20. MIT-BIH dataset proportion analysis chart. by Chao Tang (10925)

    Published 2025
    “…Using the Transformer’s global attention mechanism, the model precisely captures key diagnostic features of various arrhythmias, maximizing inter-class differences while maintaining intra-class consistency. …”