Search alternatives:
significant attention » significant potential, significant reduction
inter decrease » linear decrease, water decreases, teer decrease
nn decrease » _ decrease, a decrease, mean decrease
2. ECoG timescales decrease during spatial attention.
   Published 2025: "…Bottom: timescales significantly decrease during covert attention relative to the attend-out condition (two locations: p = 0.0244; four locations: p < 0.0001; mean ± SEM; whiskers indicate maximum and minimum; dots correspond to individual electrodes). …"
3. Intra- and inter-day precision and accuracy.
   Published 2025: "…Purpose: Statins are the most commonly used drugs worldwide. Besides a significant decrease in cardiovascular diseases (CVDs) risk, the use of statins is also connected with a broad beneficial pleiotropic effect. …"
7. Global Land Use Change Impacts on Soil Nitrogen Availability and Environmental Losses
   Published 2025: "…In contrast, reversing managed to natural ecosystems significantly increased NNM by 20% (9.7, 25.4%) and decreased NN by 89% (−125, −46%), indicating increasing N availability while decreasing potential N loss. …"
8. EMA attention mechanism working principle.
   Published 2025: "…Furthermore, Parameters and GFLOPs were reduced by 10.0% and 23.2%, respectively, indicating a significant enhancement in detection accuracy along with a substantial decrease in both parameters and computational costs. …"
9. Sensitivity analysis for inter-shift subscale.
   Published 2025: "…Studies have consistently linked occupational fatigue to decreased productivity, heightened error rates, and compromised decision-making abilities, posing significant risks to both individual nurses and healthcare organizations. …"
17. The structure of attention gate block [31].
    Published 2025: "…However, after removing the integrated residual attention unit and depth-wise separable convolution, the accuracy decreased by 1.91% and the latency increased by 117 ms. …"
19. PCA-CGAN model parameter settings.
    Published 2025: "…Using the Transformer’s global attention mechanism, the model precisely captures key diagnostic features of various arrhythmias, maximizing inter-class differences while maintaining intra-class consistency. …"
20. MIT-BIH dataset proportion analysis chart.
    Published 2025: "…Using the Transformer’s global attention mechanism, the model precisely captures key diagnostic features of various arrhythmias, maximizing inter-class differences while maintaining intra-class consistency. …"