-
6761
Dependent and independent variables (N = 316).
Published 2025: "Introduction: Unmet oral health needs remain a significant issue among immigrant adolescents, often exacerbated by experiences of racial discrimination. …"
-
6762
Algorithm training accuracy experiments.
Published 2025: "However, after removing the integrated residual attention unit and depth-wise separable convolution, the accuracy decreased by 1.91% and the latency increased by 117ms. …"
-
6763
Repeat the detection experiment.
-
6764
Detection network structure with IRAU [34].
-
6765
Ablation experiments of various block.
-
6766
Kappa coefficients for different algorithms.
-
6767
The structure of ASPP+ block.
-
6768
The structure of attention gate block [31].
-
6769
DSC block and its application network structure.
-
6770
The structure of multi-scale residual block [30].
-
6771
The structure of IRAU and Res2Net+ block [22].
-
6772
qorA associated data
Published 2025: "Transcriptomic and metabolic analyses indicated that the Δ<i>qorA</i> strain underwent a significant metabolic shift to cope with the redox imbalance. …"