-
6821
Algorithm training accuracy experiments.
Published 2025: “…However, after removing the integrated residual attention unit and depth-wise separable convolution, the accuracy decreased by 1.91% and the latency increased by 117ms. …”
-
6822
Repeat the detection experiment.
-
6823
Detection network structure with IRAU [34].
-
6824
Ablation experiments of various blocks.
-
6825
Kappa coefficients for different algorithms.
-
6826
The structure of ASPP+ block.
-
6827
The structure of attention gate block [31].
-
6828
DSC block and its application network structure.
-
6829
The structure of multi-scale residual block [30].
-
6830
The structure of IRAU and Res2Net+ block [22].
-
6831
qorA-associated data.
Published 2025: “…Transcriptomic and metabolic analyses indicated that the ΔqorA strain underwent a significant metabolic shift to cope with the redox imbalance. …”