-
7321
Repeat the detection experiment.
Published 2025: “…However, after removing the integrated residual attention unit and depth-wise separable convolution, the accuracy decreased by 1.91% and the latency increased by 117 ms. …”
-
7322
Detection network structure with IRAU [34].
Published 2025: “…However, after removing the integrated residual attention unit and depth-wise separable convolution, the accuracy decreased by 1.91% and the latency increased by 117 ms. …”
-
7323
Ablation experiments of various blocks.
Published 2025: “…However, after removing the integrated residual attention unit and depth-wise separable convolution, the accuracy decreased by 1.91% and the latency increased by 117 ms. …”
-
7324
Kappa coefficients for different algorithms.
Published 2025: “…However, after removing the integrated residual attention unit and depth-wise separable convolution, the accuracy decreased by 1.91% and the latency increased by 117 ms. …”
-
7325
The structure of the ASPP+ block.
Published 2025: “…However, after removing the integrated residual attention unit and depth-wise separable convolution, the accuracy decreased by 1.91% and the latency increased by 117 ms. …”
-
7326
The structure of the attention gate block [31].
Published 2025: “…However, after removing the integrated residual attention unit and depth-wise separable convolution, the accuracy decreased by 1.91% and the latency increased by 117 ms. …”
-
7327
DSC block and its application network structure.
Published 2025: “…However, after removing the integrated residual attention unit and depth-wise separable convolution, the accuracy decreased by 1.91% and the latency increased by 117 ms. …”
-
7328
The structure of the multi-scale residual block [30].
Published 2025: “…However, after removing the integrated residual attention unit and depth-wise separable convolution, the accuracy decreased by 1.91% and the latency increased by 117 ms. …”
-
7329
The structure of the IRAU and Res2Net+ blocks [22].
Published 2025: “…However, after removing the integrated residual attention unit and depth-wise separable convolution, the accuracy decreased by 1.91% and the latency increased by 117 ms. …”
-
7330
qorA-associated data
Published 2025: “…Transcriptomic and metabolic analyses indicated that the ΔqorA strain underwent a significant metabolic shift to cope with the redox imbalance. …”
-
7331
-
7332
-
7333
-
7334
The result of the molecular dynamics simulation.
Published 2025: “…The docking research indicated that these mutations decreased the binding affinity for DNA, with R273C, R280G, G266E, and G105C displaying the most significant differences. …”
-
7335
Result of the phenotypic analysis.
Published 2025: “…The docking research indicated that these mutations decreased the binding affinity for DNA, with R273C, R280G, G266E, and G105C displaying the most significant differences. …”
-
7336
Types of mutations.
Published 2025: “…The docking research indicated that these mutations decreased the binding affinity for DNA, with R273C, R280G, G266E, and G105C displaying the most significant differences. …”
-
7337
-
7338
-
7339
-
7340
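
Entries 7321-7329 above all quote the same 2025 paper, whose ablation result attributes a 1.91% accuracy drop and a 117 ms latency increase to removing the integrated residual attention unit and the depth-wise separable convolution (DSC) block. For readers unfamiliar with the DSC block named in entry 7327, here is a minimal, generic sketch of a depth-wise separable convolution in PyTorch. It is not the cited paper's implementation; the channel counts, kernel size, and the BatchNorm/ReLU placement are assumptions made purely for illustration.

import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """Generic depth-wise separable convolution block (illustrative only)."""

    def __init__(self, in_channels: int, out_channels: int, kernel_size: int = 3):
        super().__init__()
        # Depth-wise convolution: one filter per input channel (groups=in_channels).
        self.depthwise = nn.Conv2d(
            in_channels, in_channels, kernel_size,
            padding=kernel_size // 2, groups=in_channels, bias=False)
        # Point-wise 1x1 convolution mixes information across channels.
        self.pointwise = nn.Conv2d(in_channels, out_channels, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.pointwise(self.depthwise(x))))

# Example usage: a 3x3 DSC layer from 32 to 64 channels preserves spatial size.
x = torch.randn(1, 32, 64, 64)          # (batch, channels, height, width)
y = DepthwiseSeparableConv(32, 64)(x)   # -> torch.Size([1, 64, 64, 64])

The point of the split is cost: a 3x3 depth-wise separable layer needs roughly in*9 + in*out weights instead of in*out*9 for a standard convolution, while the 1x1 point-wise step still lets the cheap per-channel filters exchange information across channels. That trade-off is why removing the block, as in the quoted ablation, shifts both accuracy and latency.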