10. Prediction (a-d) and infusion deviation (e-f) results under different training sets and test sets.
Published 2025.
13. <b>Nest mass in forest tits <i>Paridae</i> increases with elevation and decreasing body mass, promoting reproductive success</b>
Published 2025. “…We found that nest mass increased by ~60% along the elevational gradient, but the effect of canopy openness on nest mass was not significant, while nest mass decreased along the ranked species from the smallest <i>Periparus ater</i> to the medium-sized <i>Cyanistes caeruleus</i> and the largest <i>Parus major</i>. …”
15. Biases in larger populations.
Published 2025. “…Threshold parameter <i>c</i> = −0.1 for the rectified cosine tuning with 4 neurons, and width <i>w</i> was 1 for von Mises tuning. …”
16. Classification model parameter settings.
Published 2025. “…Simultaneously, we designed a two-stage conditional encoding-decoding architecture that builds category-independent feature spaces from early training stages, fundamentally breaking the feature space bias caused by the “Matthew effect” and effectively preventing majority classes from compressing minority class features during generation. …”
18. PCA-CGAN model parameter settings.
Published 2025. “…Simultaneously, we designed a two-stage conditional encoding-decoding architecture that builds category-independent feature spaces from early training stages, fundamentally breaking the feature space bias caused by the “Matthew effect” and effectively preventing majority classes from compressing minority class features during generation. …”