1215
Upper-crust thermal evolution of the Patagonian Precordillera basement (Argentina): insights from fission track, (U-Th)/He thermochronology and geodynamic significance
Published 2025. “…Most thermal models show similar decreasing time-temperature paths (t-T), from which three stages are distinguished. …”

1216
S1 File
Published 2025. “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”

1217
Confusion matrix for ClinicalBERT model.
Published 2025. “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”

1218
Confusion matrix for LastBERT model.
Published 2025. “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”

1219
Student model architecture.
Published 2025. “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”

1220
Configuration of the LastBERT model.
Published 2025. “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
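
The LastBERT entries above (results 1216-1220) repeatedly cite a distilled student BERT whose parameter count drops from roughly 110 million (BERT base) to about 29 million. The published configuration is not given in these snippets, so the sketch below is only an illustration of how such a reduced student BERT could be instantiated and its parameter count checked; the hyperparameter values are assumptions, not the paper's actual LastBERT settings.

# Illustrative sketch only: the snippets describe LastBERT as a distilled student
# BERT with ~29M parameters (vs. ~110M for BERT base), but its exact published
# hyperparameters are not given here; the values below are assumptions.
from transformers import BertConfig, BertModel

# Hypothetical reduced configuration (not the paper's actual LastBERT settings).
student_config = BertConfig(
    hidden_size=384,          # BERT base uses 768
    num_hidden_layers=6,      # BERT base uses 12
    num_attention_heads=6,    # BERT base uses 12
    intermediate_size=1536,   # BERT base uses 3072
)

student = BertModel(student_config)
n_params = sum(p.numel() for p in student.parameters())
print(f"Student parameters: {n_params / 1e6:.1f}M")  # tens of millions, vs. ~110M for BERT base

In an actual distillation setup, a student configured this way would then be trained to match the teacher's outputs; the figure titles above (student model architecture, configuration of the LastBERT model) indicate the paper documents its own reduced configuration in detail.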