-
241
Aedes aegypti database from SAGO traps.
Published 2025: “…IVM treatment reduced the number of females per trap per week from 3.29 ± 0.24 to 2.41 ± 0.20 (33.7% reduction), AGO from 1.58 ± 0.17 to 0.25 ± 0.05 (85.2% reduction), and AGO + IVM from 1.49 ± 0.17 to 0.53 ± 0.08 (67.78% reduction), based on Henderson’s formula. We observed a non-significant increase in mosquito populations in the control area (no treatment provided), from 2.94 ± 0.24 in the pretreatment period to 3.25 ± 0.28 in the post-treatment period.…”
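Henderson’s formula cited in this snippet is the standard Henderson-Tilton correction, which scales the treated-area change by the change seen in the untreated control area. A minimal sketch, assuming the quoted pre/post means (control 2.94 to 3.25), reproduces the IVM and AGO + IVM figures; the small gap to the quoted 85.2% for AGO presumably reflects rounding of the reported means:

```python
def henderson_reduction(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Henderson-Tilton percent reduction: the change in the treated area,
    corrected by the change observed in the untreated control area."""
    return 100.0 * (1.0 - (treat_post / treat_pre) * (ctrl_pre / ctrl_post))

# Means quoted in the snippet (females per trap per week); control went 2.94 -> 3.25.
print(henderson_reduction(3.29, 2.41, 2.94, 3.25))  # IVM:       ~33.7 %
print(henderson_reduction(1.58, 0.25, 2.94, 3.25))  # AGO:       ~85.7 %
print(henderson_reduction(1.49, 0.53, 2.94, 3.25))  # AGO + IVM: ~67.8 %
```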
-
242
Data for the G7 countries in 2022.
Published 2024: “…The results indicate that only France displays a significant negative trend and thus a continuous decrease in the level of alcohol consumption.…”
-
243
Time series plots: Alcohol consumption.
Published 2024: “…The results indicate that only France displays a significant negative trend and thus a continuous decrease in the level of alcohol consumption.…”
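The “significant negative trend” wording in these two entries refers to a trend test on the yearly consumption series. A minimal sketch, assuming an ordinary least-squares slope test (the actual method and data are not given in the snippet), with made-up values standing in for the French series:

```python
import numpy as np
from scipy import stats

# Hypothetical yearly per-capita consumption (litres of pure alcohol); NOT the real French series.
years = np.arange(2000, 2023)
rng = np.random.default_rng(0)
consumption = 13.0 - 0.08 * (years - years[0]) + rng.normal(0, 0.15, years.size)

# OLS slope and its p-value: a significantly negative slope supports a continuous decrease.
fit = stats.linregress(years, consumption)
print(f"slope = {fit.slope:+.3f} L/year, p = {fit.pvalue:.2e}")
```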
-
244
-
245
-
246
Geometric manifold comparison visualization
Published 2025: “…In this work, we propose to use a generative non-linear deep learning model, a disentangled variational autoencoder (DSVAE), that factorizes out window-specific (context) information from timestep-specific (local) information.…”
-
247
Hyperparameter ranges
Published 2025: “…In this work, we propose to use a generative non-linear deep learning model, a disentangled variational autoencoder (DSVAE), that factorizes out window-specific (context) information from timestep-specific (local) information.…”
-
248
Convolutional vs RNN context encoder
Published 2025: “…In this work, we propose to use a generative non-linear deep learning model, a disentangled variational autoencoder (DSVAE), that factorizes out window-specific (context) information from timestep-specific (local) information.…”
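The three entries above come from a paper proposing a DSVAE that separates a per-window context latent from per-timestep local latents. The snippet gives no architectural details, so the PyTorch-style sketch below only illustrates that factorization; the GRU context encoder, layer sizes, and latent dimensions are assumptions (the "Convolutional vs RNN context encoder" entry suggests either encoder type is possible):

```python
import torch
import torch.nn as nn

class DSVAESketch(nn.Module):
    """Illustrative disentangled sequential VAE: one 'context' latent per window,
    one 'local' latent per timestep. All sizes and layer choices are assumptions,
    not the architecture from the cited paper."""

    def __init__(self, x_dim=8, ctx_dim=16, loc_dim=8, hidden=64):
        super().__init__()
        self.ctx_enc = nn.GRU(x_dim, hidden, batch_first=True)  # window-level summary
        self.ctx_head = nn.Linear(hidden, 2 * ctx_dim)          # -> (mu, logvar) of context
        self.loc_enc = nn.Linear(x_dim + ctx_dim, 2 * loc_dim)  # per-timestep, conditioned on context
        self.dec = nn.Sequential(nn.Linear(ctx_dim + loc_dim, hidden),
                                 nn.ReLU(), nn.Linear(hidden, x_dim))

    @staticmethod
    def reparameterize(mu, logvar):
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    def forward(self, x):                    # x: (batch, time, x_dim)
        _, h = self.ctx_enc(x)               # h: (1, batch, hidden)
        ctx_mu, ctx_logvar = self.ctx_head(h[-1]).chunk(2, dim=-1)
        z_ctx = self.reparameterize(ctx_mu, ctx_logvar)          # (batch, ctx_dim)

        z_ctx_t = z_ctx.unsqueeze(1).expand(-1, x.size(1), -1)   # broadcast context over time
        loc_mu, loc_logvar = self.loc_enc(torch.cat([x, z_ctx_t], -1)).chunk(2, -1)
        z_loc = self.reparameterize(loc_mu, loc_logvar)          # (batch, time, loc_dim)

        recon = self.dec(torch.cat([z_ctx_t, z_loc], -1))        # reconstruct each timestep
        return recon, (ctx_mu, ctx_logvar), (loc_mu, loc_logvar)

x = torch.randn(4, 50, 8)                    # 4 windows of 50 timesteps
recon, ctx_stats, loc_stats = DSVAESketch()(x)
```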
-
249
-
250
Average days to heal a wound in 2022 and 2023.
Published 2025: “…DUs also improved, with area reduction increasing from 4.8 cm² to 15.3 cm² and a 23.8% faster reduction time, while larger DUs (>2 cm²) saw a 32.6-day decrease in time to improvement.…”
-
251
-
252
-
253
-
254
-
255
Comparison of questionnaire scores among genotypes for the significant associations.
Published 2025
-
256
-
257
Detailed information of the observation datasets.
Published 2025: “…On longer time scales (6–24 hours), the score and correlation between ERA5 and observations further increased, while the centered root-mean-square error (CRMSE) and standard deviation decreased. 4) Hourly wind data with a regular spatial distribution in ERA5 reanalysis provides valuable information for further detailed research on meteorology or renewable energy perspectives, but some inherent shortcomings should be considered.…”
-
258
General technical specification for GW154/6700.
Published 2025: “…On longer time scales (6–24 hours), the score and correlation between ERA5 and observations further increased, while the centered root-mean-square error (CRMSE) and standard deviation decreased. 4) Hourly wind data with a regular spatial distribution in ERA5 reanalysis provides valuable information for further detailed research on meteorology or renewable energy perspectives, but some inherent shortcomings should be considered.…”
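Correlation, standard deviation, and centered root-mean-square error (CRMSE) as quoted in these two entries are the usual Taylor-diagram statistics for comparing a reanalysis series with observations. A minimal sketch with placeholder arrays (not actual ERA5 or station data):

```python
import numpy as np

def taylor_stats(model, obs):
    """Correlation, standard deviations, and centered RMSE (CRMSE) between
    a model series (e.g. ERA5 wind speed) and co-located observations."""
    m, o = np.asarray(model, float), np.asarray(obs, float)
    r = np.corrcoef(m, o)[0, 1]
    crmse = np.sqrt(np.mean(((m - m.mean()) - (o - o.mean())) ** 2))
    return r, m.std(), o.std(), crmse

# Placeholder hourly wind speeds (m/s); NOT real ERA5 or tower data.
rng = np.random.default_rng(0)
obs = 6 + rng.normal(0, 2, 1000)
era5 = obs + rng.normal(0, 1, 1000)          # ERA5-like series tracking the obs with extra noise
print(taylor_stats(era5, obs))
```

Averaging both series over longer (6–24 hour) windows before computing these statistics typically raises the correlation and lowers the CRMSE, which is the behaviour the snippet describes.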
-
259
-
260