Search alternatives:
marked decrease » marked increase
mae decrease » mean decrease, rate decreased, _ decrease
we decrease » _ decrease, nn decrease, mean decrease
a decrease » _ decrease, _ decreased, _ decreases
shows mae » shows a, show me
1. [no result text captured]
2. [no result text captured]
3. The MAE value of the model under raw data.
   Published 2025: “…Subsequently, STL decomposition decoupled the series into trend, seasonal, and residual components for component-specific modeling, achieving a 22.6% reduction in average MAE compared to raw data modeling. Further integration of Spearman correlation analysis and PCA dimensionality reduction created multidimensional feature sets, revealing substantial accuracy improvements: The BiLSTM model achieved an 83.6% cumulative MAE reduction from 1.65 (raw data) to 0.27 (STL-PCA), while traditional models like Prophet showed an 82.2% MAE decrease after feature engineering optimization. …”
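The abstract snippet for result 3 describes the article's preprocessing pipeline: STL decomposition into trend, seasonal, and residual components, followed by Spearman correlation screening and PCA dimensionality reduction. As a rough illustration only (the article's actual data, feature set, seasonal period, and thresholds are not given in the snippet), the same steps can be sketched with standard Python libraries:

    # Illustrative sketch only: STL decomposition, Spearman screening, and PCA
    # on a synthetic monthly series. The seasonal period (12), the candidate
    # features, and the 0.3 correlation cutoff are assumptions, not values
    # taken from the article.
    import numpy as np
    import pandas as pd
    from scipy.stats import spearmanr
    from sklearn.decomposition import PCA
    from statsmodels.tsa.seasonal import STL

    rng = np.random.default_rng(0)
    idx = pd.date_range("2015-01-01", periods=120, freq="MS")
    y = pd.Series(0.05 * np.arange(120)
                  + np.sin(np.arange(120) * 2 * np.pi / 12)
                  + rng.normal(scale=0.3, size=120), index=idx, name="target")

    # 1) Decouple the series into trend, seasonal, and residual components.
    stl = STL(y, period=12).fit()
    components = pd.DataFrame({"trend": stl.trend,
                               "seasonal": stl.seasonal,
                               "resid": stl.resid})

    # 2) Screen candidate features by Spearman correlation with the target.
    features = pd.DataFrame({"lag_1": y.shift(1),
                             "rolling_mean_3": y.rolling(3).mean(),
                             "noise": rng.normal(size=120)}, index=idx).dropna()
    keep = [c for c in features
            if abs(spearmanr(features[c], y.loc[features.index])[0]) > 0.3]

    # 3) Compress the retained features with PCA, keeping enough components
    #    to explain 95% of the variance (again an assumption).
    pcs = PCA(n_components=0.95).fit_transform(features[keep])
    print(components.head(), pcs.shape)

The decomposed components and the PCA-compressed features would then feed the downstream forecasting models (BiLSTM, Prophet, etc.) whose MAE figures the snippet reports.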
4. [no result text captured]
5. DataSheet1_Decreasing viscosity and increasing accessible load by replacing classical diluents with a hydrotrope in liquid–liquid extraction.docx
   Published 2025: “…We show that using hydrotropes as a diluent decreases the viscosity of solutions by more than a factor of ten, even under high load by extracted cations. …”
6. Maternal group B Streptococcus decreases infant length and alters the early-life microbiome: a prospective cohort study
   Published 2024: “…This study aimed to explore the effects of maternal vaginal GBS during pregnancy on early infant growth, microbiome, and metabolomics. We recruited and classified 453 pregnant women from southern China into GBS or healthy groups based on GBS vaginal colonization. …”
7. Participants enrollment.
   Published 2025: “…Hypertension is a widespread and life-threatening condition affecting one-third of adults globally. …”
8. KAP assessment scores (n = 422).
   Published 2025; same abstract snippet as result 7.
Results 9–20 below were all published in 2025 and carry the same abstract snippet as result 3; only their titles differ.
9. Testing set error.
10. Internal structure of an LSTM cell.
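Result 10 refers to a figure of the LSTM cell's internal structure. For orientation only, the standard textbook LSTM gate equations are reproduced below in conventional notation; this is not the article's own formulation:

    \begin{aligned}
    f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{forget gate} \\
    i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{input gate} \\
    o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{output gate} \\
    \tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{candidate cell state} \\
    c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{cell state update} \\
    h_t &= o_t \odot \tanh(c_t) && \text{hidden state}
    \end{aligned}

A bidirectional LSTM (Bi-LSTM, results 10 and 17) runs one such cell forward and another backward over the sequence and concatenates the two hidden states.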
11. Prediction effect of each model after STL.
12. The kernel density plot for data of each feature.
13. Analysis of raw data prediction results.
14. Flowchart of the STL.
15. SARIMA predicts season components.
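Result 15 indicates that the seasonal component is forecast with SARIMA. A minimal sketch using statsmodels' SARIMAX, assuming a monthly period of 12 and placeholder orders (the article's actual configuration is not given in the snippet):

    # Illustrative only: fit a SARIMA model to a synthetic seasonal component
    # (e.g., the seasonal output of an STL fit) and forecast 12 steps ahead.
    # The orders (1, 0, 1) x (1, 1, 1, 12) are placeholders, not the article's.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    idx = pd.date_range("2015-01-01", periods=120, freq="MS")
    seasonal = pd.Series(np.sin(np.arange(120) * 2 * np.pi / 12)
                         + 0.05 * np.random.default_rng(1).normal(size=120),
                         index=idx)

    fit = SARIMAX(seasonal, order=(1, 0, 1),
                  seasonal_order=(1, 1, 1, 12)).fit(disp=False)
    print(fit.forecast(steps=12))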
16. BWO-BiLSTM model prediction results.
17. Bi-LSTM architecture diagram.
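Result 17 refers to the Bi-LSTM architecture. A minimal Keras sketch of a bidirectional LSTM regressor over sliding windows follows; the layer width, window length, and optimizer are assumptions, not the article's settings:

    # Illustrative Keras sketch of a Bi-LSTM regressor. 64 units, a window of
    # 12 time steps, and the Adam optimizer are assumed; only the use of MAE
    # mirrors the metric reported in the abstract snippet.
    import tensorflow as tf

    window, n_features = 12, 1
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(window, n_features)),
        tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mae")
    model.summary()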
18. STL Linear Combination Forecast Graph.
19. LOSS curves for BWO-BiLSTM model training.
20. Analysis of STL-PCA prediction results.