Showing 1 - 20 results of 2,215 for search '(( i ((largest decrease) OR (marked decrease)) ) OR ( shows mae decrease ))', query time: 0.63s
  3. The MAE value of the model under raw data. by Xiangjuan Liu (618000)

    Published 2025
    “…Subsequently, STL decomposition decoupled the series into trend, seasonal, and residual components for component-specific modeling, achieving a 22.6% reduction in average MAE compared to raw data modeling. Further integration of Spearman correlation analysis and PCA dimensionality reduction created multidimensional feature sets, revealing substantial accuracy improvements: The BiLSTM model achieved an 83.6% cumulative MAE reduction from 1.65 (raw data) to 0.27 (STL-PCA), while traditional models like Prophet showed an 82.2% MAE decrease after feature engineering optimization. …”
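The snippet under result 3 describes a pipeline of STL decomposition, per-component modeling, and Spearman/PCA feature construction evaluated by MAE. The article's own code is not part of this record, so the following is a minimal sketch of that general workflow under stated assumptions: a synthetic monthly series, period=12, and naive per-component forecasts standing in for the BiLSTM and Prophet models named in the snippet.

```python
# Minimal sketch: STL decomposition, per-component forecasting, and MAE comparison.
# Assumptions (not taken from the article): synthetic monthly data, period=12,
# and simple extrapolations in place of the learned BiLSTM/Prophet forecasters.
import numpy as np
from statsmodels.tsa.seasonal import STL
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
t = np.arange(240)
y = 0.05 * t + 2.0 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, t.size)
train, test = y[:-24], y[-24:]

# Baseline on raw data: naive seasonal forecast (repeat the last observed year).
raw_pred = np.tile(train[-12:], 2)
mae_raw = mean_absolute_error(test, raw_pred)

# STL route: decouple trend / seasonal / residual, forecast each, then recombine.
res = STL(train, period=12).fit()
slope = (res.trend[-1] - res.trend[-13]) / 12.0          # average recent trend slope
trend_pred = res.trend[-1] + slope * np.arange(1, 25)    # linear trend extrapolation
seasonal_pred = np.tile(res.seasonal[-12:], 2)           # repeat the last seasonal cycle
resid_pred = np.zeros(24)                                # residual treated as mean-zero noise
stl_pred = trend_pred + seasonal_pred + resid_pred
mae_stl = mean_absolute_error(test, stl_pred)

print(f"MAE raw: {mae_raw:.3f}  MAE after STL: {mae_stl:.3f}  "
      f"reduction: {100 * (mae_raw - mae_stl) / mae_raw:.1f}%")
```

In the article's setting the per-component baselines would be replaced by the learned forecasters, and the Spearman correlation / PCA step would supply the multidimensional feature sets those models consume.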
  9. S1 File - by Hongyu Li (1332669)

    Published 2025
    “…Following the overexpression of miRNA 221 in myocardium, there was a marked alleviation of myocardial injury and cardiomyocyte apoptosis and necrosis, significant enhancement of left ventricular systolic function, and marked decrease in the levels of PLB, p-PLB (Ser16), p-PLB (Thr17), caspase 3 and Cyt C, as well as a significant decrease in total calcium levels in myocardium.…”
  17. Testing set error. by Xiangjuan Liu (618000)

    Published 2025
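Result 17's caption refers to testing-set error; the metric quoted throughout these records is MAE, the mean absolute deviation between held-out observations and predictions. A small sketch (illustrative numbers only, not the article's data):

```python
# Testing-set MAE: average absolute gap between observed and predicted values.
# The arrays below are illustrative only.
import numpy as np
from sklearn.metrics import mean_absolute_error

y_true = np.array([3.1, 2.8, 3.5, 4.0, 3.7])   # held-out observations
y_pred = np.array([3.0, 3.0, 3.3, 4.2, 3.5])   # model predictions on the test set

mae_manual = np.mean(np.abs(y_true - y_pred))
assert np.isclose(mae_manual, mean_absolute_error(y_true, y_pred))
print(f"testing-set MAE = {mae_manual:.3f}")    # 0.180
```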
  18. Internal structure of an LSTM cell. by Xiangjuan Liu (618000)

    Published 2025
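Result 18's caption refers to the internal structure of an LSTM cell. As a reference point, one time step of an LSTM cell can be written as below; this is the generic textbook formulation with illustrative shapes and random weights, not the article's exact configuration.

```python
# One LSTM cell step with the standard gates (forget f, input i, candidate g, output o).
# Shapes and random weights are illustrative; real models learn W, U, b.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """x_t: (d_in,), h_prev/c_prev: (d_h,), W: (4*d_h, d_in), U: (4*d_h, d_h), b: (4*d_h,)."""
    z = W @ x_t + U @ h_prev + b
    d_h = h_prev.shape[0]
    f = sigmoid(z[0 * d_h:1 * d_h])        # forget gate
    i = sigmoid(z[1 * d_h:2 * d_h])        # input gate
    g = np.tanh(z[2 * d_h:3 * d_h])        # candidate cell state
    o = sigmoid(z[3 * d_h:4 * d_h])        # output gate
    c_t = f * c_prev + i * g               # new cell state
    h_t = o * np.tanh(c_t)                 # new hidden state
    return h_t, c_t

rng = np.random.default_rng(0)
d_in, d_h = 3, 4
h, c = np.zeros(d_h), np.zeros(d_h)
W = rng.normal(size=(4 * d_h, d_in))
U = rng.normal(size=(4 * d_h, d_h))
b = np.zeros(4 * d_h)
h, c = lstm_step(rng.normal(size=d_in), h, c, W, U, b)
print(h.shape, c.shape)  # (4,) (4,)
```

A BiLSTM simply runs two such cells over the sequence in opposite directions and concatenates their hidden states at each step.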
  19. Prediction effect of each model after STL. by Xiangjuan Liu (618000)

    Published 2025
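Result 19's caption concerns each model's prediction effect after STL; the percentage improvements quoted in the result-3 snippet follow from the usual relative-reduction formula. Using the BiLSTM numbers quoted there (the 82.2% Prophet figure is computed the same way):

```python
# Relative MAE reduction, using the values quoted in the search snippet (1.65 -> 0.27).
mae_raw, mae_stl_pca = 1.65, 0.27
reduction = (mae_raw - mae_stl_pca) / mae_raw
print(f"cumulative MAE reduction: {100 * reduction:.1f}%")   # 83.6%
```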
  20. The kernel density plot for data of each feature. by Xiangjuan Liu (618000)

    Published 2025
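Result 20's caption mentions a kernel density plot for each feature. A minimal sketch of how such a per-feature density estimate is usually produced (the synthetic data and SciPy's default bandwidth are assumptions here, not the article's settings):

```python
# Kernel density estimate for a single feature column (Gaussian kernel,
# SciPy's default Scott bandwidth). The bimodal sample below is synthetic.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
feature = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(4.0, 0.5, 200)])

kde = gaussian_kde(feature)                       # fit the density estimate
grid = np.linspace(feature.min(), feature.max(), 200)
density = kde(grid)                               # plot grid vs. density to get the KDE curve

approx_area = np.sum(density) * (grid[1] - grid[0])
print(f"density integrates to ~{approx_area:.2f}")  # close to 1
```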