Showing 1,101 - 1,120 results of 21,342 for search '(( significance ((test decrease) OR (greatest decrease)) ) OR ( significant decrease decrease ))', query time: 0.52s
  15. 1115

    Markov model. by Yiping An (20609789)

    Published 2025
    Subjects:
  19. 1119

    Internal structure of an LSTM cell. by Xiangjuan Liu (618000)

    Published 2025
    “…Further integration of Spearman correlation analysis and PCA dimensionality reduction created multidimensional feature sets, revealing substantial accuracy improvements: The BiLSTM model achieved an 83.6% cumulative MAE reduction from 1.65 (raw data) to 0.27 (STL-PCA), while traditional models like Prophet showed an 82.2% MAE decrease after feature engineering optimization. Finally, the Beluga Whale Optimization (BWO)-tuned STL-PCA-BWO-BiLSTM hybrid model delivered optimal performance on test sets (RMSE = 0.22, MAE = 0.16, MAPE = 0.99%), exhibiting 40.7% higher accuracy than unoptimized BiLSTM (MAE = 0.27). …”
  20. 1120

    Prediction effect of each model after STL. by Xiangjuan Liu (618000)

    Published 2025
    “…Further integration of Spearman correlation analysis and PCA dimensionality reduction created multidimensional feature sets, revealing substantial accuracy improvements: The BiLSTM model achieved an 83.6% cumulative MAE reduction from 1.65 (raw data) to 0.27 (STL-PCA), while traditional models like Prophet showed an 82.2% MAE decrease after feature engineering optimization. Finally, the Beluga Whale Optimization (BWO)-tuned STL-PCA-BWO-BiLSTM hybrid model delivered optimal performance on test sets (RMSE = 0.22, MAE = 0.16, MAPE = 0.99%), exhibiting 40.7% higher accuracy than unoptimized BiLSTM (MAE = 0.27). …”
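
The two entries above (results 1119 and 1120) quote an abstract describing an STL-PCA-BWO-BiLSTM forecasting pipeline. The Python sketch below illustrates what such a pipeline can look like under stated assumptions: the synthetic series, the Spearman screening threshold, the PCA component count, the window length, and the BiLSTM layer sizes are illustrative choices of ours rather than the authors' configuration, and the Beluga Whale Optimization hyperparameter search is omitted.

    # Hedged sketch of an STL -> Spearman screening -> PCA -> BiLSTM pipeline.
    # All data, thresholds, and sizes here are illustrative assumptions; the
    # BWO tuning step from the abstract is not reproduced.
    import numpy as np
    import torch
    import torch.nn as nn
    from scipy.stats import spearmanr
    from sklearn.decomposition import PCA
    from statsmodels.tsa.seasonal import STL

    rng = np.random.default_rng(0)
    t = np.arange(600)
    target = 10 + 0.01 * t + 2 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, t.size)
    exog = np.column_stack([target + rng.normal(0, 1, t.size) for _ in range(8)])  # stand-in covariates

    # 1) STL decomposition of the target into trend / seasonal / residual components.
    stl = STL(target, period=12).fit()
    components = np.column_stack([stl.trend, stl.seasonal, stl.resid])

    # 2) Keep only covariates with a noticeable Spearman correlation to the target.
    keep = [j for j in range(exog.shape[1]) if abs(spearmanr(exog[:, j], target)[0]) > 0.3]

    # 3) PCA-compress the retained covariates and stack them with the STL components.
    pca = PCA(n_components=3)
    features = np.hstack([components, pca.fit_transform(exog[:, keep])])

    # 4) Sliding windows feeding a small BiLSTM regressor (illustrative sizes).
    win = 24
    X = np.stack([features[i:i + win] for i in range(len(features) - win)])
    y = target[win:]

    class BiLSTM(nn.Module):
        def __init__(self, n_feat, hidden=32):
            super().__init__()
            self.lstm = nn.LSTM(n_feat, hidden, batch_first=True, bidirectional=True)
            self.head = nn.Linear(2 * hidden, 1)

        def forward(self, x):
            out, _ = self.lstm(x)                       # (batch, win, 2*hidden)
            return self.head(out[:, -1]).squeeze(-1)    # predict from the last time step

    model = BiLSTM(features.shape[1])
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    Xt = torch.tensor(X, dtype=torch.float32)
    yt = torch.tensor(y, dtype=torch.float32)
    for _ in range(50):                                  # short demo training loop
        opt.zero_grad()
        loss = nn.functional.l1_loss(model(Xt), yt)      # MAE, matching the metric quoted above
        loss.backward()
        opt.step()
    print(f"train MAE: {loss.item():.3f}")

The MAE loss is used here only because MAE is the metric quoted in the abstract; nothing about the authors' actual training objective or data is implied.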