Showing 1,581 - 1,600 of 7,231 results for search '(( significant decrease decrease ) OR ( significantly reduced decrease ))~'
  1. 1581

    Definitions of variables and measurements. by Getachew Magnar Kitila (19935139)

    Published 2024
“…The empirical findings show that greater trade openness is associated with significantly higher CO2 emissions; additionally, they demonstrate that the influence is heterogeneous across different CO2 emission quantiles in African countries. …” (See the quantile regression sketch after the result list.)
  2. 1582

    Analysis of STL-PCA prediction results. by Xiangjuan Liu (618000)

    Published 2025
    “…First, seven benchmark models including Prophet, ARIMA, and LSTM were applied to raw price series, where results demonstrated that deep learning models significantly outperformed traditional methods. Subsequently, STL decomposition decoupled the series into trend, seasonal, and residual components for component-specific modeling, achieving a 22.6% reduction in average MAE compared to raw data modeling. …”
  3. 1583

    Accumulated contribution rate of PCA. by Xiangjuan Liu (618000)

    Published 2025
    “…First, seven benchmark models including Prophet, ARIMA, and LSTM were applied to raw price series, where results demonstrated that deep learning models significantly outperformed traditional methods. Subsequently, STL decomposition decoupled the series into trend, seasonal, and residual components for component-specific modeling, achieving a 22.6% reduction in average MAE compared to raw data modeling. …”
  4. 1584

    Regression estimates: Double threshold model. by Getachew Magnar Kitila (19935139)

    Published 2024
“…The empirical findings show that greater trade openness is associated with significantly higher CO2 emissions; additionally, they demonstrate that the influence is heterogeneous across different CO2 emission quantiles in African countries. …”
  5. 1585

    Figure of ablation experiment. by Xiangjuan Liu (618000)

    Published 2025
    “…First, seven benchmark models including Prophet, ARIMA, and LSTM were applied to raw price series, where results demonstrated that deep learning models significantly outperformed traditional methods. Subsequently, STL decomposition decoupled the series into trend, seasonal, and residual components for component-specific modeling, achieving a 22.6% reduction in average MAE compared to raw data modeling. …”
  6. 1586

    Results from cross sectional dependence test. by Getachew Magnar Kitila (19935139)

    Published 2024
“…The empirical findings show that greater trade openness is associated with significantly higher CO2 emissions; additionally, they demonstrate that the influence is heterogeneous across different CO2 emission quantiles in African countries. …”
  7. 1587

    Flowchart of the STL-PCA-BWO-BiLSTM model. by Xiangjuan Liu (618000)

    Published 2025
    “…First, seven benchmark models including Prophet, ARIMA, and LSTM were applied to raw price series, where results demonstrated that deep learning models significantly outperformed traditional methods. Subsequently, STL decomposition decoupled the series into trend, seasonal, and residual components for component-specific modeling, achieving a 22.6% reduction in average MAE compared to raw data modeling. …”
  8. 1588

    Parameter optimization results of BiLSTM. by Xiangjuan Liu (618000)

    Published 2025
    “…First, seven benchmark models including Prophet, ARIMA, and LSTM were applied to raw price series, where results demonstrated that deep learning models significantly outperformed traditional methods. Subsequently, STL decomposition decoupled the series into trend, seasonal, and residual components for component-specific modeling, achieving a 22.6% reduction in average MAE compared to raw data modeling. …”
  9. 1589

    Descriptive statistical analysis of data. by Xiangjuan Liu (618000)

    Published 2025
    “…First, seven benchmark models including Prophet, ARIMA, and LSTM were applied to raw price series, where results demonstrated that deep learning models significantly outperformed traditional methods. Subsequently, STL decomposition decoupled the series into trend, seasonal, and residual components for component-specific modeling, achieving a 22.6% reduction in average MAE compared to raw data modeling. …”
  10. 1590

    The MAE value of the model under raw data. by Xiangjuan Liu (618000)

    Published 2025
    “…First, seven benchmark models including Prophet, ARIMA, and LSTM were applied to raw price series, where results demonstrated that deep learning models significantly outperformed traditional methods. Subsequently, STL decomposition decoupled the series into trend, seasonal, and residual components for component-specific modeling, achieving a 22.6% reduction in average MAE compared to raw data modeling. …”
  11. 1591

    Three error values under raw data. by Xiangjuan Liu (618000)

    Published 2025
    “…First, seven benchmark models including Prophet, ARIMA, and LSTM were applied to raw price series, where results demonstrated that deep learning models significantly outperformed traditional methods. Subsequently, STL decomposition decoupled the series into trend, seasonal, and residual components for component-specific modeling, achieving a 22.6% reduction in average MAE compared to raw data modeling. …”
  12. 1592

    Panel quantile regression results. by Getachew Magnar Kitila (19935139)

    Published 2024
“…The empirical findings show that greater trade openness is associated with significantly higher CO2 emissions; additionally, they demonstrate that the influence is heterogeneous across different CO2 emission quantiles in African countries. …”
  13. 1593

Decomposition of time series plot. by Xiangjuan Liu (618000)

    Published 2025
    “…First, seven benchmark models including Prophet, ARIMA, and LSTM were applied to raw price series, where results demonstrated that deep learning models significantly outperformed traditional methods. Subsequently, STL decomposition decoupled the series into trend, seasonal, and residual components for component-specific modeling, achieving a 22.6% reduction in average MAE compared to raw data modeling. …”
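
The Xiangjuan Liu (2025) entries above all excerpt the same STL-PCA-BWO-BiLSTM pipeline: STL decomposition splits the raw price series into trend, seasonal, and residual components, which are then modeled separately, and the excerpt reports a 22.6% reduction in average MAE relative to modeling the raw data. The minimal sketch below illustrates only the STL step with statsmodels; the synthetic daily series, the weekly period, and the reconstruction check are assumptions for illustration, not the authors' data, settings, or code.

```python
# Minimal sketch of the STL step described in the Liu (2025) excerpt:
# decompose a price series into trend, seasonal, and residual components
# so each component can be modeled separately. The synthetic series and
# the weekly period are illustrative assumptions, not the authors' data.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

# Hypothetical daily price series with a weekly cycle (assumption).
dates = pd.date_range("2023-01-01", periods=365, freq="D")
prices = pd.Series(
    50 + 0.02 * np.arange(365)                       # slow upward trend
    + 2.0 * np.sin(2 * np.pi * np.arange(365) / 7)   # weekly seasonality
    + np.random.default_rng(0).normal(0, 0.5, 365),  # noise
    index=dates,
)

# STL decomposition; period=7 assumes weekly seasonality.
result = STL(prices, period=7, robust=True).fit()
trend, seasonal, resid = result.trend, result.seasonal, result.resid

# Component-specific models (e.g. a BiLSTM per component, as the excerpt
# describes) would replace this placeholder; here we only verify that the
# components add back up to the original series.
reconstruction = trend + seasonal + resid
print(f"max reconstruction error: {np.abs(reconstruction - prices).max():.2e}")
```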
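
The Getachew Magnar Kitila (2024) entries above excerpt a panel quantile regression finding: greater trade openness is associated with higher CO2 emissions, with an influence that varies across emission quantiles. The sketch below conveys the general idea with a pooled quantile regression in statsmodels on synthetic data; the variables, coefficients, and pooled specification are illustrative assumptions and do not reproduce the authors' panel quantile estimator or dataset.

```python
# Minimal sketch of quantile regression in the spirit of the Kitila (2024)
# excerpt: the association between trade openness and CO2 emissions can
# differ across emission quantiles. Pooled quantile regression on synthetic
# data (assumption), not the authors' panel estimator or specification.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical country-year observations (assumption): emissions respond
# more strongly to openness in the upper tail of the distribution.
rng = np.random.default_rng(42)
n = 600
openness = rng.uniform(20, 120, n)   # trade openness, % of GDP
gdp = rng.normal(8, 1, n)            # log GDP per capita
noise = rng.gumbel(0, 1.5, n)        # right-skewed disturbances
co2 = 0.5 + 0.02 * openness + 0.3 * gdp + 0.01 * openness * (noise > 1) + noise
df = pd.DataFrame({"co2": co2, "openness": openness, "gdp": gdp})

# Estimate the openness coefficient at several quantiles of the CO2
# distribution; variation across tau mirrors the reported heterogeneity.
for tau in (0.1, 0.25, 0.5, 0.75, 0.9):
    fit = smf.quantreg("co2 ~ openness + gdp", df).fit(q=tau)
    print(f"tau={tau:.2f}  openness coef={fit.params['openness']:.4f}")
```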