Showing 1–20 of 5,453 results for search '(((( learning test decrease ) OR ( a larger decrease ))) OR ( i values decrease ))' (query time: 0.59s)
  6. The introduction of mutualisms into assembled communities increases their connectance and complexity while decreasing their richness. by Gui Araujo (22170819)

    Published 2025
    “…Parameter values: interaction strengths were drawn from a half-normal distribution of zero mean and a standard deviation of 0.2, and strength for consumers was made no larger than the strength for resources. …”
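The sampling scheme quoted above can be sketched as follows (a minimal illustration assuming NumPy; the variable names and the sample size of 100 are hypothetical, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.2  # standard deviation of the half-normal, per the quoted parameter values

# Half-normal draws with zero mean: absolute values of N(0, sigma^2) samples
resource_strengths = np.abs(rng.normal(0.0, sigma, size=100))
consumer_draws = np.abs(rng.normal(0.0, sigma, size=100))

# Consumer strengths made no larger than the corresponding resource strengths
consumer_strengths = np.minimum(consumer_draws, resource_strengths)
```

The elementwise `np.minimum` enforces the quoted constraint that consumer strength never exceeds resource strength.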
  10. Scheme of g-λ model with larger values of λ. by Zhanfeng Fan (20390992)

    Published 2024
    “…The stress-deformation model of the single uncoupled joint (g-λ model with λ ≥ 1) is employed to depict the nonlinearity of uncoupled joints, with a greater value of the parameter λ signifying a lower degree of non-linearity in the joint model curve. …”
  12. Biases in larger populations. by Sander W. Keemink (21253563)

    Published 2025
    “…(A) Maximum absolute bias vs the number of neurons in the population for the Bayesian decoder. …”
  19. The MAE value of the model under raw data. by Xiangjuan Liu (618000)

    Published 2025
    “…Further integration of Spearman correlation analysis and PCA dimensionality reduction created multidimensional feature sets, revealing substantial accuracy improvements: The BiLSTM model achieved an 83.6% cumulative MAE reduction from 1.65 (raw data) to 0.27 (STL-PCA), while traditional models like Prophet showed an 82.2% MAE decrease after feature engineering optimization. Finally, the Beluga Whale Optimization (BWO)-tuned STL-PCA-BWO-BiLSTM hybrid model delivered optimal performance on test sets (RMSE = 0.22, MAE = 0.16, MAPE = 0.99%), exhibiting 40.7% higher accuracy than unoptimized BiLSTM (MAE = 0.27). …”
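The percentage figures quoted in this abstract follow directly from the reported MAE values as relative decreases; a minimal arithmetic check (the helper name is illustrative, not from the paper):

```python
def pct_reduction(before, after):
    """Relative decrease from `before` to `after`, as a percentage."""
    return 100.0 * (before - after) / before

# BiLSTM: MAE falls from 1.65 (raw data) to 0.27 (STL-PCA)
print(round(pct_reduction(1.65, 0.27), 1))  # 83.6

# Hybrid model (MAE = 0.16) vs unoptimized BiLSTM (MAE = 0.27)
print(round(pct_reduction(0.27, 0.16), 1))  # 40.7
```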
  20. Three error values under raw data. by Xiangjuan Liu (618000)

    Published 2025