Search alternatives:
processes optimization » process optimization, process optimisation, property optimization
model optimization » codon optimization, global optimization, based optimization
image processes » damage processes, image processing, change processes
binary a » binary _, binary b, hilary a
a model » _ model
-
81
Results of AdaBoost.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
82
Results of Random Forest.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
83
Before upsampling.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
84
Results of gradient boosting classifier.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
85
Results of Decision tree.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
86
AdaBoost classifier results.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
87
Results of LightGBM.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
88
Results of LightGBM.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
89
Feature selection process.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
90
Results of KNN.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
91
After upsampling.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
92
Results of Extra Trees.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
93
Gradient boosting classifier results.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
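Entries 81–93 above all point to the same 2024 study, which tunes classifier hyperparameters with a genetic algorithm and selects features with a binary grey wolf optimizer. As a rough illustration of the feature-selection step only, the sketch below runs a wrapper-style binary grey wolf optimization over a 0/1 feature mask; the dataset, classifier, fitness weighting, and population settings are assumptions made for the example, not details taken from the cited study.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)   # stand-in dataset, not the study's data
n_features = X.shape[1]
n_wolves, n_iters = 10, 20
err_weight = 0.99                            # assumed weight on classification error

def fitness(mask):
    # Wrapper fitness: KNN error on the selected features plus a small
    # penalty on the fraction of features kept.
    if mask.sum() == 0:
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(), X[:, mask == 1], y, cv=3).mean()
    return err_weight * (1 - acc) + (1 - err_weight) * mask.mean()

# Initialize the pack with random binary feature masks.
wolves = rng.integers(0, 2, size=(n_wolves, n_features))
scores = np.array([fitness(w) for w in wolves])

for t in range(n_iters):
    order = np.argsort(scores)
    alpha, beta, delta = wolves[order[:3]]   # three best wolves lead the update
    a = 2 - 2 * t / n_iters                  # exploration factor shrinks over time
    for i in range(n_wolves):
        step = np.zeros(n_features)
        for leader in (alpha, beta, delta):
            r1, r2 = rng.random(n_features), rng.random(n_features)
            A, C = 2 * a * r1 - a, 2 * r2
            D = np.abs(C * leader - wolves[i])
            step += leader - A * D
        step /= 3.0
        # A sigmoid transfer function turns the continuous position back into a 0/1 mask.
        prob = 1 / (1 + np.exp(np.clip(-10 * (step - 0.5), -50, 50)))
        wolves[i] = (rng.random(n_features) < prob).astype(int)
        scores[i] = fitness(wolves[i])

best = wolves[np.argmin(scores)]
print("selected feature indices:", np.flatnonzero(best))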
-
94
-
95
-
96
ROC curves for the test set of four models.
Published 2025 “…Objective: This study aimed to develop a risk prediction model for CI in CKD patients using machine learning algorithms, with the objective of enhancing risk prediction accuracy and facilitating early intervention. …”
-
97
The AD-PSO-Guided WOA LSTM framework.
Published 2025 “…Out of all the models, LSTM produced the best results. The AD-PSO-Guided WOA algorithm was used to adjust the hyperparameters for the LSTM model. …”
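The result above reports that an AD-PSO-Guided WOA algorithm tuned the LSTM's hyperparameters. The fragment below is only a plain particle swarm optimization loop over two hypothetical LSTM hyperparameters (hidden units and learning rate), not the AD-PSO-Guided WOA variant itself, and val_loss() is a stand-in for actually training the network and measuring validation loss.

import numpy as np

rng = np.random.default_rng(1)
bounds = np.array([[16.0, 256.0],      # hidden units (treated as continuous here)
                   [1e-4, 1e-1]])      # learning rate
n_particles, n_iters = 8, 15
w, c1, c2 = 0.7, 1.5, 1.5              # inertia and acceleration coefficients

def val_loss(params):
    # Hypothetical stand-in: a real run would train the LSTM with these
    # hyperparameters and return its validation loss.
    hidden, lr = params
    return (np.log2(hidden) - 6) ** 2 + (np.log10(lr) + 2.5) ** 2

# Random positions inside the bounds, zero initial velocities.
pos = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([val_loss(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)]

for _ in range(n_iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, bounds[:, 0], bounds[:, 1])
    vals = np.array([val_loss(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best hidden units ≈ %.0f, best learning rate ≈ %.4g" % tuple(gbest))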
-
98
-
99
Classification baseline performance.
Published 2025 “…To overcome these limitations, this study introduces a comprehensive deep learning framework enhanced with the innovative bio-inspired Ocotillo Optimization Algorithm (OcOA), designed to improve the accuracy and efficiency of bone marrow cell classification. …”
-
100
Feature selection results.
Published 2025 “…To overcome these limitations, this study introduces a comprehensive deep learning framework enhanced with the innovative bio-inspired Ocotillo Optimization Algorithm (OcOA), designed to improve the accuracy and efficiency of bone marrow cell classification. …”