81. Results of Decision tree.
Published 2024: "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"

82. AdaBoost classifier results.
Published 2024: "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"

83. Results of LightGBM.
Published 2024: "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"

84. Results of LightGBM.
Published 2024: "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"

85. Feature selection process.
Published 2024: "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"

86. Results of KNN.
Published 2024: "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"

87. After upsampling.
Published 2024: "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"

88. Results of Extra Trees.
Published 2024: "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"

89. Gradient boosting classifier results.
Published 2024: "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"
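Results 81-89 above all point to the same 2024 study, which pairs genetic-algorithm hyperparameter tuning with binary grey wolf feature selection. As an illustration of the feature-selection half only, the sketch below implements a simplified binary grey wolf optimizer wrapped around a KNN classifier; the synthetic dataset, population size, iteration count, transfer function, and fitness weighting are assumptions made to keep the example runnable, not the paper's settings.

# Simplified binary grey wolf optimizer for wrapper feature selection
# (illustrative only; not the cited paper's implementation or parameters).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=30, n_informative=8, random_state=0)

def fitness(mask):
    # Empty subsets score zero; otherwise CV accuracy minus a small size penalty.
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(), X[:, mask == 1], y, cv=3).mean()
    return acc - 0.01 * mask.mean()

n_wolves, n_iter, dim = 10, 20, X.shape[1]
wolves = rng.integers(0, 2, size=(n_wolves, dim))
scores = np.array([fitness(w) for w in wolves])

for t in range(n_iter):
    a = 2 - 2 * t / n_iter                      # control parameter shrinks from 2 to 0
    order = np.argsort(scores)[::-1]
    alpha, beta, delta = wolves[order[:3]]      # three best wolves lead the pack
    for i in range(n_wolves):
        step = np.zeros(dim)
        for leader in (alpha, beta, delta):
            A = a * (2 * rng.random(dim) - 1)
            C = 2 * rng.random(dim)
            D = np.abs(C * leader - wolves[i])
            step += leader - A * D
        step /= 3.0
        z = np.clip(10 * (step - 0.5), -35, 35)  # sigmoid transfer to a binary position
        prob = 1 / (1 + np.exp(-z))
        wolves[i] = (rng.random(dim) < prob).astype(int)
        scores[i] = fitness(wolves[i])

best = wolves[np.argmax(scores)]
print("selected features:", np.flatnonzero(best), "fitness:", scores.max())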
90.
91.
92. Confusion matrix.
Published 2025: "…This paper introduces a groundbreaking monitoring model tailored for sustainable trade activity surveillance, which synergistically integrates event-driven architecture with an intelligent decision tree. …"

93. Parameter settings.
Published 2025: "…This paper introduces a groundbreaking monitoring model tailored for sustainable trade activity surveillance, which synergistically integrates event-driven architecture with an intelligent decision tree. …"
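Results 92-93 refer to a 2025 decision-tree-based monitoring model reported through a confusion matrix and a parameter table. The snippet below is only the generic scikit-learn pattern for that kind of evaluation; the hyperparameter values and the synthetic data are placeholders, and the paper's event-driven architecture is not modelled here.

# Generic decision-tree training with a confusion matrix readout
# (placeholder data and parameters, not the cited model).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix, classification_report

X, y = make_classification(n_samples=500, n_features=12, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

# Hypothetical parameter settings; the real ones live in the paper's table.
clf = DecisionTreeClassifier(max_depth=6, min_samples_leaf=5, criterion="gini")
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)

print(confusion_matrix(y_te, pred))
print(classification_report(y_te, pred))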
94. ROC curves for the test set of four models.
Published 2025: "…Objective: This study aimed to develop a risk prediction model for CI in CKD patients using machine learning algorithms, with the objective of enhancing risk prediction accuracy and facilitating early intervention. …"
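Result 94 shows ROC curves for four candidate models on one test set. A minimal sketch of how such a comparison is typically produced follows; the four model choices and the synthetic data are stand-ins, not the CKD study's actual models or cohort.

# ROC curves and AUC for several classifiers on a shared test split
# (stand-in models and data for illustration).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import roc_curve, roc_auc_score
import matplotlib.pyplot as plt

X, y = make_classification(n_samples=600, n_features=15, random_state=2)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=2)

models = {
    "LogReg": LogisticRegression(max_iter=1000),
    "RandomForest": RandomForestClassifier(random_state=2),
    "GradBoost": GradientBoostingClassifier(random_state=2),
    "KNN": KNeighborsClassifier(),
}
for name, model in models.items():
    prob = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    fpr, tpr, _ = roc_curve(y_te, prob)
    plt.plot(fpr, tpr, label=f"{name} (AUC={roc_auc_score(y_te, prob):.2f})")

plt.plot([0, 1], [0, 1], linestyle="--")   # chance line
plt.xlabel("False positive rate")
plt.ylabel("True positive rate")
plt.legend()
plt.show()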
95. The AD-PSO-Guided WOA LSTM framework.
Published 2025: "…Out of all the models, LSTM produced the best results. The AD-PSO-Guided WOA algorithm was used to adjust the hyperparameters for the LSTM model. …"
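Result 95 describes tuning LSTM hyperparameters with an AD-PSO-Guided WOA hybrid. That hybrid is not spelled out in the snippet, so the sketch below falls back to plain particle swarm optimisation over two hyperparameters (hidden units and learning rate) with a placeholder objective; in practice the objective would train the LSTM and return its validation loss.

# Plain PSO over two hyperparameters with a placeholder objective
# (the cited AD-PSO-Guided WOA hybrid is not reproduced here).
import numpy as np

rng = np.random.default_rng(3)
bounds = np.array([[16, 256],        # hidden units
                   [1e-4, 1e-2]])    # learning rate

def objective(params):
    units, lr = params
    # Placeholder loss surface standing in for the validation loss of a trained LSTM.
    return (units - 128) ** 2 / 1e4 + (np.log10(lr) + 3) ** 2

n_particles, n_iter = 12, 30
pos = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)]

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 2))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, bounds[:, 0], bounds[:, 1])
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best hyperparameters (units, lr):", gbest)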
96.

97. Classification baseline performance.
Published 2025: "…To overcome these limitations, this study introduces a comprehensive deep learning framework enhanced with the innovative bio-inspired Ocotillo Optimization Algorithm (OcOA), designed to improve the accuracy and efficiency of bone marrow cell classification. …"

98. Feature selection results.
Published 2025: "…To overcome these limitations, this study introduces a comprehensive deep learning framework enhanced with the innovative bio-inspired Ocotillo Optimization Algorithm (OcOA), designed to improve the accuracy and efficiency of bone marrow cell classification. …"

99. ANOVA test result.
Published 2025: "…To overcome these limitations, this study introduces a comprehensive deep learning framework enhanced with the innovative bio-inspired Ocotillo Optimization Algorithm (OcOA), designed to improve the accuracy and efficiency of bone marrow cell classification. …"

100. Summary of literature review.
Published 2025: "…To overcome these limitations, this study introduces a comprehensive deep learning framework enhanced with the innovative bio-inspired Ocotillo Optimization Algorithm (OcOA), designed to improve the accuracy and efficiency of bone marrow cell classification. …"
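Results 97-100 come from a 2025 bone-marrow-cell study built around the bio-inspired Ocotillo Optimization Algorithm (OcOA), whose details are not given in these snippets. The sketch below therefore shows only the generic ANOVA-based feature-scoring step suggested by the "ANOVA test result" and "Feature selection results" captions, applied to synthetic stand-in features.

# ANOVA F-test (f_classif) feature scoring and selection
# (generic filter step; not the paper's OcOA pipeline).
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

X, y = make_classification(n_samples=400, n_features=64, n_informative=10, random_state=4)

selector = SelectKBest(score_func=f_classif, k=10).fit(X, y)
print("ANOVA F-scores for the 10 retained features:")
for idx in selector.get_support(indices=True):
    print(f"  feature {idx:2d}  F = {selector.scores_[idx]:.2f}  p = {selector.pvalues_[idx]:.3g}")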