61.
62. Summary of existing CNN models.
    Published 2024: "…To achieve this, we focused the study on addressing the challenge of image noise, which impacts the performance of deep learning models. …"
63. Classification baseline performance.
    Published 2025: "…To overcome these limitations, this study introduces a comprehensive deep learning framework enhanced with the innovative bio-inspired Ocotillo Optimization Algorithm (OcOA), designed to improve the accuracy and efficiency of bone marrow cell classification. …"
64. Feature selection results. Published 2025, same source as item 63.
65. ANOVA test result. Published 2025, same source as item 63.
66. Summary of literature review. Published 2025, same source as item 63.
67. Results of KNN.
    Published 2024: "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"
68. Comparison of key techniques in their literature. Published 2024, same source as item 67.
69. SHAP analysis mean value. Published 2024, same source as item 67.
70. Proposed methodology. Published 2024, same source as item 67.
71. SHAP analysis. Published 2024, same source as item 67.
72. Dataset description. Published 2024, same source as item 67.
73. Results of Extra tree. Published 2024, same source as item 67.
74. Results of Decision tree. Published 2024, same source as item 67.
75. Results of Adaboost. Published 2024, same source as item 67.
76. Results of Random Forest. Published 2024, same source as item 67.
77. Before upsampling. Published 2024, same source as item 67.
78. Results of gradient boosting classifier. Published 2024, same source as item 67.
79. Results of Decision tree. Published 2024, same source as item 67.
80. Adaboost classifier results. Published 2024, same source as item 67.
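The 2024 source quoted above performs feature selection with a binary grey wolf optimization algorithm. The paper's own implementation is not shown here; the following is a minimal sketch of the general technique, assuming a sigmoid transfer function and a toy fitness function (`toy_fitness`, `INFORMATIVE`, and all parameter values are illustrative choices, not taken from the source):

```python
import math
import random

def binary_gwo(fitness, n_features, n_wolves=8, n_iter=30, seed=0):
    """Binary grey wolf optimizer: search 0/1 feature masks that maximise `fitness`."""
    rng = random.Random(seed)
    wolves = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(n_wolves)]
    for t in range(n_iter):
        a = 2.0 - 2.0 * t / n_iter  # control parameter, decays from 2 to 0
        # Snapshot the three best wolves (alpha, beta, delta) before updating.
        alpha, beta, delta = [w[:] for w in sorted(wolves, key=fitness, reverse=True)[:3]]
        for w in wolves:
            for j in range(n_features):
                x = 0.0
                for leader in (alpha, beta, delta):  # average pull toward the leaders
                    A = a * (2.0 * rng.random() - 1.0)
                    C = 2.0 * rng.random()
                    d = abs(C * leader[j] - w[j])    # "encircling" distance term
                    x += (leader[j] - A * d) / 3.0
                # Sigmoid transfer function maps the continuous update back to {0, 1}.
                prob = 1.0 / (1.0 + math.exp(-10.0 * (x - 0.5)))
                w[j] = 1 if rng.random() < prob else 0
    return max(wolves, key=fitness)

# Toy fitness: reward masks that keep features 0-2 and penalise extra features.
# In the paper's setting this would instead be a classifier's validation score.
INFORMATIVE = {0, 1, 2}

def toy_fitness(mask):
    hits = sum(mask[i] for i in INFORMATIVE)
    extras = sum(mask) - hits
    return hits - 0.5 * extras

best_mask = binary_gwo(toy_fitness, n_features=8)
```

In practice the fitness function would wrap a cross-validated classifier (e.g. KNN or a decision tree, as in the listed results) trained only on the features where the mask is 1, so the optimizer trades accuracy against feature-set size.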