Search alternatives:
feature optimization » resource optimization, feature elimination, structure optimization
based optimization » whale optimization
its feature » its features, sts features, omics feature
tasks based » task based, cases based
binary its » binary pairs
1. IRBMO vs. feature selection algorithm boxplot. Published 2025.
2. The Pseudo-Code of the IRBMO Algorithm. Published 2025.
3. IRBMO vs. meta-heuristic algorithms boxplot. Published 2025.
4.
5. Pseudo Code of RBMO. Published 2025.
6. P-value on CEC-2017 (Dim = 30). Published 2025.
7. Memory storage behavior. Published 2025.
8. Elite search behavior. Published 2025.
9. Description of the datasets. Published 2025.
10. S- and V-shaped transfer functions. Published 2025.
11. S- and V-Type transfer function diagrams. Published 2025.
12. Collaborative hunting behavior. Published 2025.
13. Friedman average rank sum test results. Published 2025.
14. IRBMO vs. variant comparison adaptation data. Published 2025.
Shared abstract excerpt for the 2025 results above: “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
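The excerpt above describes binarizing a continuous metaheuristic through a transfer function, and entries 10 and 11 refer to S- and V-shaped variants. The sketch below is only a generic illustration of that binarization step, not the 2025 paper's own IRBMO code: the sigmoid and |tanh| forms are common textbook choices, the function names (s_shaped, v_shaped, binarize_s, binarize_v) are mine, and the interpretation of the binary vector as a feature-selection mask is assumed.

import numpy as np

def s_shaped(x):
    """S-shaped (sigmoid) transfer function: maps a continuous position
    component to a selection probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def v_shaped(x):
    """V-shaped transfer function; |tanh(x)| is one common choice.
    V-shaped functions are usually paired with a bit-flip rule rather
    than a direct set-to-one rule."""
    return np.abs(np.tanh(x))

def binarize_s(position, rng):
    """S-shaped rule: set bit j to 1 with probability S(x_j)."""
    return (rng.random(position.shape) < s_shaped(position)).astype(int)

def binarize_v(position, current_bits, rng):
    """V-shaped rule: flip the current bit j with probability V(x_j)."""
    flip = rng.random(position.shape) < v_shaped(position)
    return np.where(flip, 1 - current_bits, current_bits)

# Illustration with an arbitrary continuous position vector (hypothetical values).
rng = np.random.default_rng(0)
x = rng.normal(scale=2.0, size=8)   # continuous position from some optimizer
mask_s = binarize_s(x, rng)         # 1 = feature kept, 0 = feature dropped
mask_v = binarize_v(x, mask_s, rng)
print("continuous position:", np.round(x, 2))
print("S-shaped mask:", mask_s)
print("V-shaped mask:", mask_v)

In a feature selection loop, the resulting mask would index the columns kept for the classifier whose accuracy (and selected-feature count) drives the fitness function.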
15. Algorithm for generating hyperparameters. Published 2024.
16. Results of machine learning algorithms. Published 2024.
17. ROC comparison of machine learning algorithms. Published 2024.
18. Feature selection process. Published 2024.
19. Best optimizer results of LightGBM. Published 2024.
20. Best optimizer results of AdaBoost. Published 2024.
Shared abstract excerpt for the 2024 results above: “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
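The 2024 entries describe two components: genetic-algorithm hyperparameter optimization and feature selection with a binary grey wolf optimizer. The excerpt does not give the paper's encoding, operators, or parameter ranges, so the following is only a minimal sketch of the genetic-algorithm half under my own assumptions: a hypothetical search space (SPACE), a scikit-learn RandomForestClassifier as the tuned model, and simple truncation selection, uniform crossover, and per-gene mutation.

import random
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical hyperparameter space; the paper's actual ranges are not given here.
SPACE = {
    "n_estimators": [50, 100, 200, 400],
    "max_depth": [3, 5, 10, None],
    "min_samples_split": [2, 5, 10],
}

def random_individual():
    """An individual is one hyperparameter setting drawn from SPACE."""
    return {k: random.choice(v) for k, v in SPACE.items()}

def fitness(params, X, y):
    """Mean cross-validated accuracy of a classifier built with these params."""
    clf = RandomForestClassifier(random_state=0, **params)
    return cross_val_score(clf, X, y, cv=3).mean()

def crossover(a, b):
    """Uniform crossover: each gene comes from one parent at random."""
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(ind, rate=0.2):
    """With probability `rate`, resample each gene from its allowed values."""
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in ind.items()}

def genetic_search(X, y, pop_size=8, generations=5):
    """Tiny generational GA: keep the top half, refill with mutated children."""
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=lambda p: fitness(p, X, y), reverse=True)
        parents = scored[: pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=lambda p: fitness(p, X, y))

if __name__ == "__main__":
    X, y = make_classification(n_samples=300, n_features=20, random_state=0)
    print("best hyperparameters:", genetic_search(X, y))

In the pipeline the abstract describes, a binary feature mask (for example one produced by a transfer-function rule like the earlier sketch, with grey-wolf position updates) would reduce the columns of X before a hyperparameter search of this kind runs.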