Search alternatives:
based optimization » whale optimization
model optimization » codon optimization, global optimization, wolf optimization
library based » laboratory based
binary based » linac based, binary mask
based based » based case, based basis, ranked based
-
101
Results of the gradient boosting classifier.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
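The excerpt repeated across entries 101-110 describes a wrapper setup: a genetic algorithm tunes classifier hyperparameters while a binary grey wolf optimizer (BGWO) selects features. Below is a minimal sketch of the BGWO feature-selection part only; the sigmoid transfer function, the error/subset-size weighting, the KNN evaluator and the demo dataset are common defaults in the BGWO literature, assumed here rather than taken from the cited paper.

```python
# Generic binary grey wolf optimizer for wrapper feature selection (sketch).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def fitness(mask, X, y, weight=0.99):
    """Weighted sum of cross-validated error and the fraction of kept features."""
    if mask.sum() == 0:                          # penalise empty feature subsets
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(5), X[:, mask == 1], y, cv=3).mean()
    return weight * (1.0 - acc) + (1.0 - weight) * mask.mean()

def bgwo(X, y, n_wolves=8, n_iter=20):
    """Binary grey wolf optimizer over feature masks (sigmoid transfer variant)."""
    d = X.shape[1]
    wolves = rng.integers(0, 2, size=(n_wolves, d))
    scores = np.array([fitness(wolf, X, y) for wolf in wolves])
    for t in range(n_iter):
        order = np.argsort(scores)
        alpha, beta, delta = wolves[order[:3]]   # three best wolves lead the pack
        a = 2.0 - 2.0 * t / n_iter               # exploration coefficient, 2 -> 0
        for i in range(n_wolves):
            step = np.zeros(d)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(d), rng.random(d)
                A, C = 2 * a * r1 - a, 2 * r2
                step += leader - A * np.abs(C * leader - wolves[i])
            step /= 3.0                          # average of the three leader pulls
            prob = 1.0 / (1.0 + np.exp(-10.0 * (step - 0.5)))  # map to [0, 1]
            wolves[i] = (rng.random(d) < prob).astype(int)      # re-binarise
            scores[i] = fitness(wolves[i], X, y)
    return wolves[np.argmin(scores)]

X, y = load_breast_cancer(return_X_y=True)
mask = bgwo(X, y)
print("selected", int(mask.sum()), "of", X.shape[1], "features")
```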
-
102
Results of the decision tree.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
103
AdaBoost classifier results.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
104
Results of LightGBM.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
105
Results of LightGBM.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
106
Feature selection process.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
107
Results of KNN.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
108
After upsampling.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
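Entry 108 refers to the class balance after upsampling. One common way to obtain such a balance is random oversampling of the minority class, sketched below; the excerpt does not state which resampling method the paper uses, so this is purely illustrative.

```python
# Random oversampling of the minority class (generic sketch).
import numpy as np
from sklearn.utils import resample

def upsample(X, y):
    """Duplicate minority-class rows (with replacement) until both classes match in size."""
    classes, counts = np.unique(y, return_counts=True)
    majority = classes[np.argmax(counts)]
    minority = classes[np.argmin(counts)]
    X_min_up, y_min_up = resample(X[y == minority], y[y == minority],
                                  replace=True, n_samples=int(counts.max()),
                                  random_state=0)
    X_bal = np.vstack([X[y == majority], X_min_up])
    y_bal = np.concatenate([y[y == majority], y_min_up])
    return X_bal, y_bal

# Example: 10 majority vs. 3 minority samples become 10 vs. 10 after upsampling.
X = np.arange(26).reshape(13, 2)
y = np.array([0] * 10 + [1] * 3)
X_bal, y_bal = upsample(X, y)
print(np.bincount(y_bal))   # [10 10]
```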
-
109
Results of Extra Trees.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
110
Gradient boosting classifier results.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
111
ROC curve for binary classification.
Published 2024: “…The model further showed superior results on binary classification compared with existing methods. …”
-
112
Confusion matrix for binary classification.
Published 2024: “…The model further showed superior results on binary classification compared with existing methods. …”
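Entries 111 and 112 show a ROC curve and a confusion matrix for binary classification. A short scikit-learn sketch of how both are typically computed follows; the logistic regression model and the demo dataset are placeholders, not the model evaluated in the cited work.

```python
# ROC curve points, AUC, and confusion matrix for a binary classifier (sketch).
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)).fit(X_tr, y_tr)
scores = clf.predict_proba(X_te)[:, 1]              # probability of the positive class

fpr, tpr, thresholds = roc_curve(y_te, scores)      # points of the ROC curve
print("AUC:", roc_auc_score(y_te, scores))
print(confusion_matrix(y_te, clf.predict(X_te)))    # rows = true class, cols = predicted
```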
-
117
The Pseudo-Code of the IRBMO Algorithm.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. The experiments are based on 12 medical datasets, and the results show that IRBMO achieves optimal overall performance in key metrics such as fitness value, classification accuracy and specificity. …”
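The excerpt names fitness value, classification accuracy and specificity as the key comparison metrics. Below is a small sketch of accuracy and specificity as they are usually computed from a binary confusion matrix; the IRBMO paper's exact evaluation protocol is not given in the snippet, so nothing here is taken from it.

```python
# Accuracy and specificity for binary 0/1 labels (generic definitions).
import numpy as np

def accuracy_and_specificity(y_true, y_pred):
    """accuracy = (TP + TN) / all, specificity = TN / (TN + FP)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    acc = np.mean(y_true == y_pred)
    return acc, tn / (tn + fp)

print(accuracy_and_specificity([0, 0, 1, 1, 0], [0, 1, 1, 1, 0]))  # (0.8, 0.666...)
```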
-
118
hiPRS algorithm process flow.
Published 2023: “…From this dataset we can compute the MI between each interaction and the outcome and (D) obtain a ranked list (I_δ) based on this metric. (E) Starting from the interaction at the top of I_δ, hiPRS constructs I_K, selecting K (where K is user-specified) terms through the greedy optimization of the ratio between MI (relevance) and a suitable measure of similarity for interactions (redundancy) (cf. …”
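The excerpt outlines the selection step of hiPRS: candidate interactions are ranked by mutual information with the outcome, and K terms are then picked greedily by the ratio of relevance (MI) to redundancy (similarity with terms already chosen). A rough sketch of that greedy loop follows; the Jaccard similarity and the binary encoding of interactions are illustrative assumptions, since the excerpt does not specify the paper's similarity measure.

```python
# Greedy relevance/redundancy selection of interaction terms (sketch).
import numpy as np
from sklearn.metrics import mutual_info_score

def jaccard(a, b):
    """Similarity between two binary interaction indicators (assumed measure)."""
    union = np.sum((a == 1) | (b == 1))
    return np.sum((a == 1) & (b == 1)) / union if union else 0.0

def greedy_select(candidates, outcome, K):
    """candidates: {name: binary vector}; pick K names by relevance/redundancy ratio."""
    relevance = {name: mutual_info_score(vec, outcome) for name, vec in candidates.items()}
    ranked = sorted(relevance, key=relevance.get, reverse=True)   # ranked list I_delta
    selected = [ranked[0]]                                        # start from the top term
    while len(selected) < min(K, len(ranked)):
        def ratio(name):
            redundancy = max(jaccard(candidates[name], candidates[s]) for s in selected)
            return relevance[name] / (redundancy + 1e-9)          # relevance over redundancy
        remaining = [n for n in ranked if n not in selected]
        selected.append(max(remaining, key=ratio))
    return selected

# Toy usage: three candidate interaction terms, binary outcome for 8 subjects.
outcome = np.array([1, 1, 1, 0, 0, 0, 1, 0])
candidates = {
    "A*B": np.array([1, 1, 1, 0, 0, 0, 1, 0]),   # highly relevant
    "A*C": np.array([1, 1, 1, 0, 0, 0, 1, 1]),   # relevant but largely redundant with A*B
    "D*E": np.array([0, 1, 0, 1, 0, 1, 1, 0]),   # carries no information about the outcome
}
print(greedy_select(candidates, outcome, K=2))
```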
-
119
Predictive Analysis of Mushroom Toxicity Based Exclusively on Their Natural Habitat.
Published 2025: “…Model evaluation was based on accuracy metrics and qualitative analysis of the confusion matrix. …”
-
120
IRBMO vs. meta-heuristic algorithms boxplot.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. The experiments are based on 12 medical datasets, and the results show that IRBMO achieves optimal overall performance in key metrics such as fitness value, classification accuracy and specificity. …”