Search alternatives:
model optimization » codon optimization (expand search), global optimization (expand search), based optimization (expand search)
phase process » phase proteins (expand search), whole process (expand search), phase protein (expand search)
binary base » binary mask (expand search), ciliary base (expand search), binary image (expand search)
base model » based model (expand search), based models (expand search), game model (expand search)
-
61
Before upsampling.
Published in 2024: "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"
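The excerpt pairs a genetic algorithm for hyperparameter tuning with binary grey wolf optimization (bGWO) for feature selection. As an illustration only, here is a minimal bGWO sketch against a toy objective; the informative-feature set, the 0.99/0.01 fitness weights, the sigmoid transfer function, and the population settings are all assumptions made for the example, not values from the cited paper.

```python
import math
import random

random.seed(0)

N_FEATURES = 10
INFORMATIVE = {0, 3, 7}  # toy ground truth: only these features matter


def fitness(mask):
    """Toy objective: normalized Hamming distance to the ideal mask plus a
    small penalty on subset size (the 0.99/0.01 weights are assumptions)."""
    selected = {i for i, bit in enumerate(mask) if bit}
    if not selected:
        return 1.0  # an empty subset is useless
    error = (len(INFORMATIVE - selected) + len(selected - INFORMATIVE)) / N_FEATURES
    return 0.99 * error + 0.01 * len(selected) / N_FEATURES


def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-max(min(x, 60.0), -60.0)))


def binary_gwo(n_wolves=8, n_iter=50):
    wolves = [[random.randint(0, 1) for _ in range(N_FEATURES)]
              for _ in range(n_wolves)]
    for t in range(n_iter):
        wolves.sort(key=fitness)            # the three fittest wolves lead
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        a = 2.0 - 2.0 * t / n_iter          # linearly decreasing coefficient
        for wolf in wolves[3:]:
            for j in range(N_FEATURES):
                pos = 0.0
                for leader in (alpha, beta, delta):
                    r1, r2 = random.random(), random.random()
                    A = 2.0 * a * r1 - a
                    C = 2.0 * r2
                    D = abs(C * leader[j] - wolf[j])
                    pos += (leader[j] - A * D) / 3.0
                # sigmoid transfer maps the continuous position back to {0, 1}
                wolf[j] = 1 if random.random() < sigmoid(10.0 * (pos - 0.5)) else 0
    return min(wolves, key=fitness)


best_mask = binary_gwo()
```

In a real wrapper-style pipeline the toy `fitness` would be replaced by cross-validated classifier error on the selected columns, which is what makes this family of methods expensive.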
-
62
Results of gradient boosting classifier.
Published in 2024 (same excerpt as entry 61).
-
63
Results of Decision tree.
Published in 2024 (same excerpt as entry 61).
-
64
AdaBoost classifier results.
Published in 2024 (same excerpt as entry 61).
-
65
Results of LightGBM.
Published in 2024 (same excerpt as entry 61).
-
66
Results of LightGBM.
Published in 2024 (same excerpt as entry 61).
-
67
Feature selection process.
Published in 2024 (same excerpt as entry 61).
-
68
Results of KNN.
Published in 2024 (same excerpt as entry 61).
-
69
After upsampling.
Published in 2024 (same excerpt as entry 61).
-
70
Results of Extra Trees.
Published in 2024 (same excerpt as entry 61).
-
71
Gradient boosting classifier results.
Published in 2024 (same excerpt as entry 61).
-
75
The Pseudo-Code of the IRBMO Algorithm.
Published in 2025: "…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. The experiments are based on 12 medical datasets, and the results show that IRBMO achieves optimal overall performance in key metrics such as fitness value, classification accuracy and specificity. …"
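The 2025 excerpt evaluates candidates on fitness value, classification accuracy, and specificity. The latter two come straight from the binary confusion matrix; a small self-contained helper (the function names are mine, not from the paper):

```python
def confusion_counts(y_true, y_pred):
    """Count TP/TN/FP/FN for binary labels, treating 1 as the positive class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn


def accuracy(y_true, y_pred):
    """Fraction of all predictions that are correct."""
    tp, tn, fp, fn = confusion_counts(y_true, y_pred)
    return (tp + tn) / (tp + tn + fp + fn)


def specificity(y_true, y_pred):
    """True-negative rate: how often actual negatives are predicted negative."""
    tp, tn, fp, fn = confusion_counts(y_true, y_pred)
    return tn / (tn + fp)
```

Reporting specificity alongside accuracy matters on imbalanced medical datasets, where a majority-class predictor can score high accuracy while recognising no negatives (or positives) at all.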
-
76
hiPRS algorithm process flow.
Published in 2023: "…From this dataset we can compute the MI between each interaction and the outcome and (D) obtain a ranked list (I_δ) based on this metric. (E) Starting from the interaction at the top of I_δ, hiPRS constructs I_K, selecting K (where K is user-specified) terms through the greedy optimization of the ratio between MI (relevance) and a suitable measure of similarity for interactions (redundancy) (cf. …"
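The hiPRS excerpt describes greedy selection by the ratio of MI-based relevance to redundancy against the already-chosen set. A schematic version, with interactions represented as index sets, a pre-supplied relevance score standing in for MI, and Jaccard similarity standing in for the paper's redundancy measure (all three stand-ins are assumptions for the sketch):

```python
def jaccard(a, b):
    """Similarity of two interactions represented as sets of loci."""
    return len(a & b) / len(a | b) if a | b else 0.0


def greedy_relevance_redundancy(candidates, relevance, K, eps=1e-9):
    """Pick K items: the most relevant first, then repeatedly the candidate
    maximising relevance / (eps + mean similarity to the selected set)."""
    ranked = sorted(candidates, key=lambda c: relevance[c], reverse=True)
    selected, remaining = [ranked[0]], ranked[1:]
    while len(selected) < K and remaining:
        def score(c):
            redundancy = sum(jaccard(c, s) for s in selected) / len(selected)
            return relevance[c] / (eps + redundancy)
        best = max(remaining, key=score)
        remaining.remove(best)
        selected.append(best)
    return selected


interactions = [frozenset({1, 2}), frozenset({1, 2, 3}), frozenset({7, 8})]
rel = {interactions[0]: 0.9, interactions[1]: 0.8, interactions[2]: 0.5}
picked = greedy_relevance_redundancy(interactions, rel, K=2)
```

Although {1, 2, 3} is individually more relevant than {7, 8}, its overlap with the first pick inflates the redundancy denominator, so the second pick goes to the dissimilar interaction; that diversity pressure is the point of the relevance/redundancy ratio.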
-
77
IRBMO vs. meta-heuristic algorithms boxplot.
Published in 2025 (same excerpt as entry 75).
-
78
IRBMO vs. feature selection algorithm boxplot.
Published in 2025 (same excerpt as entry 75).
-
79
Flowchart scheme of the ML-based model.
Published in 2024: "…I) Testing data consisting of 20% of the entire dataset. J) Optimization of hyperparameter tuning. K) Algorithm selection from all models. …"
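Steps I–K of that flowchart (80/20 hold-out, hyperparameter tuning, model selection) can be sketched end to end with a deliberately tiny one-parameter "model"; the threshold classifier, the grid, and the seed are all invented for the illustration and are not from the cited paper.

```python
import random

random.seed(1)

# Toy labelled data: the true rule is y = 1 iff x > 0.6.
data = [(x, 1 if x > 0.6 else 0) for x in (random.random() for _ in range(100))]
random.shuffle(data)

# Step I: hold out 20% of the data for testing.
split = int(0.8 * len(data))
train, test = data[:split], data[split:]


def accuracy(threshold, rows):
    """Accuracy of the one-parameter model 'predict 1 iff x > threshold'."""
    return sum((1 if x > threshold else 0) == y for x, y in rows) / len(rows)


# Step J: "hyperparameter tuning" = grid search on the training split only.
grid = [i / 20 for i in range(20)]
best_threshold = max(grid, key=lambda t: accuracy(t, train))

# Step K: the selected model is scored once on the untouched test split.
test_accuracy = accuracy(best_threshold, test)
```

In practice steps J and K map to something like scikit-learn's `GridSearchCV` fitted on the training split followed by a single `score` call on the held-out split; tuning against the test set instead would leak information into model selection.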
-
80