Search alternatives:
feature optimization » resource optimization, feature elimination, structure optimization
process optimization » model optimization
based process » based processes, based probes, based proteins
binary based » library based, linac based, binary mask
a feature » _ feature, _ features, each feature
binary a » binary _, binary b, hilary a
-
41
Best optimizer result for Adaboost classifier.
Published 2024 “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
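The snippet above mentions genetic-algorithm hyperparameter tuning for the classifiers but gives no implementation details. The following is a minimal sketch, assuming a simple truncation-selection GA over AdaBoost's n_estimators and learning_rate on a synthetic dataset; the operators, bounds, and scoring are illustrative assumptions, not the cited paper's configuration.

```python
# Minimal sketch of genetic-algorithm hyperparameter tuning for AdaBoost.
# The GA operators, bounds, and scoring here are illustrative assumptions.
import random
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=400, n_features=20, random_state=0)

def fitness(genes):
    """Mean CV accuracy of AdaBoost for one (n_estimators, learning_rate) candidate."""
    n_estimators, learning_rate = genes
    clf = AdaBoostClassifier(n_estimators=int(n_estimators),
                             learning_rate=learning_rate, random_state=0)
    return cross_val_score(clf, X, y, cv=3).mean()

def random_individual():
    return [random.randint(10, 200), random.uniform(0.01, 2.0)]

def crossover(a, b):
    # Uniform crossover: pick each gene from one of the two parents.
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(genes, rate=0.3):
    if random.random() < rate:
        genes[0] = random.randint(10, 200)
    if random.random() < rate:
        genes[1] = random.uniform(0.01, 2.0)
    return genes

population = [random_individual() for _ in range(10)]
for generation in range(5):
    ranked = sorted(population, key=fitness, reverse=True)
    parents = ranked[:4]                      # simple truncation selection
    population = parents + [mutate(crossover(*random.sample(parents, 2)))
                            for _ in range(6)]

best = max(population, key=fitness)
print("best (n_estimators, learning_rate):", best,
      "accuracy:", round(fitness(best), 3))
```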
-
42
Best optimizer results for random forest.
Published 2024 “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
43
Best optimizer results of Decision tree.
Published 2024 “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
44
Best optimizer results of Extra tree.
Published 2024 “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
45
Best optimizer results of Random Forest.
Published 2024 “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
46
Best optimizer result for Extra tree.
Published 2024 “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
47
Proposed Algorithm.
Published 2025 “…To enhance the offloading decision-making process, the algorithm incorporates the Newton-Raphson method for fast and efficient optimization of the computation rate under energy constraints. …”
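The entry above credits the Newton-Raphson method with fast optimization of the computation rate under energy constraints but does not state the objective. Below is a minimal sketch of the Newton-Raphson iteration applied to a toy rate-minus-energy objective; the function and constants are assumptions for illustration only, not the cited paper's model.

```python
# Generic Newton-Raphson iteration, shown on a toy concave rate function.
# The actual rate/energy model of the cited paper is not given in the snippet,
# so the objective below is purely illustrative.

def newton_raphson(df, d2f, x0, tol=1e-8, max_iter=50):
    """Find a stationary point of an objective by solving df(x) = 0."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Toy example: maximise r(p) = log(1 + 4*p) - 0.5*p  (rate minus energy penalty)
df  = lambda p: 4.0 / (1.0 + 4.0 * p) - 0.5   # first derivative
d2f = lambda p: -16.0 / (1.0 + 4.0 * p) ** 2  # second derivative

p_star = newton_raphson(df, d2f, x0=1.0)
print("optimal power p* ≈", round(p_star, 4))  # analytic answer: p* = 7/4
```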
-
48
IRBMO vs. meta-heuristic algorithms boxplot.
Published 2025 “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …”
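The boxplot entry compares IRBMO with nine binary optimizers over repeated runs. As a rough sketch of how such a comparison is typically plotted, the code below draws per-run fitness distributions with matplotlib; the algorithm labels and fitness values are synthetic placeholders, not results from the cited paper.

```python
# Sketch of the kind of boxplot used to compare optimizers across repeated runs.
# The fitness values below are randomly generated stand-ins.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
algorithms = ["IRBMO", "BPSO", "BGWO", "BGA"]            # illustrative labels
runs = {name: rng.normal(loc=0.1 * i, scale=0.02, size=30)
        for i, name in enumerate(algorithms)}            # 30 runs per algorithm

plt.boxplot([runs[name] for name in algorithms], labels=algorithms)
plt.ylabel("best fitness over 30 runs (lower is better)")
plt.title("Per-run fitness distribution by optimizer")
plt.show()
```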
-
49
Feature selection results.
Published 2025 “…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …”
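The abstract describes coupling binary feature selection with metaheuristic optimization in a single optimization process. A minimal sketch of the conventional wrapper formulation is given below, assuming the usual weighted error-plus-feature-ratio objective and a sigmoid transfer function for binarizing continuous positions; these choices are common defaults, not taken from the cited paper.

```python
# Minimal sketch of the wrapper objective commonly used in binary metaheuristic
# feature selection: a weighted sum of classification error and the fraction of
# selected features, plus a sigmoid transfer function for binarization.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=30, n_informative=8,
                           random_state=0)

def fitness(mask, alpha=0.99):
    """Lower is better: alpha * CV error + (1 - alpha) * feature ratio."""
    if mask.sum() == 0:                     # guard against the empty subset
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask.astype(bool)], y, cv=3).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * mask.mean()

def binarize(position, rng):
    """S-shaped transfer: map a continuous search position to a 0/1 mask."""
    prob = 1.0 / (1.0 + np.exp(-position))
    return (rng.random(position.shape) < prob).astype(int)

rng = np.random.default_rng(0)
mask = binarize(rng.normal(size=X.shape[1]), rng)
print("selected features:", int(mask.sum()), "fitness:", round(fitness(mask), 4))
```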
-
50
Feature selection metrics and their definitions.
Published 2025 “…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …”
-
51
ANOVA test for feature selection.
Published 2025 “…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …”
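The entry above refers to an ANOVA test over feature-selection results. The sketch below runs a one-way ANOVA across per-run accuracies of three hypothetical methods with scipy.stats.f_oneway; the accuracy samples are synthetic and only the test itself mirrors what such a table reports.

```python
# Sketch of a one-way ANOVA across the per-run accuracies of several
# feature-selection methods. The accuracy samples are synthetic.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
method_a = rng.normal(0.91, 0.01, size=30)   # 30 runs per method
method_b = rng.normal(0.92, 0.01, size=30)
method_c = rng.normal(0.90, 0.01, size=30)

f_stat, p_value = f_oneway(method_a, method_b, method_c)
print(f"F = {f_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("At least one method's mean accuracy differs significantly.")
```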
-
52
Comparisons between ADAM and NADAM optimizers.
Published 2025 “…To enhance the offloading decision-making process, the algorithm incorporates the Newton-Raphson method for fast and efficient optimization of the computation rate under energy constraints. …”
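The entry above compares the ADAM and NADAM optimizers, but the snippet does not spell out their update rules. The sketch below contrasts the standard Adam update with the commonly used simplified Nadam variant (Nesterov momentum folded into Adam) on a one-dimensional quadratic; the step size and decay rates are illustrative defaults, not the cited paper's settings.

```python
# Side-by-side sketch of the Adam and (simplified) Nadam parameter updates on a
# 1-D quadratic. The Nadam form below is the common simplification that folds
# Nesterov momentum into Adam.
import math

def step(theta, g, m, v, t, lr=0.05, b1=0.9, b2=0.999, eps=1e-8, nesterov=False):
    m = b1 * m + (1 - b1) * g                 # first-moment estimate
    v = b2 * v + (1 - b2) * g * g             # second-moment estimate
    m_hat = m / (1 - b1 ** t)                 # bias correction
    v_hat = v / (1 - b2 ** t)
    if nesterov:                              # Nadam: look-ahead on the momentum term
        m_hat = b1 * m_hat + (1 - b1) * g / (1 - b1 ** t)
    theta -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

grad = lambda x: 2 * (x - 3.0)                # minimise (x - 3)^2

for nesterov, name in [(False, "Adam"), (True, "Nadam")]:
    x, m, v = 0.0, 0.0, 0.0
    for t in range(1, 201):
        x, m, v = step(x, grad(x), m, v, t, nesterov=nesterov)
    print(f"{name}: x ≈ {x:.4f} (target 3.0)")
```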
-
53
Parameter settings of the comparison algorithms.
Published 2024 “…In this paper, we present an improved mountain gazelle optimizer (IMGO) based on the newly proposed mountain gazelle optimizer (MGO) and design a binary version of IMGO (BIMGO) to solve the feature selection problem for medical data. …”
-
54
Wilcoxon test results for feature selection.
Published 2025 “…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …”
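The entry above reports Wilcoxon test results for feature selection. A minimal sketch of the paired Wilcoxon signed-rank test usually behind such a table is shown below, using scipy.stats.wilcoxon on made-up per-dataset accuracy pairs.

```python
# Sketch of the paired Wilcoxon signed-rank test typically used to compare two
# feature-selection methods' accuracies over the same datasets.
# The accuracy pairs are made up for illustration.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(2)
acc_proposed = rng.normal(0.92, 0.02, size=15)            # one value per dataset
acc_baseline = acc_proposed - rng.normal(0.01, 0.005, size=15)

stat, p_value = wilcoxon(acc_proposed, acc_baseline)
print(f"W = {stat:.1f}, p = {p_value:.4f}")
print("significant at 0.05" if p_value < 0.05 else "not significant at 0.05")
```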
-
55
-
56
-
57
-
58
Process flow diagram of CBFD.
Published 2024 “…The results demonstrate that CBFD achieves an average precision of 0.97 for the test image, outperforming Superpoint, Directional Intensified Tertiary Filtering (DITF), Binary Robust Independent Elementary Features (BRIEF), Binary Robust Invariant Scalable Keypoints (BRISK), Speeded Up Robust Features (SURF), and Scale Invariant Feature Transform (SIFT), which achieve scores of 0.95, 0.92, 0.72, 0.66, 0.63 and 0.50 respectively. …”
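The entry above ranks CBFD against SIFT, SURF, BRIEF, and BRISK by average precision. CBFD and DITF are not available as off-the-shelf OpenCV classes, so the sketch below only illustrates the general comparison workflow with SIFT and BRISK: detect and describe, match with Lowe's ratio test, and report a crude match-precision proxy. The image paths are placeholders and the metric is illustrative, not the paper's evaluation protocol.

```python
# Sketch of how off-the-shelf descriptors are commonly compared on an image
# pair: detect, ratio-test matching, and the fraction of surviving matches as
# a rough precision proxy. img1.png/img2.png are placeholder paths.
import cv2

img1 = cv2.imread("img1.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("img2.png", cv2.IMREAD_GRAYSCALE)

detectors = {"SIFT": cv2.SIFT_create(), "BRISK": cv2.BRISK_create()}

for name, det in detectors.items():
    kp1, des1 = det.detectAndCompute(img1, None)
    kp2, des2 = det.detectAndCompute(img2, None)
    norm = cv2.NORM_L2 if name == "SIFT" else cv2.NORM_HAMMING
    matches = cv2.BFMatcher(norm).knnMatch(des1, des2, k=2)
    good = [p[0] for p in matches
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]  # Lowe ratio
    print(f"{name}: {len(good)}/{len(matches)} matches pass the ratio test")
```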
-
59
Comparison in terms of the selected features.
Published 2024 “…In this paper, we present an improved mountain gazelle optimizer (IMGO) based on the newly proposed mountain gazelle optimizer (MGO) and design a binary version of IMGO (BIMGO) to solve the feature selection problem for medical data. …”
-
60
Results of KNN.
Published 2024 “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”