41. Best optimizer results of KNN.
Published 2024: “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using the binary grey wolf optimization algorithm. …”
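The excerpt above pairs a genetic-algorithm hyperparameter search with binary grey wolf optimization (BGWO) for feature selection. A minimal pure-Python sketch of the BGWO idea follows; the toy fitness function, the `RELEVANT` feature set, and all parameter values are invented for illustration and are not taken from the paper:

```python
import math
import random

random.seed(0)

N_FEATURES = 10
RELEVANT = {1, 4, 7}  # hypothetical informative feature indices (assumption)

def fitness(mask):
    """Toy wrapper fitness: count of missed relevant features plus a small
    penalty per selected feature (lower is better)."""
    missed = sum(1 for f in RELEVANT if mask[f] == 0)
    return missed + 0.01 * sum(mask)

def sigmoid(x):
    # clamped to avoid math.exp overflow on extreme positions
    return 1.0 / (1.0 + math.exp(-max(-60.0, min(60.0, x))))

def to_mask(position):
    """S-shaped transfer: each continuous coordinate becomes a bit."""
    return [1 if sigmoid(x) > random.random() else 0 for x in position]

def binary_gwo(n_wolves=8, n_iter=50):
    pos = [[random.uniform(-1, 1) for _ in range(N_FEATURES)]
           for _ in range(n_wolves)]
    masks = [to_mask(p) for p in pos]
    leaders = sorted(masks, key=fitness)[:3]      # alpha, beta, delta

    for t in range(n_iter):
        a = 2 - 2 * t / n_iter                    # exploration decays to 0
        for i in range(n_wolves):
            for d in range(N_FEATURES):
                # standard GWO step: average pull toward the three leaders
                x = 0.0
                for lead in leaders:
                    r1, r2 = random.random(), random.random()
                    A, C = a * (2 * r1 - 1), 2 * r2
                    x += lead[d] - A * abs(C * lead[d] - pos[i][d])
                pos[i][d] = x / 3
            masks[i] = to_mask(pos[i])
        leaders = sorted(leaders + masks, key=fitness)[:3]
    return leaders[0]

mask = binary_gwo()
```

The transfer function is what turns the originally continuous grey wolf update into a bit-string selector; the leaders are kept elitist so the best mask never degrades.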
42. ROC curve for binary classification.
Published 2024: “…Specifically, an image enhancement algorithm based on histogram equalization and bilateral filtering techniques was deployed to reduce noise and enhance the quality of the images. …”
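The excerpt mentions histogram equalization as the enhancement step. A self-contained sketch of classic histogram equalization on a tiny grayscale array (the 2x3 low-contrast image is made up for the example):

```python
def equalize_histogram(img, levels=256):
    """Classic histogram equalization: remap each gray level through the
    normalized cumulative distribution of the image's histogram."""
    flat = [p for row in img for p in row]
    n = len(flat)
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    # cumulative distribution function
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)

    def remap(p):
        if n == cdf_min:          # constant image: nothing to stretch
            return p
        return round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1))

    return [[remap(p) for p in row] for row in img]

# low-contrast 2x3 image squeezed into gray levels [100, 103]
img = [[100, 101, 102], [100, 103, 103]]
out = equalize_histogram(img)   # stretched across the full [0, 255] range
```

Bilateral filtering (the paper's other step) would normally follow as an edge-preserving denoiser; it is omitted here to keep the sketch short.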
43. Confusion matrix for binary classification.
Published 2024: “…Specifically, an image enhancement algorithm based on histogram equalization and bilateral filtering techniques was deployed to reduce noise and enhance the quality of the images. …”
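For the binary confusion matrix captioned above, the four counts and the metrics usually derived from them can be computed directly; the label vectors here are invented toy data:

```python
def binary_confusion_matrix(y_true, y_pred):
    """Return (TP, FP, FN, TN) counts for labels in {0, 1}."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
tp, fp, fn, tn = binary_confusion_matrix(y_true, y_pred)

accuracy    = (tp + tn) / len(y_true)
sensitivity = tp / (tp + fn)          # recall / true positive rate
specificity = tn / (tn + fp)          # true negative rate
```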
44. Best optimizer results of Decision tree.
Published 2024: “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using the binary grey wolf optimization algorithm. …”
45. Best optimizer result for AdaBoost classifier.
Published 2024: “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using the binary grey wolf optimization algorithm. …”
46. Best optimizer results for random forest.
Published 2024: “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using the binary grey wolf optimization algorithm. …”
47. Best optimizer results of Decision tree.
Published 2024: “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using the binary grey wolf optimization algorithm. …”
48. Best optimizer results of Extra tree.
Published 2024: “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using the binary grey wolf optimization algorithm. …”
49. Best optimizer results of Random Forest.
Published 2024: “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using the binary grey wolf optimization algorithm. …”
50. Best optimizer result for Extra tree.
Published 2024: “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using the binary grey wolf optimization algorithm. …”
51. Parameter settings of the comparison algorithms.
Published 2024: “…In this paper, we present an improved mountain gazelle optimizer (IMGO) based on the newly proposed mountain gazelle optimizer (MGO) and design a binary version of IMGO (BIMGO) to solve the feature selection problem for medical data. …”
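Deriving a binary optimizer from a continuous one, as the BIMGO entry describes, is conventionally done with a transfer function that maps each continuous coordinate to a bit probability. A sketch of the two standard families (the specific functions BIMGO uses are not stated in the excerpt, so these are the generic choices):

```python
import math
import random

def s_transfer(x):
    """S-shaped (sigmoid) transfer: probability that the bit is set to 1."""
    return 1.0 / (1.0 + math.exp(-x))

def v_transfer(x):
    """V-shaped transfer: probability of *flipping* the current bit."""
    return abs(math.tanh(x))

def binarize(position, rng=random.random):
    """Map a continuous position vector to a binary feature mask."""
    return [1 if s_transfer(x) > rng() else 0 for x in position]

# a strongly positive coordinate is almost always mapped to 1,
# a strongly negative one to 0 (rng fixed at 0.5 for determinism)
bits = binarize([8.0, -8.0, 8.0], rng=lambda: 0.5)
```

S-shaped transfers set bits directly from position; V-shaped ones flip bits, which tends to preserve more of the parent's structure between iterations.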
52. The Pseudo-Code of the IRBMO Algorithm.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. The experiments are based on 12 medical datasets, and the results show that IRBMO achieves optimal overall performance in key metrics such as fitness value, classification accuracy and specificity. …”
53. QSAR model for predicting neuraminidase inhibitors of influenza A viruses (H1N1) based on adaptive grasshopper optimization algorithm
Published 2020: “…Obtaining a reliable QSAR model with few descriptors is an essential procedure in chemometrics. The binary grasshopper optimization algorithm (BGOA) is a new meta-heuristic optimization algorithm, which has been used successfully to perform feature selection. …”
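Wrapper-style feature selectors like BGOA typically minimize a weighted sum of the model's error rate and the fraction of descriptors retained, which is how "a reliable QSAR model with few descriptors" becomes a single objective. A sketch of that common fitness form (the α weight and example numbers are illustrative, not from the paper):

```python
def fs_fitness(error_rate, n_selected, n_total, alpha=0.99):
    """Common wrapper fitness for binary metaheuristic feature selection:
    weighted sum of classification error and the fraction of features
    kept; both terms are minimized."""
    return alpha * error_rate + (1 - alpha) * n_selected / n_total

# same error rate, fewer descriptors -> lower (better) fitness
f_small = fs_fitness(error_rate=0.10, n_selected=5,  n_total=50)
f_large = fs_fitness(error_rate=0.10, n_selected=40, n_total=50)
```

With α close to 1, accuracy dominates and the feature-count term only breaks ties between similarly accurate subsets.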
54. IRBMO vs. meta-heuristic algorithms boxplot.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. The experiments are based on 12 medical datasets, and the results show that IRBMO achieves optimal overall performance in key metrics such as fitness value, classification accuracy and specificity. …”
55. IRBMO vs. feature selection algorithm boxplot.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. The experiments are based on 12 medical datasets, and the results show that IRBMO achieves optimal overall performance in key metrics such as fitness value, classification accuracy and specificity. …”
56. Wilcoxon test results for feature selection.
Published 2025: “…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …”
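The Wilcoxon test captioned above is the standard way these papers compare two feature-selection methods over paired per-dataset results. A sketch of the signed-rank statistic itself (the five paired accuracy values are made up):

```python
def wilcoxon_signed_rank(a, b):
    """Wilcoxon signed-rank statistic W = min(W+, W-) for paired samples.
    Zero differences are discarded; tied magnitudes get average ranks."""
    diffs = [x - y for x, y in zip(a, b) if x != y]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        while (j + 1 < len(order)
               and abs(diffs[order[j + 1]]) == abs(diffs[order[i]])):
            j += 1
        avg = (i + j) / 2 + 1            # average rank of the tied block
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus  = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_plus, w_minus)

# accuracies of two methods on five datasets (illustrative values)
w = wilcoxon_signed_rank([0.91, 0.88, 0.93, 0.86, 0.95],
                         [0.90, 0.90, 0.90, 0.90, 0.90])
```

In practice the statistic is then compared against a critical value (or converted to a p-value) to decide whether one method's wins are significant rather than noise.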
57. Feature selection metrics and their definitions.
Published 2025: “…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …”
58. Statistical summary of all models.
Published 2025: “…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …”
59. Feature selection results.
Published 2025: “…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …”
60. ANOVA test for feature selection.
Published 2025: “…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …”
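The ANOVA test in this last caption generalizes the pairwise comparison to several algorithms at once. A sketch of the one-way ANOVA F statistic on toy score groups (the numbers are invented for the worked example):

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group mean square divided by
    within-group mean square."""
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2
                     for g in groups)
    ss_within = sum((v - sum(g) / len(g)) ** 2
                    for g in groups for v in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# two toy groups of scores; well-separated means give a large F
f = one_way_anova_f([[1, 2, 3], [4, 5, 6]])
```

A large F relative to the F distribution's critical value indicates the group means (here, the algorithms' average scores) genuinely differ.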