Search alternatives:
well optimization » wolf optimization, whale optimization, field optimization
binary based » library based, linac based, binary mask
based well » based cell, based web, based all
-
61
IRBMO vs. feature selection algorithm boxplot.
Published 2025 “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. The experiments are based on 12 medical datasets, and the results show that IRBMO achieves optimal overall performance in key metrics such as fitness value, classification accuracy and specificity. …”
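The snippet above compares algorithms on fitness value, classification accuracy, and specificity. The paper's exact fitness function is not quoted here; a minimal sketch of the wrapper-style fitness commonly used in binary feature selection, assuming a k-NN classifier, a 70/30 split, and a weight alpha = 0.99 (all assumptions), is:

```python
# Hedged sketch: a common wrapper fitness for binary feature selection.
# The IRBMO paper's exact formulation is not given in the snippet above;
# alpha, the k-NN classifier, and the train/test split are assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

def fitness(mask, X, y, alpha=0.99):
    """fitness = alpha * error_rate + (1 - alpha) * (#selected / #total).
    X is a numpy array (samples x features), mask a 0/1 feature vector."""
    mask = np.asarray(mask, dtype=bool)
    if not mask.any():                       # penalize empty feature subsets
        return 1.0
    Xs = X[:, mask]
    Xtr, Xte, ytr, yte = train_test_split(Xs, y, test_size=0.3, random_state=0)
    clf = KNeighborsClassifier(n_neighbors=5).fit(Xtr, ytr)
    error = 1.0 - clf.score(Xte, yte)
    return alpha * error + (1 - alpha) * mask.sum() / X.shape[1]
```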
-
62
DataSheet_1_Raman Spectroscopic Differentiation of Streptococcus pneumoniae From Other Streptococci Using Laboratory Strains and Clinical Isolates.pdf
Published 2022 “…Improvement of the classification rate is expected with optimized model parameters and algorithms as well as with a larger spectral database for training.…”
-
63
-
64
Comparison in terms of the sensitivity.
Published 2024 “…In this paper, we present an improved mountain gazelle optimizer (IMGO) based on the newly proposed mountain gazelle optimizer (MGO) and design a binary version of IMGO (BIMGO) to solve the feature selection problem for medical data. …”
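Sensitivity and specificity, the comparison metrics named in these entries, come from the binary confusion matrix. A minimal sketch with scikit-learn; the labels below are placeholders, not data from either paper:

```python
# Hedged sketch: sensitivity and specificity for a binary classifier.
# y_true / y_pred are placeholders, not data from the papers above.
from sklearn.metrics import confusion_matrix

y_true = [0, 0, 1, 1, 1, 0, 1, 0]
y_pred = [0, 1, 1, 1, 0, 0, 1, 0]

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)   # true positive rate (recall)
specificity = tn / (tn + fp)   # true negative rate
print(f"sensitivity={sensitivity:.3f}, specificity={specificity:.3f}")
```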
-
65
Parameter sensitivity of BIMGO.
Published 2024 “…In this paper, we present an improved mountain gazelle optimizer (IMGO) based on the newly proposed mountain gazelle optimizer (MGO) and design a binary version of IMGO (BIMGO) to solve the feature selection problem for medical data. …”
-
66
Details of the medical datasets.
Published 2024 “…In this paper, we present an improved mountain gazelle optimizer (IMGO) based on the newly proposed mountain gazelle optimizer (MGO) and design a binary version of IMGO (BIMGO) to solve the feature selection problem for medical data. …”
-
67
The flowchart of IMGO.
Published 2024 “…In this paper, we present an improved mountain gazelle optimizer (IMGO) based on the newly proposed mountain gazelle optimizer (MGO) and design a binary version of IMGO (BIMGO) to solve the feature selection problem for medical data. …”
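The snippet describes designing a binary version (BIMGO) of a continuous optimizer for feature selection. The binarization scheme used in the paper is not stated in this snippet; a common generic approach applies an S-shaped (sigmoid) transfer function to each dimension of the continuous position, sketched below:

```python
# Hedged sketch: S-shaped transfer-function binarization, a standard way to
# turn a continuous metaheuristic position into a 0/1 feature mask.
# This is a generic illustration, not necessarily the scheme BIMGO uses.
import numpy as np

rng = np.random.default_rng(0)

def s_shaped(x):
    """Sigmoid transfer function mapping a real position to a probability."""
    return 1.0 / (1.0 + np.exp(-x))

def binarize(position):
    """Each dimension becomes 1 with probability given by the transfer function."""
    prob = s_shaped(np.asarray(position, dtype=float))
    return (rng.random(prob.shape) < prob).astype(int)

mask = binarize([-2.0, 0.1, 1.7, 3.5])   # yields a 0/1 mask, one bit per dimension
```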
-
68
Comparison in terms of the selected features.
Published 2024 “…In this paper, we present an improved mountain gazelle optimizer (IMGO) based on the newly proposed mountain gazelle optimizer (MGO) and design a binary version of IMGO (BIMGO) to solve the feature selection problem for medical data. …”
-
69
Iterative chart of control factor.
Published 2024 “…In this paper, we present an improved mountain gazelle optimizer (IMGO) based on the newly proposed mountain gazelle optimizer (MGO) and design a binary version of IMGO (BIMGO) to solve the feature selection problem for medical data. …”
-
70
Details of 23 basic benchmark functions.
Published 2024 “…In this paper, we present an improved mountain gazelle optimizer (IMGO) based on the newly proposed mountain gazelle optimizer (MGO) and design a binary version of IMGO (BIMGO) to solve the feature selection problem for medical data. …”
-
71
Related research.
Published 2024 “…In this paper, we present an improved mountain gazelle optimizer (IMGO) based on the newly proposed mountain gazelle optimizer (MGO) and design a binary version of IMGO (BIMGO) to solve the feature selection problem for medical data. …”
-
72
S1 Dataset -
Published 2024 “…In this paper, we present an improved mountain gazelle optimizer (IMGO) based on the newly proposed mountain gazelle optimizer (MGO) and design a binary version of IMGO (BIMGO) to solve the feature selection problem for medical data. …”
-
73
SHAP bar plot.
Published 2025 “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
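This entry and the SHAP summary plot entry below reference SHAP plots for the fitted models. A minimal sketch of how such bar and summary plots are typically produced with the shap library; the gradient-boosting model, synthetic data, and feature names are placeholders, not the study's models or variables:

```python
# Hedged sketch: producing SHAP bar and summary plots for a fitted model.
# The gradient-boosting classifier, synthetic data, and feature names are
# placeholders, not the NNET/RF/LR/SVM models or variables from the study.
import pandas as pd
import shap
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=300, n_features=6, random_state=0)
X = pd.DataFrame(X, columns=[f"feature_{i}" for i in range(6)])
model = GradientBoostingClassifier(random_state=0).fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)               # one row of SHAP values per sample
shap.summary_plot(shap_values, X, plot_type="bar")   # mean |SHAP| bar plot
shap.summary_plot(shap_values, X)                    # beeswarm summary plot
```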
-
74
Sample screening flowchart.
Published 2025 “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
-
75
Descriptive statistics for variables.
Published 2025 “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
-
76
SHAP summary plot.
Published 2025 “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
-
77
ROC curves for the test set of four models.
Published 2025 “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
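The figure in this entry shows ROC curves for four models on the test set, with AUCs of 0.918, 0.889, 0.872, and 0.760 quoted in the snippet. A minimal sketch of computing and plotting ROC curves and AUC for several scikit-learn classifiers; the synthetic data and model settings are placeholders and will not reproduce those AUC values:

```python
# Hedged sketch: ROC curves and AUC for several classifiers on one test set.
# The synthetic data and model choices are placeholders; they will not
# reproduce the AUC values (0.918, 0.889, 0.872, 0.760) quoted above.
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "NNET": MLPClassifier(max_iter=1000, random_state=0),
    "RF": RandomForestClassifier(random_state=0),
    "LR": LogisticRegression(max_iter=1000),
    "SVM": SVC(probability=True, random_state=0),
}
for name, model in models.items():
    prob = model.fit(Xtr, ytr).predict_proba(Xte)[:, 1]
    fpr, tpr, _ = roc_curve(yte, prob)
    plt.plot(fpr, tpr, label=f"{name} (AUC={roc_auc_score(yte, prob):.3f})")

plt.plot([0, 1], [0, 1], "k--")          # chance line
plt.xlabel("False positive rate")
plt.ylabel("True positive rate")
plt.legend()
plt.show()
```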
-
78
Display of the web prediction interface.
Published 2025 “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
-
79
Pseudo Code of RBMO.
Published 2025 “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. The experiments are based on 12 medical datasets, and the results show that IRBMO achieves optimal overall performance in key metrics such as fitness value, classification accuracy and specificity. …”
-
80
P-value on CEC-2017 (Dim = 30).
Published 2025 “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. The experiments are based on 12 medical datasets, and the results show that IRBMO achieves optimal overall performance in key metrics such as fitness value, classification accuracy and specificity. …”
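The "P-value on CEC-2017 (Dim = 30)" figure implies pairwise statistical comparison of optimizers over repeated runs. The snippet does not name the test used; the Wilcoxon rank-sum test is a common choice in such benchmarking, sketched below with placeholder run results:

```python
# Hedged sketch: pairwise Wilcoxon rank-sum test between two optimizers'
# best objective values over independent runs on one benchmark function.
# The run results are placeholders, not the CEC-2017 numbers behind the figure.
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
runs_algo_a = rng.normal(loc=1.0e-3, scale=2.0e-4, size=30)   # 30 runs, algorithm A
runs_algo_b = rng.normal(loc=1.5e-3, scale=2.0e-4, size=30)   # 30 runs, algorithm B

stat, p_value = ranksums(runs_algo_a, runs_algo_b)
print(f"p = {p_value:.4g}, significant at 0.05: {p_value < 0.05}")
```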