Search alternatives:
model optimization » codon optimization, global optimization, based optimization
well optimization » wolf optimization, whale optimization, field optimization
binary based » library based, linac based, binary mask
based well » based cell, based web, based all
can model » cgan model, cnn model, chain model
-
2
DE algorithm flow.
Published 2025: “…In the experiments, optimization metrics such as kinematic optimization rate (calculated based on the shortest path and connectivity between functional areas), space utilization rate (calculated by the ratio of room area to total usable space), and functional fitness (based on the weighted sum of users’ subjective evaluations and functional matches) all perform well. …”
-
3
ROC curves for the test set of four models.
Published 2025: “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
-
4
Test results of different algorithms.
Published 2025: “…In the experiments, optimization metrics such as kinematic optimization rate (calculated based on the shortest path and connectivity between functional areas), space utilization rate (calculated by the ratio of room area to total usable space), and functional fitness (based on the weighted sum of users’ subjective evaluations and functional matches) all perform well. …”
-
5
SHAP bar plot.
Published 2025: “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
-
6
Sample screening flowchart.
Published 2025: “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
-
7
Descriptive statistics for variables.
Published 2025: “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
-
8
SHAP summary plot.
Published 2025: “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
-
9
Display of the web prediction interface.
Published 2025: “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
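Several of the 2025 results above compare NNET, RF, LR, and SVM models by their test-set AUC. As a reminder of what that single number measures, here is a minimal, dependency-free sketch of ROC AUC via the rank-sum (Mann–Whitney U) formulation; the labels and scores are made-up illustration data, not values from any of the cited studies.

```python
def roc_auc(labels, scores):
    """ROC AUC via the rank-sum (Mann-Whitney U) formulation:
    the probability that a randomly chosen positive example is
    scored above a randomly chosen negative one (ties count 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Made-up scores from two hypothetical classifiers on the same labels.
labels = [0, 0, 1, 1]
print(roc_auc(labels, [0.1, 0.4, 0.35, 0.8]))  # 0.75
print(roc_auc(labels, [0.1, 0.2, 0.8, 0.9]))   # 1.0 (perfect ranking)
```

This pairwise definition is equivalent to the area under the ROC curve that the figures above plot, which is why an AUC of 0.918 versus 0.760 can be read directly as a difference in ranking quality.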
-
12
Algorithm for generating hyperparameter.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
13
Results of machine learning algorithm.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
15
ROC comparison of machine learning algorithm.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
16
Parameter settings of the comparison algorithms.
Published 2024: “…In this paper, we present an improved mountain gazelle optimizer (IMGO) based on the newly proposed mountain gazelle optimizer (MGO) and design a binary version of IMGO (BIMGO) to solve the feature selection problem for medical data. …”
-
18
DataSheet_1_Raman Spectroscopic Differentiation of Streptococcus pneumoniae From Other Streptococci Using Laboratory Strains and Clinical Isolates.pdf
Published 2022: “…Improvement of the classification rate is expected with optimized model parameters and algorithms as well as with a larger spectral data base for training.…”
-
19
Best optimizer results of LightGBM.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
20
Best optimizer results of Adaboost.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
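Several of the 2024 results above pair genetic-algorithm hyperparameter tuning with binary grey wolf (or mountain gazelle) feature selection. The sketch below shows only the generic wrapper idea using a plain genetic algorithm over feature bitmasks, not the cited papers’ methods: the fitness function is a toy stand-in for cross-validated model accuracy, and the mask, population size, and mutation rate are invented for illustration.

```python
import random

random.seed(0)

N_FEATURES = 12
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]  # hypothetical "useful feature" mask

def fitness(mask):
    # Stand-in for cross-validated accuracy: reward agreement with the
    # useful-feature mask and lightly penalize mask size, as wrapper
    # feature-selection methods typically do.
    hits = sum(m == t for m, t in zip(mask, TARGET))
    return hits - 0.1 * sum(mask)

def evolve(pop_size=20, generations=40, p_mut=0.05):
    # Random initial population of feature bitmasks.
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_FEATURES)
            child = a[:cut] + b[cut:]           # one-point crossover
            child = [bit ^ (random.random() < p_mut) for bit in child]  # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

In a real wrapper, `fitness` would train and score a model on the features the mask selects, which is what makes these searches expensive and why the cited work also tunes hyperparameters alongside the mask.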