Search alternatives:
codon optimization » wolf optimization
all optimization » art optimization, ai optimization, whale optimization
binary based » library based, linac based, binary mask
lines based » lens based, genes based, lines used
based codon » based color, based cohort, based action
based all » based small, based cell, based ap
-
41
Feature selection process.
Published 2024. “…The results obtained show that the proposed model has superior prediction accuracy in comparison to its counterparts. Moreover, among all the hyperparameter-optimized algorithms, the AdaBoost algorithm outperformed all the other hyperparameter-optimized algorithms. …”
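The snippet above refers to hyperparameter-optimized classifiers, with AdaBoost performing best. The cited study's actual tuning procedure and search space are not given here; the sketch below is only a generic illustration of one common approach (grid search with cross-validation in scikit-learn), using made-up parameter ranges and synthetic placeholder data.

```python
# Illustrative only: hyperparameter tuning of AdaBoost via grid search.
# The parameter grid and data are placeholders, not the cited study's setup.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

param_grid = {
    "n_estimators": [50, 100, 200],     # assumed range
    "learning_rate": [0.01, 0.1, 1.0],  # assumed range
}
search = GridSearchCV(AdaBoostClassifier(random_state=0), param_grid, cv=5)
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("test accuracy:", search.best_estimator_.score(X_test, y_test))
```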
-
42
Results of KNN.
Published 2024. “…The results obtained show that the proposed model has superior prediction accuracy in comparison to its counterparts. Moreover, among all the hyperparameter-optimized algorithms, the AdaBoost algorithm outperformed all the other hyperparameter-optimized algorithms. …”
-
43
After upsampling.
Published 2024. “…The results obtained show that the proposed model has superior prediction accuracy in comparison to its counterparts. Moreover, among all the hyperparameter-optimized algorithms, the AdaBoost algorithm outperformed all the other hyperparameter-optimized algorithms. …”
-
44
Results of Extra tree.
Published 2024. “…The results obtained show that the proposed model has superior prediction accuracy in comparison to its counterparts. Moreover, among all the hyperparameter-optimized algorithms, the AdaBoost algorithm outperformed all the other hyperparameter-optimized algorithms. …”
-
45
Gradient boosting classifier results.
Published 2024. “…The results obtained show that the proposed model has superior prediction accuracy in comparison to its counterparts. Moreover, among all the hyperparameter-optimized algorithms, the AdaBoost algorithm outperformed all the other hyperparameter-optimized algorithms. …”
-
46
Plan frame of the house.
Published 2025. “…In the experiments, optimization metrics such as kinematic optimization rate (calculated based on the shortest path and connectivity between functional areas), space utilization rate (calculated by the ratio of room area to total usable space), and functional fitness (based on the weighted sum of users’ subjective evaluations and functional matches) all perform well. …”
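The entry above quotes how three layout-evaluation metrics are calculated. The cited paper's exact formulas are not reproduced here; the sketch below is a hypothetical reading of two of them (space utilization rate as summed room area over total usable area, and functional fitness as a weighted sum of normalized scores), with invented variable names, weights, and example values.

```python
# Hypothetical reading of the quoted metric definitions; names, weights,
# and example numbers are assumptions, not the cited paper's data.

def space_utilization_rate(room_areas, total_usable_area):
    """Ratio of summed room area to total usable space, as described in the entry."""
    return sum(room_areas) / total_usable_area

def functional_fitness(subjective_scores, functional_matches, weights=(0.5, 0.5)):
    """Weighted sum of users' subjective evaluations and functional matches (both in [0, 1])."""
    w_subj, w_func = weights
    subj = sum(subjective_scores) / len(subjective_scores)
    func = sum(functional_matches) / len(functional_matches)
    return w_subj * subj + w_func * func

print(space_utilization_rate([12.0, 9.5, 15.0], 45.0))       # ≈ 0.811
print(functional_fitness([0.8, 0.9, 0.7], [1.0, 0.5, 1.0]))  # ≈ 0.817
```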
-
47
Ablation test results.
Published 2025. “…In the experiments, optimization metrics such as kinematic optimization rate (calculated based on the shortest path and connectivity between functional areas), space utilization rate (calculated by the ratio of room area to total usable space), and functional fitness (based on the weighted sum of users’ subjective evaluations and functional matches) all perform well. …”
-
48
Hyperparameter selection test.
Published 2025. “…In the experiments, optimization metrics such as kinematic optimization rate (calculated based on the shortest path and connectivity between functional areas), space utilization rate (calculated by the ratio of room area to total usable space), and functional fitness (based on the weighted sum of users’ subjective evaluations and functional matches) all perform well. …”
-
49
Multiple index test results of different methods.
Published 2025. “…In the experiments, optimization metrics such as kinematic optimization rate (calculated based on the shortest path and connectivity between functional areas), space utilization rate (calculated by the ratio of room area to total usable space), and functional fitness (based on the weighted sum of users’ subjective evaluations and functional matches) all perform well. …”
-
50
Backtracking strategy diagram.
Published 2025. “…In the experiments, optimization metrics such as kinematic optimization rate (calculated based on the shortest path and connectivity between functional areas), space utilization rate (calculated by the ratio of room area to total usable space), and functional fitness (based on the weighted sum of users’ subjective evaluations and functional matches) all perform well. …”
-
51
Comparison of differences in literature methods.
Published 2025. “…In the experiments, optimization metrics such as kinematic optimization rate (calculated based on the shortest path and connectivity between functional areas), space utilization rate (calculated by the ratio of room area to total usable space), and functional fitness (based on the weighted sum of users’ subjective evaluations and functional matches) all perform well. …”
-
52
New building interior space layout model flow.
Published 2025. “…In the experiments, optimization metrics such as kinematic optimization rate (calculated based on the shortest path and connectivity between functional areas), space utilization rate (calculated by the ratio of room area to total usable space), and functional fitness (based on the weighted sum of users’ subjective evaluations and functional matches) all perform well. …”
-
53
Schematic of iteration process of IDE-IIGA.
Published 2025. “…In the experiments, optimization metrics such as kinematic optimization rate (calculated based on the shortest path and connectivity between functional areas), space utilization rate (calculated by the ratio of room area to total usable space), and functional fitness (based on the weighted sum of users’ subjective evaluations and functional matches) all perform well. …”
-
54
Schematic diagram of IGA chromosome coding.
Published 2025. “…In the experiments, optimization metrics such as kinematic optimization rate (calculated based on the shortest path and connectivity between functional areas), space utilization rate (calculated by the ratio of room area to total usable space), and functional fitness (based on the weighted sum of users’ subjective evaluations and functional matches) all perform well. …”
-
55
SHAP bar plot.
Published 2025. “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
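The entries in this group report test-set AUCs for NNET, RF, LR, and SVM models. As a hedged illustration of how such a comparison is typically produced, the sketch below fits stand-in scikit-learn models on synthetic data and computes ROC AUC on a held-out set; the models, data, and any resulting numbers are placeholders, not the cited study's pipeline or results.

```python
# Illustrative AUC comparison on synthetic data; models and numbers are placeholders.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=800, n_features=15, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "NNET": MLPClassifier(max_iter=1000, random_state=0),
    "RF": RandomForestClassifier(random_state=0),
    "LR": LogisticRegression(max_iter=1000),
    "SVM": SVC(probability=True, random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    scores = model.predict_proba(X_test)[:, 1]  # probability of the positive class
    print(name, "test-set AUC:", round(roc_auc_score(y_test, scores), 3))
```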
-
56
Sample screening flowchart.
Published 2025. “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
-
57
Descriptive statistics for variables.
Published 2025. “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
-
58
SHAP summary plot.
Published 2025. “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
-
59
ROC curves for the test set of four models.
Published 2025. “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
-
60
Display of the web prediction interface.
Published 2025. “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”