Search alternatives:
spatial optimization » spatial organization, path optimization, swarm optimization
all optimization » art optimization, ai optimization, whale optimization
binary based » library based, linac based, binary mask
based all » based small, based cell, based ap
-
41
Results of Random Forest.
Published 2024: “…The results obtained show that the proposed model has superior prediction accuracy in comparison to its counterparts. Moreover, among all the hyperparameter-optimized algorithms, adaboost algorithm outperformed all the other hyperparameter-optimized algorithms. …”
-
42
Before upsampling.
Published 2024.
-
43
Results of gradient boosting classifier.
Published 2024.
-
44
Results of Decision tree.
Published 2024.
-
45
Adaboost classifier results.
Published 2024.
-
46
Results of LightGBM.
Published 2024.
-
47
Results of LightGBM.
Published 2024.
-
48
Feature selection process.
Published 2024.
-
49
Results of KNN.
Published 2024.
-
50
After upsampling.
Published 2024.
-
51
Results of Extra tree.
Published 2024.
-
52
Gradient boosting classifier results.
Published 2024.
-
53
Data_Sheet_1_Multiclass Classification Based on Combined Motor Imageries.pdf
Published 2020: “…And we propose two new multilabel uses of the Common Spatial Pattern (CSP) algorithm to optimize the signal-to-noise ratio, namely MC2CMI and MC2SMI approaches. …”
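For context, the record above builds on the standard two-class CSP algorithm. Below is a minimal NumPy sketch of that baseline (whitening followed by an eigendecomposition of the whitened class covariance); it does not reproduce the MC2CMI/MC2SMI multilabel variants proposed in the record, and all array and function names here are illustrative.

```python
import numpy as np

def mean_cov(X):
    # Average trace-normalized spatial covariance over trials.
    covs = [x @ x.T / np.trace(x @ x.T) for x in X]
    return np.mean(covs, axis=0)

def csp_filters(X_a, X_b, n_filters=4):
    """Two-class CSP: spatial filters whose projected signals have
    maximally different variance between classes A and B.

    X_a, X_b: arrays of shape (trials, channels, samples).
    Returns an (n_filters, channels) filter matrix.
    """
    C_a, C_b = mean_cov(X_a), mean_cov(X_b)
    # Whiten with respect to the composite covariance C_a + C_b.
    d, U = np.linalg.eigh(C_a + C_b)
    P = U @ np.diag(d ** -0.5) @ U.T
    # Eigen-decompose the whitened class-A covariance: eigenvalues near 1
    # favor class A, eigenvalues near 0 favor class B (ascending order).
    vals, V = np.linalg.eigh(P @ C_a @ P.T)
    W = (P.T @ V).T
    # Take filters from both ends of the eigenvalue spectrum.
    half = n_filters // 2
    picks = list(range(half)) + list(range(-(n_filters - half), 0))
    return W[picks]

# Illustrative use: log-variance features of one trial after filtering.
rng = np.random.default_rng(0)
X_a = rng.standard_normal((20, 8, 250))
X_b = rng.standard_normal((20, 8, 250))
W = csp_filters(X_a, X_b)
features = np.log(np.var(W @ X_a[0], axis=1))
```

The log-variance of each filtered channel is the usual feature vector fed to a downstream classifier in CSP pipelines.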
-
54
SHAP bar plot.
Published 2025: “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
-
55
Sample screening flowchart.
Published 2025.
-
56
Descriptive statistics for variables.
Published 2025.
-
57
SHAP summary plot.
Published 2025.
-
58
ROC curves for the test set of four models.
Published 2025.
-
59
Display of the web prediction interface.
Published 2025.
-
60
A new fast filtering algorithm for a 3D point cloud based on RGB-D information
Published 2019: “…This method aligns the color image to the depth image, and the color mapping image is converted to an HSV image. Then, the optimal segmentation threshold of the V image that is calculated by using the Otsu algorithm is applied to segment the color mapping image into a binary image, which is used to extract the valid point cloud from the original point cloud with outliers. …”
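The thresholding step this abstract describes can be sketched in NumPy. The snippet below implements Otsu's method on a V channel and combines it with a depth-validity test to form the binary mask; `otsu_threshold` and `valid_point_mask` are illustrative names, and the HSV conversion and the actual point-cloud extraction from the depth frame are omitted.

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method on an 8-bit image: pick the threshold that maximizes
    the between-class variance (equivalently, minimizes the intra-class
    variance) of the two resulting pixel classes."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                  # class-0 probability mass
    mu = np.cumsum(p * np.arange(256))    # class-0 cumulative mean
    mu_t = mu[-1]                         # global mean intensity
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.nanargmax(sigma_b))

def valid_point_mask(v_channel, depth):
    # Keep pixels whose V value exceeds the Otsu threshold and whose
    # depth reading is non-zero; the True pixels would feed the
    # point-cloud extraction.
    t = otsu_threshold(v_channel)
    return (v_channel > t) & (depth > 0)

# Synthetic example: a bright rectangle on a dark background.
v = np.zeros((100, 100), np.uint8)
v[20:60, 20:60] = 200
depth = np.full((100, 100), 500, np.uint16)
mask = valid_point_mask(v, depth)
```

With OpenCV available, the same threshold is obtained via `cv2.threshold(v, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)`.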