Search alternatives:
based optimization » whale optimization
wolf optimization » whale optimization, swarm optimization, _ optimization
binary sample » final sample, binary people, intra sample
binary based » library based, linac based, binary mask
sample based » samples based, scale based
based wolf » based whole, based work, based well
-
41
Feature selection process.
Published 2024 “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using the binary grey wolf optimization algorithm. …”
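As a hedged illustration of the feature-selection half of this approach, the sketch below implements a generic wrapper-style binary grey wolf optimizer (BGWO). The dataset, KNN-based fitness, sigmoid transfer function, and all parameter values are stand-ins rather than the paper's settings, and the genetic-algorithm hyperparameter step is not shown.

```python
# Minimal BGWO sketch for feature selection (illustrative; not the paper's code).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)           # stand-in dataset
n_features = X.shape[1]

def fitness(mask):
    """Higher is better: CV accuracy minus a small penalty for kept features."""
    if mask.sum() == 0:
        return -np.inf
    acc = cross_val_score(KNeighborsClassifier(), X[:, mask == 1], y, cv=3).mean()
    return acc - 0.01 * mask.sum() / n_features

def bgwo(n_wolves=8, n_iter=20):
    wolves = rng.integers(0, 2, size=(n_wolves, n_features))
    scores = np.array([fitness(w) for w in wolves])
    for t in range(n_iter):
        a = 2 - 2 * t / n_iter                        # exploration coefficient decays to 0
        alpha, beta, delta = wolves[np.argsort(scores)[::-1][:3]]
        for i in range(n_wolves):
            steps = []
            for leader in (alpha, beta, delta):       # move toward the three best wolves
                A = a * (2 * rng.random(n_features) - 1)
                C = 2 * rng.random(n_features)
                D = np.abs(C * leader - wolves[i])
                steps.append(leader - A * D)
            x_cont = np.mean(steps, axis=0)           # continuous GWO position
            prob = 1 / (1 + np.exp(-10 * (x_cont - 0.5)))   # sigmoid transfer (assumed)
            candidate = (rng.random(n_features) < prob).astype(int)
            s = fitness(candidate)
            if s > scores[i]:                         # greedy replacement
                wolves[i], scores[i] = candidate, s
    best = int(np.argmax(scores))
    return wolves[best], scores[best]

mask, score = bgwo()
print(f"kept {mask.sum()}/{n_features} features, fitness {score:.3f}")
```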
-
42
Results of KNN.
Published 2024 “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using the binary grey wolf optimization algorithm. …”
-
43
After upsampling.
Published 2024 “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using the binary grey wolf optimization algorithm. …”
-
44
Results of Extra tree.
Published 2024 “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using the binary grey wolf optimization algorithm. …”
-
45
Gradient boosting classifier results.
Published 2024 “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using the binary grey wolf optimization algorithm. …”
-
46
-
47
Identification and quantitation of clinically relevant microbes in patient samples: Comparison of three k-mer based classifiers for speed, accuracy, and sensitivity
Published 2019 “…We tested the accuracy, sensitivity, and resource requirements of three top metagenomic taxonomic classifiers that use fast k-mer based algorithms: Centrifuge, CLARK, and KrakenUniq. …”
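To make the shared idea behind these tools concrete, here is a toy k-mer lookup classifier; it is drastically simplified (no compact index, minimizers, or LCA resolution), and the reference fragments and read are invented for the example.

```python
# Toy illustration of k-mer based taxonomic classification.
from collections import Counter

K = 5  # tiny k for readability; real classifiers use k around 31

references = {               # hypothetical reference genome fragments
    "E. coli":   "ATGCGTACGTTAGCGTACGATCGTACGTTAGC",
    "S. aureus": "TTGACCGATCGGATCCTAGGCTAGCTAGGATC",
}

def kmers(seq, k=K):
    return (seq[i:i + k] for i in range(len(seq) - k + 1))

# Index: k-mer -> set of taxa containing it
index = {}
for taxon, seq in references.items():
    for km in kmers(seq):
        index.setdefault(km, set()).add(taxon)

def classify(read):
    """Vote over the taxa hit by the read's k-mers; 'unclassified' if no hits."""
    votes = Counter()
    for km in kmers(read):
        for taxon in index.get(km, ()):
            votes[taxon] += 1
    return votes.most_common(1)[0][0] if votes else "unclassified"

print(classify("GCGTACGTTAGCGTACG"))   # matches E. coli k-mers -> "E. coli"
```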
-
48
-
49
-
50
Triplet Matching for Estimating Causal Effects With Three Treatment Arms: A Comparative Study of Mortality by Trauma Center Level
Published 2021 “…Our algorithm outperforms the nearest neighbor algorithm and is shown to produce matched samples with total distance no larger than twice the optimal distance. …”
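For context, the sketch below shows only the greedy nearest-neighbor baseline for forming triplets across three arms (the comparator mentioned in the snippet), not the paper's near-optimal matching algorithm; the covariates and arm sizes are synthetic.

```python
# Greedy nearest-neighbor triplet formation across three treatment arms.
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(1)
arm_a = rng.normal(0.0, 1.0, size=(10, 3))   # e.g., one trauma center level
arm_b = rng.normal(0.2, 1.0, size=(15, 3))   # second level
arm_c = rng.normal(0.4, 1.0, size=(20, 3))   # third level

def greedy_triplets(a, b, c):
    """For each unit in the smallest arm, grab its nearest unused unit in the
    other two arms. Greedy, so the total within-triplet distance can be far
    from optimal -- which is what optimal/triplet matching methods improve on."""
    d_ab, d_ac = cdist(a, b), cdist(a, c)
    used_b, used_c, triplets = set(), set(), []
    for i in range(len(a)):
        j = min((j for j in range(len(b)) if j not in used_b), key=lambda j: d_ab[i, j])
        k = min((k for k in range(len(c)) if k not in used_c), key=lambda k: d_ac[i, k])
        used_b.add(j)
        used_c.add(k)
        triplets.append((i, j, k))
    total = sum(d_ab[i, j] + d_ac[i, k] for i, j, k in triplets)
    return triplets, total

triplets, total = greedy_triplets(arm_a, arm_b, arm_c)
print(f"{len(triplets)} triplets, total matched distance {total:.2f}")
```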
-
51
-
52
-
53
Testing results for classifying AD, MCI and NC.
Published 2024 “…Specifically, an image enhancement algorithm based on histogram equalization and bilateral filtering techniques was deployed to reduce noise and enhance the quality of the images. …”
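A minimal sketch of the two enhancement steps named in the snippet, assuming OpenCV, a grayscale input, and arbitrary filter parameters (the file name is hypothetical):

```python
# Histogram equalization followed by bilateral filtering (illustrative settings).
import cv2

img = cv2.imread("mri_slice.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input file
assert img is not None, "image not found"

# Histogram equalization spreads intensities to improve global contrast.
equalized = cv2.equalizeHist(img)

# Bilateral filtering smooths noise while preserving edges
# (d = neighborhood diameter, sigmaColor / sigmaSpace control the smoothing).
enhanced = cv2.bilateralFilter(equalized, d=9, sigmaColor=75, sigmaSpace=75)

cv2.imwrite("mri_slice_enhanced.png", enhanced)
```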
-
54
Summary of existing CNN models.
Published 2024 “…Specifically, an image enhancement algorithm based on histogram equalization and bilateral filtering techniques was deployed to reduce noise and enhance the quality of the images. …”
-
55
Parameter settings.
Published 2024 “…Differential Evolution (DE) is widely recognized as a highly effective evolutionary algorithm for global optimization. It has proven its efficacy in tackling diverse problems across various fields and real-world applications. …”
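As a quick illustration of plain differential evolution (SciPy's stock implementation on the Rastrigin benchmark, not the variant proposed in this record):

```python
# Differential evolution on a standard multimodal benchmark.
import numpy as np
from scipy.optimize import differential_evolution

def rastrigin(x):
    # Global minimum 0 at x = 0; many local minima make it a common DE test.
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

bounds = [(-5.12, 5.12)] * 5
result = differential_evolution(rastrigin, bounds, strategy="best1bin",
                                mutation=(0.5, 1.0), recombination=0.7, seed=0)
print(f"minimum {result.fun:.4f} at x = {np.round(result.x, 3)}")
```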
-
56
-
57
Data_Sheet_1_A real-time driver fatigue identification method based on GA-GRNN.ZIP
Published 2022 “…In this paper, a non-invasive and low-cost method of fatigue driving state identification based on genetic algorithm optimization of a generalized regression neural network model is proposed. …”
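A rough sketch of the GA-GRNN idea: a generalized regression neural network is essentially a Gaussian-kernel regressor with one smoothing parameter, which a small genetic algorithm can tune. The synthetic data and the tiny GA below are placeholders, not the authors' configuration.

```python
# GRNN = Gaussian-kernel weighted average of training targets; a tiny GA tunes sigma.
import numpy as np

rng = np.random.default_rng(0)
X_train = rng.uniform(-3, 3, size=(200, 2))                    # stand-in features
y_train = np.sin(X_train[:, 0]) + 0.1 * rng.normal(size=200)   # stand-in target
X_val = rng.uniform(-3, 3, size=(60, 2))
y_val = np.sin(X_val[:, 0])

def grnn_predict(X, sigma):
    d2 = ((X[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return (w @ y_train) / (w.sum(axis=1) + 1e-12)

def mse(sigma):
    return float(np.mean((grnn_predict(X_val, sigma) - y_val) ** 2))

# Tiny real-coded GA over sigma: size-2 tournament selection + Gaussian mutation.
pop = rng.uniform(0.1, 2.0, size=20)
for _ in range(30):
    fit = np.array([mse(s) for s in pop])
    winners = [min(rng.integers(0, 20, size=2), key=lambda i: fit[i]) for _ in range(20)]
    pop = np.clip(pop[winners] + rng.normal(0, 0.1, size=20), 0.05, 5.0)

best = min(pop, key=mse)
print(f"GA-selected sigma = {best:.3f}, validation MSE = {mse(best):.4f}")
```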
-
58
Table_1_An efficient decision support system for leukemia identification utilizing nature-inspired deep feature optimization.pdf
Published 2024 “…Next, a hybrid feature extraction approach is presented leveraging transfer learning from selected deep neural network models, InceptionV3 and DenseNet201, to extract comprehensive feature sets. To optimize feature selection, a customized binary Grey Wolf Algorithm is utilized, achieving an impressive 80% reduction in feature size while preserving key discriminative information. …”
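The sketch below illustrates only the deep feature-extraction step: frozen pretrained InceptionV3 and DenseNet201 backbones whose globally pooled outputs are concatenated. The input size, pooling choice, and dummy images are assumptions; the binary Grey Wolf selection step is sketched under record 41 above.

```python
# Frozen pretrained backbones as feature extractors; pooled outputs concatenated.
import numpy as np
from tensorflow.keras.applications import InceptionV3, DenseNet201
from tensorflow.keras.applications.inception_v3 import preprocess_input as prep_inc
from tensorflow.keras.applications.densenet import preprocess_input as prep_dense

inc = InceptionV3(weights="imagenet", include_top=False, pooling="avg")       # 2048-d
dense = DenseNet201(weights="imagenet", include_top=False, pooling="avg")     # 1920-d

def extract_features(images):
    """images: float array of shape (n, 299, 299, 3) with values in [0, 255]."""
    f1 = inc.predict(prep_inc(images.copy()), verbose=0)
    f2 = dense.predict(prep_dense(images.copy()), verbose=0)
    return np.concatenate([f1, f2], axis=1)       # (n, 3968) hybrid feature matrix

features = extract_features(np.random.rand(4, 299, 299, 3) * 255)   # dummy images
print(features.shape)
```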
-
59
-
60
Supplementary file 1_Comparative evaluation of fast-learning classification algorithms for urban forest tree species identification using EO-1 hyperion hyperspectral imagery.docx
Published 2025 “…Methods: Thirteen supervised classification algorithms were comparatively evaluated, encompassing traditional spectral/statistical classifiers—Maximum Likelihood, Mahalanobis Distance, Minimum Distance, Parallelepiped, Spectral Angle Mapper (SAM), Spectral Information Divergence (SID), and Binary Encoding—and machine learning algorithms including Decision Tree (DT), K-Nearest Neighbor (KNN), Support Vector Machine (SVM), Random Forest (RF), and Artificial Neural Network (ANN). …”
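As a hedged sketch of the comparative-evaluation workflow for the machine-learning classifiers named here (DT, KNN, SVM, RF, ANN), using synthetic stand-in pixel spectra instead of the Hyperion data:

```python
# Cross-validated comparison of several supervised classifiers on stand-in spectra.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier

# Stand-in for labeled per-pixel spectra: 150 bands, 6 tree-species classes.
X, y = make_classification(n_samples=1200, n_features=150, n_informative=30,
                           n_classes=6, random_state=0)

classifiers = {
    "DT":  DecisionTreeClassifier(random_state=0),
    "KNN": KNeighborsClassifier(),
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "RF":  RandomForestClassifier(n_estimators=200, random_state=0),
    "ANN": make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000, random_state=0)),
}

for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    print(f"{name}: {acc.mean():.3f} +/- {acc.std():.3f}")
```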