Search alternatives:
processing optimization » process optimization, process optimisation, routing optimization
wolf optimization » whale optimization, swarm optimization, _ optimization
sample processing » image processing, waste processing, pre processing
final sample » fecal samples, total sample
-
1
Optimized process of the random forest algorithm.
Published 2023 “…Then, a training set is randomly selected from known coal mine samples, and the training sample set is processed and analyzed using Matlab software. …”
-
2
The ANFIS algorithm details.
Published 2025 “…Finally, in the FGP stage, optimization and purchase amount of each share was done. …”
-
4
The Search process of the genetic algorithm.
Published 2024 “…Firstly, the dataset was balanced using various sampling methods; secondly, a Stacking model based on GA-XGBoost (XGBoost model optimized by genetic algorithm) was constructed for the risk prediction of diabetes; finally, the interpretability of the model was deeply analyzed using Shapley values. …”
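A minimal sketch of the GA-XGBoost idea summarised in this record: a small genetic algorithm searches over XGBoost hyperparameters and the tuned model is then used as a base learner in a stacking classifier. The parameter ranges, GA settings, stand-in dataset and logistic-regression meta-learner are illustrative assumptions, not details taken from the paper.

    # Hedged sketch: GA search over XGBoost hyperparameters, then a stacking model.
    import random
    from xgboost import XGBClassifier
    from sklearn.datasets import make_classification
    from sklearn.ensemble import StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=500, n_features=20, random_state=0)  # stand-in data

    def random_genome():
        # genome = (max_depth, learning_rate, n_estimators)
        return [random.randint(2, 10), random.uniform(0.01, 0.3), random.randint(50, 400)]

    def fitness(genome):
        depth, lr, n_est = genome
        model = XGBClassifier(max_depth=depth, learning_rate=lr, n_estimators=n_est)
        return cross_val_score(model, X, y, cv=3, scoring="roc_auc").mean()

    def crossover(a, b):
        cut = random.randint(1, len(a) - 1)
        return a[:cut] + b[cut:]

    def mutate(genome, rate=0.2):
        fresh = random_genome()
        return [fresh[i] if random.random() < rate else genome[i] for i in range(len(genome))]

    population = [random_genome() for _ in range(10)]
    for _ in range(5):                                   # a few GA generations
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[:4]                             # keep the fittest genomes
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(len(population) - len(parents))]
        population = parents + children

    best = max(population, key=fitness)
    ga_xgb = XGBClassifier(max_depth=best[0], learning_rate=best[1], n_estimators=best[2])

    # Use the GA-tuned XGBoost as a base learner of a simple stacking model.
    stack = StackingClassifier(estimators=[("ga_xgb", ga_xgb)],
                               final_estimator=LogisticRegression(max_iter=1000))
    print("stacked CV AUC:", cross_val_score(stack, X, y, cv=3, scoring="roc_auc").mean())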
-
13
Melanoma Skin Cancer Detection Using Deep Learning Methods and Binary GWO Algorithm
Published 2025 “…In this work, we propose a novel framework that integrates Convolutional Neural Networks (CNNs) for image classification and a binary Grey Wolf Optimization (GWO) algorithm for feature selection. …”
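A minimal sketch of binary Grey Wolf Optimization for feature selection, in the spirit of the framework this record describes. The sigmoid transfer function, the k-NN wrapper used as the fitness evaluator, and the stand-in tabular dataset (used here in place of CNN-extracted image features) are illustrative assumptions.

    # Hedged sketch: binary GWO wrapper for feature selection.
    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    X, y = load_breast_cancer(return_X_y=True)   # stand-in for extracted features
    n_features = X.shape[1]
    rng = np.random.default_rng(0)

    def fitness(mask):
        # Reward accuracy, lightly penalize large feature subsets.
        if mask.sum() == 0:
            return 0.0
        acc = cross_val_score(KNeighborsClassifier(), X[:, mask == 1], y, cv=3).mean()
        return 0.99 * acc + 0.01 * (1 - mask.sum() / n_features)

    n_wolves, n_iter = 8, 20
    wolves = rng.integers(0, 2, size=(n_wolves, n_features))

    for t in range(n_iter):
        scores = np.array([fitness(w) for w in wolves])
        order = np.argsort(scores)[::-1]
        alpha, beta, delta = wolves[order[:3]]           # three best wolves lead the pack
        a = 2 - 2 * t / n_iter                           # control parameter decays 2 -> 0
        for i in range(n_wolves):
            steps = []
            for leader in (alpha, beta, delta):
                A = 2 * a * rng.random(n_features) - a
                C = 2 * rng.random(n_features)
                D = np.abs(C * leader - wolves[i])
                steps.append(leader - A * D)
            cont = np.mean(steps, axis=0)                # continuous candidate position
            prob = 1 / (1 + np.exp(-10 * (cont - 0.5)))  # sigmoid transfer to [0, 1]
            wolves[i] = (prob > rng.random(n_features)).astype(int)

    best = max(wolves, key=fitness)
    print("selected feature indices:", np.flatnonzero(best))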
-
15
Algorithm for generating hyperparameter.
Published 2024 “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
16
Construction process of RF.
Published 2025 “…Finally, an improved RF model is constructed by optimizing the parameters of the RF algorithm. …”
-
17
Results of machine learning algorithm.
Published 2024 “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
18
Genetic algorithm flowchart.
Published 2024 “…Firstly, the dataset was balanced using various sampling methods; secondly, a Stacking model based on GA-XGBoost (XGBoost model optimized by genetic algorithm) was constructed for the risk prediction of diabetes; finally, the interpretability of the model was deeply analyzed using Shapley values. …”
-
19
Improved random forest algorithm.
Published 2025 “…Additionally, considering the imbalance in population spatial distribution, we used the K-means++ clustering algorithm to cluster the optimal feature subset, and we used the bootstrap sampling method to extract the same amount of data from each cluster and fuse it with the training subset to build an improved random forest model. …”
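A minimal sketch of the cluster-balanced resampling idea in this record: K-means++ clusters the training samples, an equal-sized bootstrap is drawn from each cluster and fused with the training subset, and a random forest is fit on the result. The cluster count, bootstrap size, regression task and synthetic stand-in data are illustrative assumptions, not the paper's settings.

    # Hedged sketch: K-means++ clustering + per-cluster bootstrap + random forest.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    X, y = make_regression(n_samples=600, n_features=12, noise=0.3, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    rng = np.random.default_rng(0)

    # 1. Cluster the training samples with k-means++ initialisation.
    k = 5
    clusters = KMeans(n_clusters=k, init="k-means++", n_init=10, random_state=0).fit_predict(X_tr)

    # 2. Bootstrap the same number of samples from every cluster to offset
    #    the uneven spatial distribution of the data.
    per_cluster = 100
    idx = np.concatenate([
        rng.choice(np.flatnonzero(clusters == c), size=per_cluster, replace=True)
        for c in range(k)
    ])

    # 3. Fuse the cluster-balanced bootstrap with the original training subset
    #    and fit the random forest on the result.
    X_fused = np.vstack([X_tr, X_tr[idx]])
    y_fused = np.concatenate([y_tr, y_tr[idx]])
    rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_fused, y_fused)
    print("held-out R^2:", round(rf.score(X_te, y_te), 3))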
-
20
K-means++ clustering algorithm.
Published 2025 “…Additionally, considering the imbalance in population spatial distribution, we used the K-means++ clustering algorithm to cluster the optimal feature subset, and we used the bootstrap sampling method to extract the same amount of data from each cluster and fuse it with the training subset to build an improved random forest model. …”