Search alternatives:
complex optimization » convex optimization, whale optimization, wolf optimization
model optimization » codon optimization, global optimization, based optimization
based complex » layer complex
binary based » library based, linac based, binary mask
lens » less
21. Best optimizer results of KNN.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
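The record above pairs genetic-algorithm hyperparameter tuning with binary grey wolf optimization (BGWO) for feature selection. As an illustration only, the sketch below shows a minimal BGWO-style feature-selection loop around a KNN classifier; the fitness function, penalty weight, population size, and synthetic data are assumptions chosen for brevity, not the cited study's setup.

```python
# Illustrative BGWO-style feature selection, NOT the paper's code.
# Fitness = (1 - CV accuracy of KNN) + a small penalty on subset size.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=30, n_informative=8, random_state=0)

def fitness(mask):
    if mask.sum() == 0:                          # empty feature subsets are invalid
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(), X[:, mask == 1], y, cv=3).mean()
    return (1 - acc) + 0.01 * mask.mean()        # classification error + feature-count penalty

n_wolves, n_dim, n_iter = 10, X.shape[1], 30
pos = rng.random((n_wolves, n_dim))              # continuous wolf positions in [0, 1]
masks = (pos > 0.5).astype(int)
fit = np.array([fitness(m) for m in masks])

for t in range(n_iter):
    order = np.argsort(fit)
    alpha, beta, delta = pos[order[:3]]          # three best wolves lead the pack
    a = 2 - 2 * t / n_iter                       # exploration factor decays to 0
    for i in range(n_wolves):
        x_new = np.zeros(n_dim)
        for leader in (alpha, beta, delta):
            A = 2 * a * rng.random(n_dim) - a
            C = 2 * rng.random(n_dim)
            x_new += leader - A * np.abs(C * leader - pos[i])
        pos[i] = np.clip(x_new / 3, 0, 1)
        # sigmoid transfer turns the continuous position into a 0/1 feature mask
        masks[i] = (1 / (1 + np.exp(-10 * (pos[i] - 0.5))) > rng.random(n_dim)).astype(int)
        fit[i] = fitness(masks[i])

best = masks[np.argmin(fit)]
print("selected features:", np.flatnonzero(best))
```

The sigmoid threshold in the inner loop is the step that makes the otherwise continuous grey wolf update usable for binary feature selection.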
22. Best optimizer results of KNN.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”

23. Ensemble model architecture.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”

24. QSAR model for predicting neuraminidase inhibitors of influenza A viruses (H1N1) based on adaptive grasshopper optimization algorithm
Published 2020: “…The binary grasshopper optimization algorithm (BGOA) is a new meta-heuristic optimization algorithm, which has been used successfully to perform feature selection. …”
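As with BGWO above, the binary grasshopper optimization algorithm (BGOA) selects features by mapping each agent's continuous position to a 0/1 mask through a transfer function. A minimal sketch of one common S-shaped transfer step follows; the threshold-against-random rule and the values shown are illustrative, not BGOA's exact published update.

```python
# Minimal sketch of the S-shaped transfer step used by many binary
# metaheuristics (e.g. binary grasshopper / grey wolf variants) to turn a
# continuous position vector into a 0/1 feature-selection mask.
import numpy as np

def binarize(position, rng):
    prob = 1.0 / (1.0 + np.exp(-position))       # S-shaped transfer into [0, 1]
    return (prob > rng.random(position.shape)).astype(int)

rng = np.random.default_rng(42)
position = rng.normal(size=8)                    # continuous position of one agent
mask = binarize(position, rng)
print(position.round(2), "->", mask)             # 1 = feature kept, 0 = feature dropped
```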
25. Comparison table of the proposed model.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”

26. Confusion matrix of ensemble model.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”

27. Best optimizer results of Decision tree.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”

28. Best optimizer result for Adaboost classifier.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”

29. Best optimizer results for random forest.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”

30. Best optimizer results of Decision tree.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”

31. Best optimizer results of Extra tree.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”

32. Best optimizer results of Random Forest.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”

33. Best optimizer result for Extra tree.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
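The repeated 2024 snippet also mentions hyperparameter optimization with a genetic algorithm. The sketch below illustrates GA-style tuning of a KNN classifier; the search space (n_neighbors and weights), population size, and operators are assumptions chosen for brevity rather than the study's configuration.

```python
# Illustrative genetic-algorithm hyperparameter tuning for KNN, NOT the
# study's implementation: search space, population size, and operators here
# are simplified assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=300, n_features=20, random_state=1)
K_CHOICES = np.arange(1, 32)                     # candidate n_neighbors values
WEIGHTS = ["uniform", "distance"]

def score(ind):
    k, w = ind
    model = KNeighborsClassifier(n_neighbors=int(k), weights=WEIGHTS[int(w)])
    return cross_val_score(model, X, y, cv=3).mean()

pop = [(rng.choice(K_CHOICES), rng.integers(2)) for _ in range(12)]
for gen in range(10):
    fits = np.array([score(ind) for ind in pop])
    parents = [pop[i] for i in np.argsort(fits)[-6:]]    # truncation selection
    children = []
    while len(children) < 12:
        a, b = (parents[i] for i in rng.integers(len(parents), size=2))
        child = [a[0], b[1]]                             # one-point crossover
        if rng.random() < 0.3:                           # mutate n_neighbors
            child[0] = rng.choice(K_CHOICES)
        children.append(tuple(child))
    pop = children

best = max(pop, key=score)
print("best n_neighbors:", int(best[0]), "weights:", WEIGHTS[int(best[1])])
```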
37. Effects of Class Imbalance and Data Scarcity on the Performance of Binary Classification Machine Learning Models Developed Based on ToxCast/Tox21 Assay Data
Published 2022: “…Therefore, the resampling algorithm employed should vary depending on the data distribution to achieve optimal classification performance. …”
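Record 37 argues that the resampling scheme should match the data distribution. For context, the sketch below shows plain random oversampling, one of the simplest resampling options; the class ratio and data are synthetic, and a real study would compare several schemes (undersampling, oversampling, SMOTE, and so on).

```python
# Minimal random-oversampling sketch for an imbalanced binary data set,
# shown only to illustrate the resampling idea discussed above.
import numpy as np

def random_oversample(X, y, rng):
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()                        # grow every class to the majority size
    keep = [np.flatnonzero(y == c) for c in classes]
    extra = [rng.choice(idx, size=target - len(idx), replace=True)
             for idx in keep if len(idx) < target]
    idx_all = np.concatenate(keep + extra) if extra else np.concatenate(keep)
    return X[idx_all], y[idx_all]

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = (rng.random(100) < 0.1).astype(int)          # roughly 10% positives: imbalanced
X_bal, y_bal = random_oversample(X, y, rng)
print(np.bincount(y), "->", np.bincount(y_bal))  # class counts before and after
```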
38. -value on CEC2022 (dim = 20).
Published 2025: “…The algorithm integrates three key strategies: a precise population elimination strategy, which optimizes the population structure by eliminating individuals with low fitness and intelligently generating new ones; a lens imaging-based opposition learning strategy, which expands the exploration of the solution space through reflection and scaling to reduce the risk of local optima; and a boundary control strategy based on the best individual, which effectively constrains the search range to avoid inefficient searches and premature convergence. …”

39. Precision elimination strategy.
Published 2025: “…The algorithm integrates three key strategies: a precise population elimination strategy, which optimizes the population structure by eliminating individuals with low fitness and intelligently generating new ones; a lens imaging-based opposition learning strategy, which expands the exploration of the solution space through reflection and scaling to reduce the risk of local optima; and a boundary control strategy based on the best individual, which effectively constrains the search range to avoid inefficient searches and premature convergence. …”

40. Results of low-light image enhancement test.
Published 2025: “…The algorithm integrates three key strategies: a precise population elimination strategy, which optimizes the population structure by eliminating individuals with low fitness and intelligently generating new ones; a lens imaging-based opposition learning strategy, which expands the exploration of the solution space through reflection and scaling to reduce the risk of local optima; and a boundary control strategy based on the best individual, which effectively constrains the search range to avoid inefficient searches and premature convergence. …”
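Records 38, 39, and 40 describe a lens imaging-based opposition learning strategy. The sketch below implements the form of this operator commonly seen in the metaheuristics literature, x* = (a + b)/2 + (a + b)/(2k) - x/k, which reduces to plain opposition a + b - x when k = 1; the scaling factor and bounds are illustrative, and the cited paper's exact k schedule is not reproduced here.

```python
# Sketch of lens-imaging opposition-based learning as it commonly appears in
# the metaheuristics literature: each coordinate x in [a, b] is reflected
# through the midpoint of the search range and rescaled by a "lens" factor k.
import numpy as np

def lens_opposition(x, lower, upper, k=2.0):
    mid = (lower + upper) / 2.0
    return mid + mid / k - x / k                 # reflected, scaled candidate solution

rng = np.random.default_rng(0)
lower, upper = np.zeros(5), np.ones(5) * 10.0
x = rng.uniform(lower, upper)
x_opp = np.clip(lens_opposition(x, lower, upper), lower, upper)
print(x.round(2), "->", x_opp.round(2))
```

Generating such a reflected candidate for each solution and keeping whichever of the pair has better fitness is the usual way this operator is used to escape local optima.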