Search alternatives:
design optimization » bayesian optimization
model optimization » codon optimization, global optimization, based optimization
binary based » library based, linac based, binary mask
binary b » binary _
b model » _ model, a model, 2 model
Results 81–93 (Published 2024), all from the same study: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using a binary grey wolf optimization algorithm. …”
81. Results of AdaBoost.
82. Results of Random Forest.
83. Before upsampling.
84. Results of the gradient boosting classifier.
85. Results of Decision Tree.
86. AdaBoost classifier results.
87. Results of LightGBM.
88. Results of LightGBM.
89. Feature selection process.
90. Results of KNN.
91. After upsampling.
92. Results of Extra Trees.
93. Gradient boosting classifier results.
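The snippet above repeatedly cites binary grey wolf optimization (bGWO) for feature selection. A minimal, self-contained sketch of the idea follows; the fitness function, feature count, and sigmoid transfer parameters are illustrative assumptions, not the paper's actual setup, which would score a classifier on each candidate feature subset.

```python
# Sketch of binary grey wolf optimization (bGWO) for feature selection.
# The fitness below is a stand-in that rewards a known "relevant" subset;
# in practice it would be a classifier's validation score.
import math
import random

random.seed(42)

N_FEATURES = 12
RELEVANT = {0, 3, 5, 8}  # hypothetical ground-truth subset

def fitness(mask):
    """Higher is better: reward relevant features, penalise extras."""
    chosen = {i for i, bit in enumerate(mask) if bit}
    return len(chosen & RELEVANT) - 0.5 * len(chosen - RELEVANT)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-max(-60.0, min(60.0, x))))

def bgwo(n_wolves=10, n_iter=60):
    wolves = [[random.randint(0, 1) for _ in range(N_FEATURES)]
              for _ in range(n_wolves)]
    for t in range(n_iter):
        a = 2.0 * (1 - t / n_iter)  # exploration factor shrinks to 0
        ranked = sorted(wolves, key=fitness, reverse=True)
        # Snapshot the three leaders so in-place updates don't mutate them.
        alpha, beta, delta = (list(w) for w in ranked[:3])
        for w in wolves:
            for d in range(N_FEATURES):
                step = 0.0
                for leader in (alpha, beta, delta):
                    A = a * (2 * random.random() - 1)
                    C = 2 * random.random()
                    D = abs(C * leader[d] - w[d])
                    step += (leader[d] - A * D) / 3.0
                # Sigmoid transfer maps the continuous update back to a
                # 0/1 decision for feature d.
                w[d] = 1 if random.random() < sigmoid(10 * (step - 0.5)) else 0
    return max(wolves, key=fitness)

selected = bgwo()
print([i for i, bit in enumerate(selected) if bit])
```

The sigmoid transfer function is the standard trick for adapting a continuous metaheuristic to a binary search space: each dimension's continuous update is squashed to a probability of selecting that feature.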
Results 96–98 (Published 2025), all from the same study: “…A sensitivity analysis of key RFD parameters, including frictional moment and rigid beam length, highlights their influence on seismic performance. The optimization problem is formulated based on the seismic energy dissipation concept, employing a modified binary and real-coded particle swarm optimization (BRPSO) algorithm. …”
96. The functioning of BRPSO.
97. Characteristics of 6- and 10-story SMRF [99,98].
98. The RFD’s behavior mechanism (2002).
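The snippet describes a mixed binary and real-coded PSO; the paper's specific modifications and structural objective are not shown, so the following is only a generic sketch of how such a hybrid encoding can work. The binary block might encode which positions receive a damper and the real block a design value per position; the cost function, bounds, and coefficients here are toy assumptions.

```python
# Generic sketch of a mixed binary/real-coded PSO in the spirit of the
# BRPSO mentioned above. Real dimensions use the standard PSO update;
# binary dimensions pass their velocity through a sigmoid transfer.
import math
import random

random.seed(0)

N = 6            # number of candidate positions (hypothetical)
TARGET = 2.0     # toy optimum for the real-valued design variables

def cost(bits, vals):
    # Toy cost: prefer exactly three active positions with values near
    # TARGET. A real application would run a structural analysis here.
    err = sum((v - TARGET) ** 2 for b, v in zip(bits, vals) if b)
    return abs(sum(bits) - 3) + err

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-max(-60.0, min(60.0, x))))

def brpso_sketch(n_particles=15, n_iter=100, w=0.7, c1=1.5, c2=1.5):
    parts = []
    for _ in range(n_particles):
        bits = [random.randint(0, 1) for _ in range(N)]
        vals = [random.uniform(0.0, 4.0) for _ in range(N)]
        parts.append({"b": bits, "r": vals,
                      "vb": [0.0] * N, "vr": [0.0] * N,
                      "pb": (list(bits), list(vals))})
    gbest = min((p["pb"] for p in parts), key=lambda s: cost(*s))
    for _ in range(n_iter):
        for p in parts:
            for d in range(N):
                r1, r2 = random.random(), random.random()
                # Real-coded block: standard velocity/position update.
                p["vr"][d] = (w * p["vr"][d]
                              + c1 * r1 * (p["pb"][1][d] - p["r"][d])
                              + c2 * r2 * (gbest[1][d] - p["r"][d]))
                p["r"][d] += p["vr"][d]
                # Binary block: velocity through a sigmoid transfer.
                p["vb"][d] = (w * p["vb"][d]
                              + c1 * r1 * (p["pb"][0][d] - p["b"][d])
                              + c2 * r2 * (gbest[0][d] - p["b"][d]))
                p["b"][d] = 1 if random.random() < sigmoid(p["vb"][d]) else 0
            if cost(p["b"], p["r"]) < cost(*p["pb"]):
                p["pb"] = (list(p["b"]), list(p["r"]))
        gbest = min([gbest] + [p["pb"] for p in parts],
                    key=lambda s: cost(*s))
    return gbest

bits, vals = brpso_sketch()
print(bits, [round(v, 2) for v in vals])
```

Keeping the two encodings in one particle lets a single swarm co-optimize placement (binary) and sizing (real) instead of nesting two separate searches.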
Results 99–100 (Published 2025), both from the same study: “…The contributions include developing a baseline Convolutional Neural Network (CNN) that achieves an initial accuracy of 86.29%, surpassing existing state-of-the-art deep learning models. The binary variant of OcOA (bOcOA) is further integrated for effective feature selection, which reduces the average classification error to 0.4237 and increases CNN accuracy to 93.48%. …”
99. Classification baseline performance.
100. Feature selection results.