-
81
Before upsampling.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
82
Results of gradient boosting classifier.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
83
Results of decision tree.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
84
AdaBoost classifier results.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
85
Results of LightGBM.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
86
Results of LightGBM.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
87
Feature selection process.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
88
Results of KNN.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
89
After upsampling.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
90
Results of Extra Trees.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
91
Gradient boosting classifier results.
Published 2024 “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
92
Data_Sheet_1_A Global Optimizer for Nanoclusters.PDF
Published 2019 “…This method is implemented in PyAR (https://github.com/anooplab/pyar) program. The global optimization in PyAR involves two parts, generation of several trial geometries and gradient-based local optimization of the trial geometries. …”
-
93
the functioning of BRPSO.
Published 2025 “…A sensitivity analysis of key RFD parameters, including frictional moment and rigid beam length, highlights their influence on seismic performance. The optimization problem is formulated based on the seismic energy dissipation concept, employing a modified binary and real-coded particle swarm optimization (BRPSO) algorithm. …”
-
94
Characteristic of 6- and 10-story SMRF [99,98].
Published 2025 “…A sensitivity analysis of key RFD parameters, including frictional moment and rigid beam length, highlights their influence on seismic performance. The optimization problem is formulated based on the seismic energy dissipation concept, employing a modified binary and real-coded particle swarm optimization (BRPSO) algorithm. …”
-
95
The RFD’s behavior mechanism (2002).
Published 2025 “…A sensitivity analysis of key RFD parameters, including frictional moment and rigid beam length, highlights their influence on seismic performance. The optimization problem is formulated based on the seismic energy dissipation concept, employing a modified binary and real-coded particle swarm optimization (BRPSO) algorithm. …”
-
96
Data_Sheet_1_Physics-Inspired Optimization for Quadratic Unconstrained Problems Using a Digital Annealer.pdf
Published 2019 “…The Fujitsu Digital Annealer is designed to solve fully connected quadratic unconstrained binary optimization (QUBO) problems. …”
-
97
hiPRS algorithm process flow.
Published 2023 “…From this dataset we can compute the MI between each interaction and the outcome and (D) obtain a ranked list (I_δ) based on this metric. …”
-
98
Summary of LITNET-2020 dataset.
Published 2023 “…In this paper, a novel, and improved version of the Long Short-Term Memory (ILSTM) algorithm was proposed. The ILSTM is based on the novel integration of the chaotic butterfly optimization algorithm (CBOA) and particle swarm optimization (PSO) to improve the accuracy of the LSTM algorithm. …”
-
99
SHAP analysis for LITNET-2020 dataset.
Published 2023 “…In this paper, a novel, and improved version of the Long Short-Term Memory (ILSTM) algorithm was proposed. The ILSTM is based on the novel integration of the chaotic butterfly optimization algorithm (CBOA) and particle swarm optimization (PSO) to improve the accuracy of the LSTM algorithm. …”
-
100
Comparison of intrusion detection systems.
Published 2023 “…In this paper, a novel, and improved version of the Long Short-Term Memory (ILSTM) algorithm was proposed. The ILSTM is based on the novel integration of the chaotic butterfly optimization algorithm (CBOA) and particle swarm optimization (PSO) to improve the accuracy of the LSTM algorithm. …”