Search alternatives:
feature optimization » resource optimization, feature elimination, structure optimization
codon optimization » wolf optimization
binary base » binary mask, ciliary base, binary image
a feature » _ feature, _ features, each feature
binary a » binary _, binary b, hilary a
-
81
QSAR model for predicting neuraminidase inhibitors of influenza A viruses (H1N1) based on adaptive grasshopper optimization algorithm
Published 2020: “…The binary grasshopper optimization algorithm (BGOA) is a new meta-heuristic optimization algorithm, which has been used successfully to perform feature selection. …”
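As context for how binary metaheuristics such as BGOA are typically used for feature selection, the following is a minimal sketch of a wrapper-style fitness over a 0/1 feature mask. The KNN wrapper, the 5-fold cross-validation, and the alpha weighting are illustrative assumptions, not details taken from the cited paper.

```python
# Minimal sketch (assumed setup, not the paper's exact objective):
# fitness of a 0/1 feature mask = weighted classification error plus a
# penalty on the fraction of features kept. Lower is better.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def mask_fitness(mask, X, y, alpha=0.99):
    """`mask` is a 0/1 vector over the columns of X."""
    if mask.sum() == 0:                       # an empty subset is invalid
        return 1.0
    X_sub = X[:, mask.astype(bool)]
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X_sub, y, cv=5).mean()
    ratio = mask.sum() / mask.size            # fraction of features retained
    return alpha * (1.0 - acc) + (1.0 - alpha) * ratio
```

A binary optimizer such as BGOA would then search over masks to minimize this value.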
-
82
Hyperparameters of the LSTM Model.
Published 2025: “…This effectively balances exploration and exploitation, and addresses the early convergence problem of the original algorithms. To choose the most crucial characteristics of the dataset, the feature selection method employs the binary format of AD-PSO-Guided WOA. …”
-
83
The AD-PSO-Guided WOA LSTM framework.
Published 2025: “…This effectively balances exploration and exploitation, and addresses the early convergence problem of the original algorithms. To choose the most crucial characteristics of the dataset, the feature selection method employs the binary format of AD-PSO-Guided WOA. …”
-
84
Prediction results of individual models.
Published 2025: “…This effectively balances exploration and exploitation, and addresses the early convergence problem of the original algorithms. To choose the most crucial characteristics of the dataset, the feature selection method employs the binary format of AD-PSO-Guided WOA. …”
-
85
Table_1_bSRWPSO-FKNN: A boosted PSO with fuzzy K-nearest neighbor classifier for predicting atopic dermatitis disease.docx
Published 2023: “…Methods: This paper establishes a medical prediction model for the first time on the basis of the enhanced particle swarm optimization (SRWPSO) algorithm and the fuzzy K-nearest neighbor (FKNN), called bSRWPSO-FKNN, which is practiced on a dataset related to patients with AD. …”
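For readers unfamiliar with the FKNN component, below is a minimal fuzzy K-nearest-neighbor sketch in the spirit of the classic Keller-style rule; the crisp 0/1 class memberships and the fuzziness exponent m = 2 are illustrative assumptions, not the settings of bSRWPSO-FKNN.

```python
# Minimal fuzzy KNN sketch (illustrative; not the bSRWPSO-FKNN configuration).
import numpy as np

def fknn_predict(X_train, y_train, x, k=5, m=2.0):
    """Predict the class of sample x via distance-weighted fuzzy memberships."""
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]                           # k nearest neighbors
    w = 1.0 / np.maximum(d[idx], 1e-12) ** (2.0 / (m - 1.0))
    classes = np.unique(y_train)
    # membership of x in class c = normalized, distance-weighted neighbor vote
    memberships = np.array([(w * (y_train[idx] == c)).sum() for c in classes]) / w.sum()
    return classes[np.argmax(memberships)]
```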
-
86
-
87
Generalized Tensor Decomposition With Features on Multiple Modes
Published 2021: “…Unlike unsupervised tensor decomposition, our supervised decomposition captures the effective dimension reduction of the data tensor confined to feature space of interest. An efficient alternating optimization algorithm with provable spectral initialization is further developed. …”
-
88
Statistical summary of all models.
Published 2025: “…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …”
-
89
Classification performance of ML and DL models.
Published 2025: “…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …”
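To illustrate what a unified "binary feature selection plus metaheuristic optimization" loop looks like in outline, here is a deliberately simplified bit-flip search over feature masks. It is a generic stand-in, not the AD-PSO-Guided WOA itself, and the `fitness_fn` argument can be any wrapper objective such as the sketch shown earlier.

```python
# Generic stand-in for a binary metaheuristic feature-selection loop
# (single-solution bit-flip search, NOT the AD-PSO-Guided WOA).
import numpy as np

def select_features(fitness_fn, n_features, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    best = rng.integers(0, 2, n_features)         # random initial 0/1 mask
    best_fit = fitness_fn(best)
    for _ in range(iters):
        cand = best.copy()
        cand[rng.integers(n_features)] ^= 1       # flip one random bit
        fit = fitness_fn(cand)
        if fit < best_fit:                        # keep improvements (lower = better)
            best, best_fit = cand, fit
    return best, best_fit
```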
-
90
-
91
Pseudo Code of RBMO.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …”
-
92
P-value on CEC-2017 (Dim = 30).
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …”
-
93
Memory storage behavior.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …”
-
94
Elite search behavior.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …”
-
95
Description of the datasets.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …”
-
96
S- and V-shaped transfer functions.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …”
-
97
S- and V-Type transfer function diagrams.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …”
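Since several of these entries turn on S- and V-shaped transfer functions, here is a short sketch of the standard forms used to map a continuous metaheuristic position to bits; these are the common textbook definitions (sigmoid and |tanh|), not necessarily the exact variants chosen in the IRBMO paper.

```python
# Common S- and V-shaped transfer functions for binarizing a continuous
# optimizer (textbook forms; the IRBMO paper's exact choices may differ).
import numpy as np

def s_shaped(x):
    return 1.0 / (1.0 + np.exp(-x))       # sigmoid: probability that a bit is 1

def v_shaped(x):
    return np.abs(np.tanh(x))             # |tanh|: probability of flipping a bit

def binarize(position, current_bits, rng, kind="S"):
    r = rng.random(position.shape)
    if kind == "S":
        return (r < s_shaped(position)).astype(int)
    # V-shaped rule: flip the current bit with probability v_shaped(position)
    return np.where(r < v_shaped(position), 1 - current_bits, current_bits)
```

S-shaped rules sample the bit directly from a probability, while V-shaped rules decide whether to flip the bit the solution already holds, which tends to preserve more of the current search state.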
-
98
Collaborative hunting behavior.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …”
-
99
Friedman average rank sum test results.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …”
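As a pointer to how a Friedman average-rank comparison like the one reported above is typically computed, the sketch below uses scipy; the score matrix is placeholder data for shape only, not results from the paper.

```python
# Friedman rank-sum comparison sketch (placeholder scores, not paper results).
import numpy as np
from scipy.stats import friedmanchisquare

# rows = benchmark functions/datasets, columns = competing algorithms
scores = np.array([
    [0.91, 0.88, 0.85],
    [0.87, 0.86, 0.80],
    [0.93, 0.90, 0.89],
    [0.89, 0.84, 0.83],
])
stat, p = friedmanchisquare(*(scores[:, j] for j in range(scores.shape[1])))
# average rank per algorithm (rank 1 = best score on a dataset)
ranks = np.mean([scores.shape[1] - np.argsort(np.argsort(row)) for row in scores], axis=0)
print(f"Friedman chi-square = {stat:.3f}, p = {p:.4f}, average ranks = {ranks}")
```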
-
100
IRBMO vs. variant comparison adaptation data.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …”