Search alternatives:
learning optimization » learning motivation, lead optimization
codon optimization » wolf optimization
phase learning » based learning, face learning, aware learning
binary phase » binary image, final phase
binary base » binary mask, ciliary base, binary image
1. MSE for ILSTM algorithm in binary classification.
Published 2023: “…The ILSTM was then used to build an efficient intrusion detection system for binary and multi-class classification cases. The proposed algorithm has two phases: phase one involves training a conventional LSTM network to get initial weights, and phase two involves using the hybrid swarm algorithms, CBOA and PSO, to optimize the weights of LSTM to improve the accuracy. …”
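The two-phase scheme quoted above (conventional LSTM training for initial weights, then swarm-based refinement) can be sketched roughly as follows. This is a minimal NumPy illustration assuming a plain PSO loop over a pre-trained weight vector with an MSE-style loss; the paper's actual CBOA+PSO hybrid, its hyperparameters, and its LSTM architecture are not reproduced here.

```python
# Minimal sketch of the two-phase idea described above.
# Assumptions: a generic PSO refinement of an already-trained weight vector;
# the quoted paper's CBOA+PSO hybrid and settings are NOT reproduced here.
import numpy as np

def pso_refine(loss, w_init, n_particles=20, iters=100,
               inertia=0.7, c_cog=1.5, c_soc=1.5, spread=0.1, seed=0):
    """Phase two: refine pre-trained weights w_init by minimizing loss(w)."""
    rng = np.random.default_rng(seed)
    dim = w_init.size
    # Initialize the swarm around the phase-one weights.
    pos = w_init + spread * rng.standard_normal((n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([loss(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Standard PSO velocity update: inertia + cognitive + social terms.
        vel = (inertia * vel
               + c_cog * r1 * (pbest - pos)
               + c_soc * r2 * (gbest - pos))
        pos += vel
        vals = np.array([loss(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

# Toy usage: "phase one" weights would come from conventional LSTM training;
# here the loss is a stand-in MSE, matching the metric named in the caption.
w_phase1 = np.zeros(8)                       # placeholder for trained weights
target = np.linspace(-1, 1, 8)               # hypothetical optimum
loss = lambda w: np.mean((w - target) ** 2)
w_refined = pso_refine(loss, w_phase1)
print(round(loss(w_phase1), 4), round(loss(w_refined), 4))
```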
15. Classification performance after optimization.
Published 2025: “…We applied this hybrid strategy to a Radial Basis Function Network (RBFN), and validated its performance improvements through extensive experiments, including ANOVA and Wilcoxon tests for both feature selection and optimization phases. The optimized model achieved a classification accuracy of 99.46%, significantly outperforming classical machine learning and unoptimized deep learning models. …”
16. ANOVA test for optimization results.
17. Wilcoxon test results for optimization.
18. Wilcoxon test results for feature selection.
19. Feature selection metrics and their definitions.
20. Statistical summary of all models.
(Results 16–20 are drawn from the same 2025 publication as result 15 and share the excerpt quoted there.)
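The validation named in results 16–18 (ANOVA and Wilcoxon tests over the feature-selection and optimization phases) can be illustrated with a short sketch. The per-fold accuracy scores below are invented purely for illustration; the publication's actual data, fold counts, and test configuration are not reproduced.

```python
# Minimal sketch of the statistical validation named in results 16-18.
# Assumptions: hypothetical per-fold accuracies for three models, generated
# here only for illustration; not the publication's data.
import numpy as np
from scipy.stats import f_oneway, wilcoxon

rng = np.random.default_rng(1)
folds = 10  # e.g. 10-fold cross-validation accuracies per model
acc_baseline  = rng.normal(0.95, 0.01, folds)    # hypothetical classical model
acc_unopt_rbf = rng.normal(0.97, 0.01, folds)    # hypothetical unoptimized RBFN
acc_opt_rbf   = rng.normal(0.994, 0.003, folds)  # hypothetical optimized RBFN

# One-way ANOVA across all three models ("ANOVA test for optimization results").
f_stat, p_anova = f_oneway(acc_baseline, acc_unopt_rbf, acc_opt_rbf)

# Paired Wilcoxon signed-rank test on the same folds: optimized vs. unoptimized
# RBFN ("Wilcoxon test results for optimization").
w_stat, p_wilcoxon = wilcoxon(acc_opt_rbf, acc_unopt_rbf)

print(f"ANOVA: F={f_stat:.2f}, p={p_anova:.2e}")
print(f"Wilcoxon: W={w_stat:.1f}, p={p_wilcoxon:.2e}")
```

The same pattern applies to the feature-selection phase: run the paired test on per-fold scores obtained with and without the selected feature subset.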