1.
2.
3. Proposed architecture testing phase.
   Published in 2025: "…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …"
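The excerpt describes choosing hyperparameters with particle swarm optimization. As a minimal sketch of that idea (a generic global-best PSO, not the paper's implementation), the loop below minimizes a stand-in validation-loss function over a box of hyperparameter ranges; the objective, the (learning-rate, dropout) bounds, and all swarm settings are illustrative assumptions.

```python
import random

def pso_minimize(objective, bounds, n_particles=12, n_iters=40,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `objective` over the box `bounds` with a basic global-best PSO."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # per-particle best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # swarm-wide best
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive pull (own best) + social pull (swarm best)
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # move and clamp back into the search box
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for validation loss over (learning_rate, dropout);
# the optimum at lr=0.01, dropout=0.3 is a hypothetical value for illustration.
loss = lambda p: (p[0] - 0.01) ** 2 + (p[1] - 0.3) ** 2
best, best_val = pso_minimize(loss, [(1e-4, 0.1), (0.0, 0.9)])
```

In practice the objective would be a full train-and-validate run per particle, which is why PSO is typically given a small swarm and few iterations for hyperparameter search.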
4. Classification performance after optimization.
   Published in 2025: "…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …"
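The excerpt combines binary feature selection with a metaheuristic search. Since the excerpt does not name the specific metaheuristic, the sketch below assumes a simple binary genetic algorithm: each candidate is a 0/1 mask over features, and the fitness trades a nearest-centroid classification error against the fraction of features kept. The synthetic dataset, the centroid classifier, and the weighting `alpha` are all illustrative assumptions.

```python
import random

def make_toy_data(n=60, n_features=8, seed=1):
    """Synthetic two-class data: only features 0 and 1 are informative,
    the rest are noise (hypothetical setup for illustration)."""
    rng = random.Random(seed)
    X, y = [], []
    for i in range(n):
        label = i % 2
        row = [rng.gauss(2.0 * label, 0.5) if j < 2 else rng.gauss(0.0, 1.0)
               for j in range(n_features)]
        X.append(row); y.append(label)
    return X, y

def centroid_error(X, y, mask):
    """Nearest-centroid error rate using only features where mask[j] == 1."""
    idx = [j for j, b in enumerate(mask) if b]
    if not idx:
        return 1.0  # selecting no features is the worst possible candidate
    cents = {}
    for lab in (0, 1):
        rows = [X[i] for i in range(len(X)) if y[i] == lab]
        cents[lab] = [sum(r[j] for r in rows) / len(rows) for j in idx]
    errs = 0
    for xi, yi in zip(X, y):
        d = {lab: sum((xi[j] - c[k]) ** 2 for k, j in enumerate(idx))
             for lab, c in cents.items()}
        errs += min(d, key=d.get) != yi
    return errs / len(X)

def fitness(X, y, mask, alpha=0.9):
    # Weighted sum: classification error vs. fraction of features kept.
    return alpha * centroid_error(X, y, mask) + (1 - alpha) * sum(mask) / len(mask)

def binary_ga(X, y, pop=20, gens=30, pmut=0.1, seed=2):
    """Binary GA: truncation selection, one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    d = len(X[0])
    popn = [[rng.randint(0, 1) for _ in range(d)] for _ in range(pop)]
    best = min(popn, key=lambda m: fitness(X, y, m))
    for _ in range(gens):
        scored = sorted(popn, key=lambda m: fitness(X, y, m))
        best = min(best, scored[0], key=lambda m: fitness(X, y, m))
        parents = scored[:pop // 2]
        children = []
        while len(children) < pop:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, d)
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < pmut) for bit in child]  # mutate bits
            children.append(child)
        popn = children
    return best

X, y = make_toy_data()
best_mask = binary_ga(X, y)
```

The sparsity penalty is what balances exploitation (keeping accurate masks) against exploration (trying smaller ones); the excerpt's approach presumably tunes this trade-off differently.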
5. ANOVA test for optimization results.
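The ANOVA entries compare result groups (e.g., scores produced by different optimizers) for a significant difference in means. As a dependency-free sketch, the function below computes the one-way ANOVA F statistic; the three groups of accuracies fed to it are hypothetical.

```python
def one_way_anova_f(*groups):
    """One-way ANOVA F statistic: ratio of between-group to within-group
    mean squares. The p-value would need the F distribution's CDF, which
    is omitted here to stay dependency-free."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical accuracies from three optimizers (illustrative numbers only):
f_stat = one_way_anova_f([0.91, 0.92, 0.90],
                         [0.95, 0.96, 0.94],
                         [0.88, 0.87, 0.89])
```

A large F indicates the between-group spread dwarfs the within-group noise; in practice one would take the p-value from an F table or a statistics library.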
6. Wilcoxon test results for optimization.
7. Wilcoxon test results for feature selection.
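The Wilcoxon entries refer to the paired, non-parametric signed-rank test commonly used to compare two methods run on the same folds or datasets. The sketch below implements it with the large-sample normal approximation (no tie correction on the variance); the per-fold accuracies at the bottom are hypothetical.

```python
import math

def wilcoxon_signed_rank(a, b):
    """Wilcoxon signed-rank test via the normal approximation (no tie
    correction). Returns (W, two-sided p-value); zero differences are
    dropped, and at least one nonzero pair is assumed."""
    diffs = [x - y for x, y in zip(a, b) if x != y]
    n = len(diffs)
    ranked = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:  # assign average ranks across ties in |diff|
        j = i
        while j + 1 < n and abs(diffs[ranked[j + 1]]) == abs(diffs[ranked[i]]):
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[ranked[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    w = min(w_plus, w_minus)
    mu = n * (n + 1) / 4                              # mean of W under H0
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)  # sd of W under H0
    z = (w - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))              # two-sided p-value
    return w, p

# Hypothetical per-fold accuracies for two models compared pairwise:
acc_a = [0.91, 0.93, 0.92, 0.95, 0.90, 0.94, 0.92, 0.93, 0.91, 0.96]
acc_b = [a - 0.02 for a in acc_a]   # model B uniformly 2 points worse
w, p = wilcoxon_signed_rank(acc_a, acc_b)
```

Because every fold favors model A here, W collapses to 0 and the p-value falls well below 0.05; for small samples (roughly n < 20) the exact null distribution is preferred over this approximation.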
8. Feature selection metrics and their definitions.
9. Statistical summary of all models.
10. Feature selection results.
11. ANOVA test for feature selection.
12. Classification performance of ML and DL models.
-
13
-
14
-
15
Comparison with existing SOTA techniques.
منشور في 2025"…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …"
-
16
Proposed inverted residual parallel block.
منشور في 2025"…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …"
-
17
Inverted residual bottleneck block.
منشور في 2025"…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …"
-
18
Sample classes from the HMDB51 dataset.
منشور في 2025"…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …"
-
19
Sample classes from UCF101 dataset [40].
منشور في 2025"…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …"
-
20
Self-attention module for the features learning.
منشور في 2025"…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …"