Search alternatives:
process optimization » model optimization
based optimization » whale optimization
phase process » phase proteins, whole process, phase protein
binary arts » binary pairs
arts based » areas based
-
22
Comparison with existing SOTA techniques.
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
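The abstract above selects hyperparameters with particle swarm optimization before training. As a rough illustration of how such a search works (the objective function, bounds, and parameter names below are hypothetical stand-ins, not taken from the paper):

```python
import numpy as np

def pso_minimize(objective, bounds, n_particles=20, n_iters=50,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `objective` over the box `bounds` with a basic PSO loop."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = lo.size
    x = rng.uniform(lo, hi, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                          # particle velocities
    pbest = x.copy()                              # per-particle best position
    pbest_f = np.array([objective(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()            # global best position
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia + cognitive pull + social pull.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                # keep particles in bounds
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Toy stand-in for "validation loss as a function of hyperparameters":
# minimum at learning_rate = 0.1, hidden_units = 64 (both invented).
def val_loss(p):
    lr, hidden = p
    return (np.log10(lr) + 1.0) ** 2 + ((hidden - 64.0) / 64.0) ** 2

best, best_loss = pso_minimize(val_loss, bounds=[(1e-4, 1.0), (8, 256)])
```

In a real pipeline, `val_loss` would train and evaluate the network for each candidate setting, which is why PSO runs are typically budgeted to a few dozen particles and iterations.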
-
23
Proposed inverted residual parallel block.
Published 2025 (same publication and snippet as item 22).
-
24
Inverted residual bottleneck block.
Published 2025 (same publication and snippet as item 22).
-
25
Sample classes from the HMDB51 dataset.
Published 2025 (same publication and snippet as item 22).
-
26
Sample classes from UCF101 dataset [40].
Published 2025 (same publication and snippet as item 22).
-
27
Self-attention module for feature learning.
Published 2025 (same publication and snippet as item 22).
-
28
Residual behavior.
Published 2025 (same publication and snippet as item 22).
-
29
Overall framework diagram.
Published 2025: “…Secondly, addressing the issue of weight and threshold initialization in BPNN, the Coati Optimization Algorithm (COA) was employed to optimize the network (COA-BPNN). …”
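The snippet describes using a metaheuristic (COA) to find good initial weights and thresholds for a back-propagation neural network. The coati update rules themselves are not given here, so the sketch below substitutes a plain (1+λ) evolution strategy for COA; the toy data, network width, and step sizes are all invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: predict whether the signs of two inputs agree (XOR of signs).
X = rng.uniform(-1, 1, (128, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)

def unpack(w):
    """Flat 33-vector -> weights of a 2-8-1 network."""
    W1, b1 = w[:16].reshape(2, 8), w[16:24]
    W2, b2 = w[24:32], w[32]
    return W1, b1, W2, b2

def loss(w):
    """Mean squared error of the tiny BPNN with weight vector w."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return np.mean((out - y) ** 2)

# Population search over the weight vector -- a simple evolution-strategy
# stand-in for COA (gradient-based fine-tuning is omitted).
best = rng.normal(0, 0.5, 33)
for _ in range(300):
    cand = best + rng.normal(0, 0.2, (8, 33))   # 8 perturbed candidates
    f = np.array([loss(c) for c in cand])
    if f.min() < loss(best):
        best = cand[f.argmin()]
```

The point of the hybrid scheme in the abstract is that the population search supplies a starting point from which ordinary back-propagation is less likely to stall in a poor local minimum.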
-
30
Analysis and design of algorithms for the manufacturing process of integrated circuits
Published 2023: “…The (approximate) solution proposals of state-of-the-art methods include rule-based approaches, genetic algorithms, and reinforcement learning. …”
-
34
Table_1_Optimal Reopening Pathways With COVID-19 Vaccine Rollout and Emerging Variants of Concern.pdf
Published 2021: “…Our model framework and optimization strategies take into account the likely range of social contacts during different phases of a gradual reopening process and consider the uncertainties of these contact rates due to variations of individual behaviors and compliance. …”
-
35
datasheet1_Graph Neural Networks for Maximum Constraint Satisfaction.pdf
Published 2021: “…Despite being generic, we show that our approach matches or surpasses most greedy and semi-definite programming based algorithms and sometimes even outperforms state-of-the-art heuristics for the specific problems. …”
-
36
GSE96058 information.
Published 2024: “…Subsequently, feature selection was conducted using ANOVA and binary Particle Swarm Optimization (PSO). During the analysis phase, the discriminative power of the selected features was evaluated using machine learning classification algorithms. …”
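This abstract pairs an ANOVA pre-filter with binary PSO for feature selection. The binary-PSO half can be sketched as follows; the synthetic data, least-squares fitness, and penalty weight are invented for illustration, and the ANOVA pre-filter is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: only columns 0 and 3 actually drive the target.
X = rng.normal(size=(200, 8))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.normal(size=200)

def score(mask):
    """Fitness (lower is better): normalized residual of a least-squares
    fit on the selected columns, plus a small per-feature penalty."""
    if not mask.any():
        return np.inf
    Xs = X[:, mask.astype(bool)]
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    resid = y - Xs @ coef
    return resid.var() / y.var() + 0.01 * mask.sum()

def binary_pso(score, dim, n_particles=15, n_iters=40, w=0.7, c1=1.5, c2=1.5):
    """Binary PSO: velocities are real, positions are resampled bits
    through a sigmoid transfer function."""
    x = (rng.random((n_particles, dim)) < 0.5).astype(float)
    v = np.zeros((n_particles, dim))
    pbest = x.copy()
    pbest_f = np.array([score(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        prob = 1.0 / (1.0 + np.exp(-v))           # sigmoid transfer
        x = (rng.random((n_particles, dim)) < prob).astype(float)
        f = np.array([score(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return g

mask = binary_pso(score, dim=8)  # columns 0 and 3 should be selected
```

The per-feature penalty plays the role the ANOVA stage plays in the abstract: without some pressure toward sparsity, the search has little reason to drop uninformative columns.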
-
37
The performance of classifiers.
Published 2024 (same publication and snippet as item 36).
-
38
Table2_Nonintrusive Load Monitoring Method Based on Color Encoding and Improved Twin Support Vector Machine.XLS
Published 2022: “…Second, the two-dimension Gabor wavelet is used to extract the texture features of the image, and the dimension is reduced by means of local linear embedding (LLE). Finally, the artificial fish swarm algorithm (AFSA) is used to optimize the twin support vector machine (TWSVM), and the ITWSM is used to train the load recognition model, which greatly enhances the model training speed. …”
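The Gabor-filter stage of the pipeline described above can be sketched with NumPy alone; the kernel size, wavelength, and test image are invented, and the LLE and TWSVM stages are omitted:

```python
import numpy as np

def gabor_kernel(size=31, wavelength=8.0, theta=0.0, sigma=4.0, gamma=0.5):
    """Real part of a 2-D Gabor filter: a cosine carrier under an
    elongated Gaussian envelope, oriented at angle `theta`."""
    half = size // 2
    yy, xx = np.mgrid[-half:half + 1, -half:half + 1]
    xr = xx * np.cos(theta) + yy * np.sin(theta)
    yr = -xx * np.sin(theta) + yy * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr) ** 2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

def gabor_features(image, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Texture descriptor: mean absolute filter response per orientation
    (FFT-based circular convolution keeps this dependency-free)."""
    feats = []
    for theta in thetas:
        k = gabor_kernel(theta=theta)
        pad = np.zeros_like(image)
        pad[:k.shape[0], :k.shape[1]] = k
        response = np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(pad)).real
        feats.append(np.abs(response).mean())
    return np.array(feats)

# Test image: horizontal stripes (intensity varies along y), so the
# theta = pi/2 filter should respond most strongly.
img = np.tile(np.sin(np.linspace(0, 8 * np.pi, 64)), (64, 1)).T
f = gabor_features(img)
```

In the paper's setting, a bank of such responses at several orientations and wavelengths would form the raw texture vector that LLE then compresses.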
-
39
Table1_Nonintrusive Load Monitoring Method Based on Color Encoding and Improved Twin Support Vector Machine.XLS
Published 2022 (same publication and snippet as item 38).
-
40