21. Proposed inverted residual parallel block.
    Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”

22. Inverted residual bottleneck block.
    Published 2025: (same article abstract as item 21).

23. Sample classes from the HMDB51 dataset.
    Published 2025: (same article abstract as item 21).

24. Sample classes from the UCF101 dataset [40].
    Published 2025: (same article abstract as item 21).

25. Self-attention module for feature learning.
    Published 2025: (same article abstract as item 21).

26. Residual behavior.
    Published 2025: (same article abstract as item 21).
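Items 21–26 come from a single 2025 article whose abstract says the model's hyperparameters were chosen with particle swarm optimization (PSO). As a rough illustration of what PSO does — not the article's actual implementation; the objective function, bounds, and swarm settings below are invented for the sketch — a minimal PSO over a two-dimensional "hyperparameter" box can be written as:

```python
import random

def pso(objective, bounds, n_particles=20, n_iters=50, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer minimizing `objective` over a box."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # per-particle best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best so far
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + cognitive pull + social pull.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move and clamp to the search box.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(0)  # deterministic demo run
# Toy stand-in for a validation-loss surface, minimized at (0.1, 0.9).
best, best_val = pso(lambda p: (p[0] - 0.1) ** 2 + (p[1] - 0.9) ** 2,
                     bounds=[(0.0, 1.0), (0.0, 1.0)])
```

In a real hyperparameter search the lambda would be replaced by a train-and-validate run, which is why PSO is attractive there: it needs only objective values, no gradients.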
27. Overall framework diagram.
    Published 2025: “…Secondly, addressing the issue of weight and threshold initialization in BPNN, the Coati Optimization Algorithm (COA) was employed to optimize the network (COA-BPNN). …”
31. Table_1_Optimal Reopening Pathways With COVID-19 Vaccine Rollout and Emerging Variants of Concern.pdf
    Published 2021: “…Our model framework and optimization strategies take into account the likely range of social contacts during different phases of a gradual reopening process and consider the uncertainties of these contact rates due to variations of individual behaviors and compliance. …”
32. GSE96058 information.
    Published 2024: “…Subsequently, feature selection was conducted using ANOVA and binary Particle Swarm Optimization (PSO). During the analysis phase, the discriminative power of the selected features was evaluated using machine learning classification algorithms. …”

33. The performance of classifiers.
    Published 2024: (same article abstract as item 32).
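Items 32–33 describe a two-stage selection pipeline: an ANOVA filter followed by binary PSO. The sketch below covers only the ANOVA filter stage, on made-up two-class data — the binary PSO stage and the GSE96058 dataset are not reproduced here. The F-statistic ranks each feature by how much of its variance lies between the classes rather than within them:

```python
def anova_f(class_a, class_b):
    """One-way ANOVA F-statistic for a single feature across two groups."""
    na, nb = len(class_a), len(class_b)
    ma, mb = sum(class_a) / na, sum(class_b) / nb
    grand = (sum(class_a) + sum(class_b)) / (na + nb)
    # Between-group vs. within-group sums of squares.
    ss_between = na * (ma - grand) ** 2 + nb * (mb - grand) ** 2
    ss_within = (sum((x - ma) ** 2 for x in class_a)
                 + sum((x - mb) ** 2 for x in class_b))
    df_between, df_within = 1, na + nb - 2
    return (ss_between / df_between) / (ss_within / df_within)

# Toy data: feature 0 separates the groups, feature 1 is pure noise.
group_a = [[1.0, 5.1], [1.2, 4.9], [0.9, 5.0]]
group_b = [[3.0, 5.0], [3.1, 5.2], [2.9, 4.8]]
scores = [anova_f([row[f] for row in group_a],
                  [row[f] for row in group_b]) for f in range(2)]
```

A filter like this typically keeps the top-scoring features, and a wrapper such as binary PSO then searches subsets of the survivors.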
34. Table2_Nonintrusive Load Monitoring Method Based on Color Encoding and Improved Twin Support Vector Machine.XLS
    Published 2022: “…Second, the two-dimension Gabor wavelet is used to extract the texture features of the image, and the dimension is reduced by means of local linear embedding (LLE). Finally, the artificial fish swarm algorithm (AFSA) is used to optimize the twin support vector machine (TWSVM), and the ITWSM is used to train the load recognition model, which greatly enhances the model training speed. …”

35. Table1_Nonintrusive Load Monitoring Method Based on Color Encoding and Improved Twin Support Vector Machine.XLS
    Published 2022: (same article abstract as item 34).
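Items 34–35 mention extracting texture features with a two-dimensional Gabor wavelet. A textbook real-valued Gabor kernel — a Gaussian envelope modulating a cosine carrier; the parameter values here are illustrative, not taken from the article — can be generated as:

```python
import math

def gabor_kernel(size, sigma, theta, lambd, gamma=0.5, psi=0.0):
    """Real-valued 2-D Gabor kernel of shape size x size (size odd)."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # Rotate coordinates by the filter orientation theta.
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            # Gaussian envelope (gamma sets the aspect ratio) times
            # a cosine carrier with wavelength lambd and phase psi.
            envelope = math.exp(-(xr ** 2 + (gamma * yr) ** 2)
                                / (2 * sigma ** 2))
            carrier = math.cos(2 * math.pi * xr / lambd + psi)
            row.append(envelope * carrier)
        kernel.append(row)
    return kernel

# One filter tuned to horizontal texture at wavelength 4 pixels.
k = gabor_kernel(size=7, sigma=2.0, theta=0.0, lambd=4.0)
```

Convolving an image with a bank of such kernels at several orientations and wavelengths yields the kind of texture-feature maps the abstract refers to.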