1. Particle swarm optimization algorithm flowchart.
Published 2025: “…And finally, an improved PSO-IFA hybrid optimization algorithm (PSO-IFAH) was proposed in the paper. …”
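The first result's caption points at a standard PSO flowchart; the loop such a flowchart depicts can be sketched as follows (a generic textbook PSO, not the paper's PSO-IFAH hybrid; all names and parameter values here are illustrative):

```python
import random

def pso(objective, dim, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimise `objective` over `dim` dimensions with standard PSO."""
    lo, hi = bounds
    # Initialise positions uniformly at random, velocities at zero.
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # personal bests
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Inertia + cognitive pull (pbest) + social pull (gbest).
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Example: minimise the 2-D sphere function, whose optimum is 0 at the origin.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=2)
```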
2. Firefly optimization algorithm flowchart.
Published 2025: “…And finally, an improved PSO-IFA hybrid optimization algorithm (PSO-IFAH) was proposed in the paper. …”
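The firefly flowchart in the second result follows the usual attraction-based update: each firefly moves toward every brighter one, with attractiveness decaying in distance plus a small random jitter. A generic sketch in Yang's textbook formulation (the beta0/gamma/alpha values are illustrative, and this is not the paper's improved IFA variant):

```python
import math
import random

def firefly(objective, dim, n=15, iters=100, beta0=1.0, gamma=0.01,
            alpha=0.2, bounds=(-5.0, 5.0)):
    """Minimise `objective`; lower objective value = brighter firefly."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    light = [objective(p) for p in pos]

    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:  # j is brighter: move i toward j
                    r2 = sum((a - b) ** 2 for a, b in zip(pos[i], pos[j]))
                    beta = beta0 * math.exp(-gamma * r2)  # distance-decayed pull
                    pos[i] = [min(hi, max(lo,
                                  a + beta * (b - a)
                                  + alpha * (random.random() - 0.5)))
                              for a, b in zip(pos[i], pos[j])]
                    light[i] = objective(pos[i])
    b = min(range(n), key=lambda i: light[i])
    return pos[b], light[b]

# Example: minimise the 2-D sphere function.
best, best_val = firefly(lambda x: sum(v * v for v in x), dim=2)
```

Note that gamma must be scaled to the search domain: with gamma too large, exp(-gamma * r2) vanishes for distant pairs and the swarm degenerates into a random walk.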
3. The details of the test algorithm.
Published 2023: “…A deep memory bare-bones particle swarm optimization algorithm (DMBBPSO) for single-objective optimization problems is proposed in this paper. …”
4. The primeval multi-channel map of the TEM method.
Published 2025: “…And finally, an improved PSO-IFA hybrid optimization algorithm (PSO-IFAH) was proposed in the paper. …”
5. S1 Data
Published 2025: “…And finally, an improved PSO-IFA hybrid optimization algorithm (PSO-IFAH) was proposed in the paper. …”
6. The image of the Ackley function.
Published 2025: “…And finally, an improved PSO-IFA hybrid optimization algorithm (PSO-IFAH) was proposed in the paper. …”
7. The details of the control group.
Published 2023: “…A deep memory bare-bones particle swarm optimization algorithm (DMBBPSO) for single-objective optimization problems is proposed in this paper. …”
8. The flowchart of DMBBPSO.
Published 2023: “…A deep memory bare-bones particle swarm optimization algorithm (DMBBPSO) for single-objective optimization problems is proposed in this paper. …”
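Bare-bones PSO, the base of the DMBBPSO in results 3, 7, and 8, drops velocity updates entirely: each new coordinate is sampled from a Gaussian centred between the particle's personal best and the global best. A minimal sketch of the plain bare-bones variant (the paper's deep-memory extension is not reproduced here):

```python
import random

def bare_bones_pso(objective, dim, n_particles=20, iters=100,
                   bounds=(-5.0, 5.0)):
    """Minimise `objective` with velocity-free, Gaussian-sampling PSO."""
    lo, hi = bounds
    pbest = [[random.uniform(lo, hi) for _ in range(dim)]
             for _ in range(n_particles)]
    pbest_val = [objective(p) for p in pbest]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(iters):
        for i in range(n_particles):
            # Mean = pbest/gbest midpoint; std = their per-dimension gap
            # (a tiny epsilon keeps random.gauss well-behaved at zero gap).
            cand = [min(hi, max(lo, random.gauss(
                        (pbest[i][d] + gbest[d]) / 2,
                        abs(pbest[i][d] - gbest[d]) + 1e-12)))
                    for d in range(dim)]
            val = objective(cand)
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = cand, val
                if val < gbest_val:
                    gbest, gbest_val = cand[:], val
    return gbest, gbest_val

# Example: minimise the 2-D sphere function.
best, best_val = bare_bones_pso(lambda x: sum(v * v for v in x), dim=2)
```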
9.
10. Comparison with existing SOTA techniques.
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
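Results 10–17 come from a paper that tunes network hyperparameters with PSO. One common way to set that up is to let each particle live in the unit cube and decode its position into concrete hyperparameters; below is a hypothetical decoding plus a single PSO update step (names, ranges, and coefficients are illustrative, not the paper's):

```python
import random

# Hypothetical decoding of a PSO particle (a point in the unit cube)
# into neural-network hyperparameters.
def decode(position):
    lr_exp, width, depth = position
    return {
        "learning_rate": 10.0 ** (-4 + 3 * lr_exp),    # log scale, 1e-4 .. 1e-1
        "hidden_units": int(round(16 + 240 * width)),  # 16 .. 256
        "num_layers": int(round(1 + 4 * depth)),       # 1 .. 5
    }

def clip01(x):
    return min(1.0, max(0.0, x))

# One standard PSO velocity/position update for a single particle; the
# particle is clipped back into the unit cube so decode() stays valid.
def step(pos, vel, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    new_vel = [w * v
               + c1 * random.random() * (pb - p)
               + c2 * random.random() * (gb - p)
               for p, v, pb, gb in zip(pos, vel, pbest, gbest)]
    new_pos = [clip01(p + v) for p, v in zip(pos, new_vel)]
    return new_pos, new_vel

# Decode the cube centre: mid-range values for every hyperparameter.
params = decode([0.5, 0.5, 0.5])
new_pos, new_vel = step([0.2, 0.8, 0.5], [0.0, 0.0, 0.0],
                        [0.3, 0.7, 0.5], [0.9, 0.1, 0.5])
```

The fitness of a particle would then be the validation accuracy of a model trained with `decode(position)`, which is what makes the wrapper expensive: every fitness evaluation is a full training run.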
11. Proposed inverted residual parallel block.
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
12. Inverted residual bottleneck block.
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
13. Proposed architecture testing phase.
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
14. Sample classes from the HMDB51 dataset.
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
15. Sample classes from the UCF101 dataset [40].
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
16. Self-attention module for feature learning.
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
17. Residual behavior.
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
18. hiPRS algorithm process flow.
Published 2023: “…The sequences can include from a single SNP-allele pair up to a maximum number of pairs defined by the user (l_max). (C) The whole training data is then scanned, searching for these sequences and deriving a re-encoded dataset where interaction terms are binary features (i.e., 1 if sequence i is observed in j-th patient genotype, 0 otherwise). …”
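The hiPRS snippet describes re-encoding candidate SNP-allele sequences as binary interaction features. A toy sketch of just that step (data and names are illustrative; mining the candidate sequences and enforcing the l_max cap are assumed to have happened already):

```python
def reencode(genotypes, sequences):
    """Re-encode patients against candidate interaction sequences.

    genotypes: one set of (SNP, allele) pairs per patient.
    sequences: tuples of (SNP, allele) pairs (each up to l_max pairs long).
    Returns a 0/1 matrix: rows = patients, columns = sequences, where a
    cell is 1 iff every pair in that sequence appears in that genotype.
    """
    return [[1 if all(pair in geno for pair in seq) else 0
             for seq in sequences]
            for geno in genotypes]

# Toy data: two patients, two candidate sequences.
patients = [
    {("rs1", "A"), ("rs2", "G")},   # carries both pairs of the first sequence
    {("rs1", "A")},                 # carries only the single-pair sequence
]
seqs = [(("rs1", "A"), ("rs2", "G")),  # a two-pair interaction
        (("rs1", "A"),)]               # a single SNP-allele pair
matrix = reencode(patients, seqs)      # → [[1, 1], [0, 1]]
```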
19. Image_2_A two-stage hybrid gene selection algorithm combined with machine learning models to predict the rupture status in intracranial aneurysms.TIF
Published 2022: “…First, we used the Fast Correlation-Based Filter (FCBF) algorithm to filter a large number of irrelevant and redundant genes in the raw dataset, and then used the wrapper feature selection method based on the Multi-layer Perceptron (MLP) neural network and the Particle Swarm Optimization (PSO); accuracy (ACC) and mean square error (MSE) were then used as the evaluation criteria. …”
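The two-stage pipeline in results 19–20 pairs a cheap filter with an expensive model-based wrapper. A toy stand-in (absolute Pearson correlation replaces FCBF's symmetrical-uncertainty criterion, and exhaustive subset search replaces the PSO-driven MLP wrapper; all data is illustrative):

```python
from itertools import combinations

def corr(xs, ys):
    """Pearson correlation; 0.0 when either input is constant."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def filter_stage(X, y, k):
    """Stage 1: keep the k columns of X most |corr|-related to the label y."""
    cols = range(len(X[0]))
    scores = {j: abs(corr([row[j] for row in X], y)) for j in cols}
    return sorted(sorted(cols, key=lambda j: -scores[j])[:k])

def wrapper_stage(X, y, candidates, score):
    """Stage 2: return the candidate subset with the highest wrapper score."""
    best = max((subset for r in range(1, len(candidates) + 1)
                for subset in combinations(candidates, r)),
               key=lambda s: score(s, X, y))
    return list(best)

# Toy data: feature 0 equals the label, feature 2 is its mirror, feature 1 is noise.
X = [[0, 5, 1],
     [1, 3, 0],
     [0, 9, 1],
     [1, 1, 0]]
y = [0, 1, 0, 1]
kept = filter_stage(X, y, 2)  # the filter keeps features 0 and 2
# Toy wrapper score: |corr| of the subset's row-sums with the label
# (a real wrapper would train a model, e.g. an MLP, per subset).
score = lambda s, X, y: abs(corr([sum(row[j] for j in s) for row in X], y))
selected = wrapper_stage(X, y, kept, score)
```

The design point the paper's two stages share with this sketch: the filter is O(features) and prunes the search space so the wrapper, whose cost grows with the number of subsets it must evaluate, stays tractable.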
20. Image_1_A two-stage hybrid gene selection algorithm combined with machine learning models to predict the rupture status in intracranial aneurysms.TIF
Published 2022: “…First, we used the Fast Correlation-Based Filter (FCBF) algorithm to filter a large number of irrelevant and redundant genes in the raw dataset, and then used the wrapper feature selection method based on the Multi-layer Perceptron (MLP) neural network and the Particle Swarm Optimization (PSO); accuracy (ACC) and mean square error (MSE) were then used as the evaluation criteria. …”
Published 2022“…First, we used the Fast Correlation-Based Filter (FCBF) algorithm to filter a large number of irrelevant and redundant genes in the raw dataset, and then used the wrapper feature selection method based on the he Multi-layer Perceptron (MLP) neural network and the Particle Swarm Optimization (PSO), accuracy (ACC) and mean square error (MSE) were then used as the evaluation criteria. …”