Showing 1 - 20 results of 22 for search '(( binary based whole optimization algorithm ) OR ( final layer swarm optimization algorithm ))', query time: 0.43s
  1.

    Particle swarm optimization algorithm flowchart. by Zhengyu Xu (8550660)

    Published 2025
    “…And finally, an improved PSO-IFA hybrid optimization algorithm (PSO-IFAH) was proposed in the paper. …”
  2.

    Firefly optimization algorithm flowchart. by Zhengyu Xu (8550660)

    Published 2025
    “…And finally, an improved PSO-IFA hybrid optimization algorithm (PSO-IFAH) was proposed in the paper. …”
  3.

    The details of the test algorithm. by Yule Sun (16015342)

    Published 2023
    “…A deep memory bare-bones particle swarm optimization algorithm (DMBBPSO) for single-objective optimization problems is proposed in this paper. …”
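The snippet only names the deep-memory variant (DMBBPSO); its memory mechanism is not described here. The underlying bare-bones update it builds on, Kennedy's classic bare-bones PSO, replaces velocities with Gaussian sampling between each particle's personal best and the global best. A minimal sketch of that classic update on a toy sphere objective — the objective, bounds, and swarm settings are illustrative assumptions, not the paper's:

```python
import numpy as np

def bare_bones_pso(f, dim, n_particles=30, iters=200, seed=0):
    """Classic bare-bones PSO (Kennedy, 2003): each particle's next
    position is drawn from a Gaussian centred midway between its
    personal best and the swarm's global best, with spread equal to
    their separation. No velocities or inertia weights are needed."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5.0, 5.0, (n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.apply_along_axis(f, 1, pos)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        mu = (pbest + gbest) / 2.0        # Gaussian mean per coordinate
        sigma = np.abs(pbest - gbest)     # Gaussian spread per coordinate
        pos = rng.normal(mu, sigma)       # sample new positions directly
        vals = np.apply_along_axis(f, 1, pos)
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Toy run on the sphere function (a standard single-objective benchmark).
best_x, best_f = bare_bones_pso(lambda x: np.sum(x**2), dim=5)
```

Because the sampling spread shrinks as personal bests cluster around the global best, the swarm self-anneals toward convergence without tuned velocity coefficients.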
  4.

    The primeval multi-channel map of the TEM method. by Zhengyu Xu (8550660)

    Published 2025
    “…And finally, an improved PSO-IFA hybrid optimization algorithm (PSO-IFAH) was proposed in the paper. …”
  5.

    S1 Data. by Zhengyu Xu (8550660)

    Published 2025
    “…And finally, an improved PSO-IFA hybrid optimization algorithm (PSO-IFAH) was proposed in the paper. …”
  6.

    The image of the Ackley function. by Zhengyu Xu (8550660)

    Published 2025
    “…And finally, an improved PSO-IFA hybrid optimization algorithm (PSO-IFAH) was proposed in the paper. …”
  7.

    The details of the control group. by Yule Sun (16015342)

    Published 2023
    “…A deep memory bare-bones particle swarm optimization algorithm (DMBBPSO) for single-objective optimization problems is proposed in this paper. …”
  8.

    The flowchart of DMBBPSO. by Yule Sun (16015342)

    Published 2023
    “…A deep memory bare-bones particle swarm optimization algorithm (DMBBPSO) for single-objective optimization problems is proposed in this paper. …”
  9.
  10.

    Comparison with existing SOTA techniques. by Yasir Khan Jadoon (21433231)

    Published 2025
    “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
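The pipeline this snippet describes tunes hyperparameters with PSO before training. A self-contained sketch of standard (velocity-based) PSO over a continuous hyperparameter box follows; the parameter names (`lr`, `dropout`), bounds, coefficients, and the stand-in objective are illustrative assumptions, not the paper's actual search space or validation loss:

```python
import random

def pso_hyperparam_search(objective, bounds, n_particles=10, iters=20, seed=1):
    """Standard PSO minimising `objective` over a box of hyperparameters.
    `bounds` maps each hyperparameter name to a (low, high) interval."""
    random.seed(seed)
    names = list(bounds)
    lo = [bounds[n][0] for n in names]
    hi = [bounds[n][1] for n in names]
    dim = len(names)
    pos = [[random.uniform(lo[d], hi[d]) for d in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(dict(zip(names, p))) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5   # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                # clamp each coordinate back into its bounds
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo[d]), hi[d])
            val = objective(dict(zip(names, pos[i])))
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return dict(zip(names, gbest)), gbest_val

# Stand-in for a validation loss, minimised at lr=0.01, dropout=0.3.
toy_loss = lambda h: (h["lr"] - 0.01) ** 2 + (h["dropout"] - 0.3) ** 2
best, loss = pso_hyperparam_search(toy_loss,
                                   {"lr": (1e-4, 0.1), "dropout": (0.0, 0.5)})
```

In a real setting `objective` would train the network with the candidate hyperparameters and return a validation loss, which makes each PSO evaluation expensive — hence the small swarm and iteration counts typically used for this kind of tuning.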
  11.

    Proposed inverted residual parallel block. by Yasir Khan Jadoon (21433231)

    Published 2025
    “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
  12.

    Inverted residual bottleneck block. by Yasir Khan Jadoon (21433231)

    Published 2025
    “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
  13.

    Proposed architecture testing phase. by Yasir Khan Jadoon (21433231)

    Published 2025
    “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
  14.

    Sample classes from the HMDB51 dataset. by Yasir Khan Jadoon (21433231)

    Published 2025
    “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
  15.

    Sample classes from UCF101 dataset [40]. by Yasir Khan Jadoon (21433231)

    Published 2025
    “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
  16.

    Self-attention module for feature learning. by Yasir Khan Jadoon (21433231)

    Published 2025
    “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
  17.

    Residual behavior. by Yasir Khan Jadoon (21433231)

    Published 2025
    “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
  18.

    hiPRS algorithm process flow. by Michela C. Massi (14599915)

    Published 2023
    “…The sequences can include from a single SNP-allele pair up to a maximum number of pairs defined by the user (l_max). (C) The whole training data is then scanned, searching for these sequences and deriving a re-encoded dataset where interaction terms are binary features (i.e., 1 if sequence i is observed in j-th patient genotype, 0 otherwise). …”
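The binary re-encoding this snippet describes — feature i is 1 exactly when sequence i's SNP-allele pairs all appear in patient j's genotype — reduces to a subset test per (patient, sequence) pair. A minimal sketch; the SNP identifiers and interactions below are made-up illustrations, not hiPRS's actual data:

```python
# Model each patient genotype as a set of (SNP, allele) pairs, and each
# candidate interaction as a frozenset of such pairs (up to l_max pairs,
# per the abstract).
def reencode(genotypes, interactions):
    """Binary re-encoding: entry [j][i] is 1 iff every pair of
    interaction i appears in patient j's genotype."""
    return [[int(seq <= g) for seq in interactions] for g in genotypes]

genotypes = [
    {("rs1", "A"), ("rs2", "T"), ("rs3", "G")},   # patient 0
    {("rs1", "A"), ("rs3", "G")},                  # patient 1
]
interactions = [
    frozenset({("rs1", "A"), ("rs2", "T")}),  # a two-pair interaction term
    frozenset({("rs3", "G")}),                 # a single SNP-allele pair
]
X = reencode(genotypes, interactions)  # X == [[1, 1], [0, 1]]
```

Patient 1 lacks ("rs2", "T"), so the first interaction feature is 0 for that row while the single-pair feature remains 1.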
  19.

    Image_2_A two-stage hybrid gene selection algorithm combined with machine learning models to predict the rupture status in intracranial aneurysms.TIF by Qingqing Li (1505614)

    Published 2022
    “…First, we used the Fast Correlation-Based Filter (FCBF) algorithm to filter a large number of irrelevant and redundant genes in the raw dataset, and then used the wrapper feature selection method based on the Multi-layer Perceptron (MLP) neural network and Particle Swarm Optimization (PSO); accuracy (ACC) and mean square error (MSE) were then used as the evaluation criteria. …”
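A common way to cast the wrapper stage this snippet describes — PSO searching over gene subsets scored by a classifier — is the binary PSO of Kennedy and Eberhart, where a sigmoid of each velocity gives the probability that a feature bit is set. A sketch with a stand-in fitness in place of the paper's MLP accuracy; the fitness, feature count, and coefficients are illustrative assumptions:

```python
import math, random

def binary_pso_select(fitness, n_features, n_particles=12, iters=30, seed=2):
    """Binary PSO wrapper: each particle is a 0/1 mask over features;
    a sigmoid of the velocity gives the probability each bit is set."""
    random.seed(seed)
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))
    pos = [[random.randint(0, 1) for _ in range(n_features)]
           for _ in range(n_particles)]
    vel = [[0.0] * n_features for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [fitness(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_features):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = 1 if random.random() < sig(vel[i][d]) else 0
            val = fitness(pos[i])
            if val > pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in for classifier accuracy: reward selecting informative features
# 0-2, with a small penalty per selected feature (favouring small subsets).
informative = {0, 1, 2}
acc = lambda mask: sum(mask[d] for d in informative) - 0.1 * sum(mask)
best_mask, best_acc = binary_pso_select(acc, n_features=8)
```

In the two-stage design the snippet describes, FCBF would first shrink the raw gene list so the mask this wrapper searches over stays tractable; in a real run `fitness` would train and score the MLP on the masked feature set.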
  20.

    Image_1_A two-stage hybrid gene selection algorithm combined with machine learning models to predict the rupture status in intracranial aneurysms.TIF by Qingqing Li (1505614)

    Published 2022
    “…First, we used the Fast Correlation-Based Filter (FCBF) algorithm to filter a large number of irrelevant and redundant genes in the raw dataset, and then used the wrapper feature selection method based on the Multi-layer Perceptron (MLP) neural network and Particle Swarm Optimization (PSO); accuracy (ACC) and mean square error (MSE) were then used as the evaluation criteria. …”