Showing 1 - 9 results of 9 for search '(( binary base codon optimization algorithm ) OR ( final phase art optimization algorithm ))', query time: 0.39s
  1.
  2. Proposed architecture testing phase. by Yasir Khan Jadoon (21433231)
     Published 2025
     “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for feature extraction from the self-attention layer, and the extracted features are passed to the shallow wide neural network classifier for the final classification. …”
  3. Comparison with existing SOTA techniques. by Yasir Khan Jadoon (21433231)
     Published 2025
  4. Proposed inverted residual parallel block. by Yasir Khan Jadoon (21433231)
     Published 2025
  5. Inverted residual bottleneck block. by Yasir Khan Jadoon (21433231)
     Published 2025
  6. Sample classes from the HMDB51 dataset. by Yasir Khan Jadoon (21433231)
     Published 2025
  7. Sample classes from the UCF101 dataset [40]. by Yasir Khan Jadoon (21433231)
     Published 2025
  8. Self-attention module for feature learning. by Yasir Khan Jadoon (21433231)
     Published 2025
  9. Residual behavior. by Yasir Khan Jadoon (21433231)
     Published 2025
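The snippet above mentions choosing hyperparameters with particle swarm optimization (PSO). The cited work does not show its PSO setup, so the following is only a minimal generic sketch: a standard PSO loop (inertia, cognitive, and social terms) minimizing a toy stand-in objective over two hypothetical hyperparameters (learning rate and dropout); in practice the objective would be the model's validation error, which is far more expensive to evaluate.

```python
import random

def pso(objective, bounds, n_particles=20, n_iters=50,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `objective` over the box `bounds` with basic PSO."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize positions uniformly inside the bounds, velocities at zero.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # each particle's best position
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-wide best
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive pull + social pull.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move, clamping to the search box.
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for a validation-loss surface over (learning_rate, dropout);
# its minimum sits at lr = 0.01, dropout = 0.3 by construction.
def val_loss(hp):
    lr, drop = hp
    return (lr - 0.01) ** 2 * 1e4 + (drop - 0.3) ** 2

best, loss = pso(val_loss, bounds=[(1e-4, 0.1), (0.0, 0.9)])
print(best, loss)
```

In a hyperparameter-tuning setting, each particle encodes one candidate configuration, so each PSO iteration costs `n_particles` training-and-validation runs; small swarms and few iterations are typical for that reason.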