Showing 1 - 20 of 27 results for search '(( binary phase process optimization algorithm ) OR ( final phase process optimization algorithm ))', query time: 0.49s
  1.
  2.
  3.

    Proposed architecture testing phase. by Yasir Khan Jadoon (21433231)

    Published in 2025
    "…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …"
  4.

    Classification performance after optimization. by Amal H. Alharbi (21755906)

    Published in 2025
    "…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …"
  5.

    ANOVA test for optimization results. by Amal H. Alharbi (21755906)

    Published in 2025
    "…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …"
  6.

    Wilcoxon test results for optimization. by Amal H. Alharbi (21755906)

    Published in 2025
    "…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …"
  7.

    Wilcoxon test results for feature selection. by Amal H. Alharbi (21755906)

    Published in 2025
    "…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …"
  8.

    Feature selection metrics and their definitions. by Amal H. Alharbi (21755906)

    Published in 2025
    "…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …"
  9.

    Statistical summary of all models. by Amal H. Alharbi (21755906)

    Published in 2025
    "…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …"
  10.

    Feature selection results. by Amal H. Alharbi (21755906)

    Published in 2025
    "…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …"
  11.

    ANOVA test for feature selection. by Amal H. Alharbi (21755906)

    Published in 2025
    "…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …"
  12.

    Classification performance of ML and DL models. by Amal H. Alharbi (21755906)

    Published in 2025
    "…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …"
  13.
  14.
  15.

    Comparison with existing SOTA techniques. by Yasir Khan Jadoon (21433231)

    Published in 2025
    "…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …"
  16.

    Proposed inverted residual parallel block. by Yasir Khan Jadoon (21433231)

    Published in 2025
    "…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …"
  17.

    Inverted residual bottleneck block. by Yasir Khan Jadoon (21433231)

    Published in 2025
    "…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …"
  18.

    Sample classes from the HMDB51 dataset. by Yasir Khan Jadoon (21433231)

    Published in 2025
    "…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …"
  19.

    Sample classes from the UCF101 dataset [40]. by Yasir Khan Jadoon (21433231)

    Published in 2025
    "…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …"
  20.

    Self-attention module for feature learning. by Yasir Khan Jadoon (21433231)

    Published in 2025
    "…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …"