Showing 21 - 40 of 56 results for search '(( final phase process optimization algorithm ) OR ( binary step process optimization algorithm ))', query time: 0.57s
  1. 21

    The structure of the Resnet50. By Nour Eldeen Mahmoud Khalifa (19259450)

    Published in 2024
    "…The third phase is the training and testing phase. Finally, the best-performing model was selected and compared with the currently established models (Alexnet, Squeezenet, Googlenet, Resnet50).…"
  2. 22

    The bottleneck residual block for Resnet50. By Nour Eldeen Mahmoud Khalifa (19259450)

    Published in 2024
  3. 23

    DeepDate model’s architecture design. By Nour Eldeen Mahmoud Khalifa (19259450)

    Published in 2024
  4. 24
  5. 25
  6. 26
  7. 27

    Comparison with existing SOTA techniques. By Yasir Khan Jadoon (21433231)

    Published in 2025
    "…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …"
  8. 28

    Proposed inverted residual parallel block. By Yasir Khan Jadoon (21433231)

    Published in 2025
  9. 29

    Inverted residual bottleneck block. By Yasir Khan Jadoon (21433231)

    Published in 2025
  10. 30

    Sample classes from the HMDB51 dataset. By Yasir Khan Jadoon (21433231)

    Published in 2025
  11. 31

    Sample classes from UCF101 dataset [40]. By Yasir Khan Jadoon (21433231)

    Published in 2025
  12. 32

    Self-attention module for feature learning. By Yasir Khan Jadoon (21433231)

    Published in 2025
  13. 33

    Residual behavior. By Yasir Khan Jadoon (21433231)

    Published in 2025
  14. 34

    Overall framework diagram. By Yanhua Xian (21417128)

    Published in 2025
    "…Secondly, addressing the issue of weight and threshold initialization in BPNN, the Coati Optimization Algorithm (COA) was employed to optimize the network (COA-BPNN). …"
  15. 35
  16. 36
  17. 37
  18. 38
  19. 39
  20. 40

    Table_1_Optimal Reopening Pathways With COVID-19 Vaccine Rollout and Emerging Variants of Concern.pdf By Yanyu Xiao (682443)

    Published in 2021
    "…Our model framework and optimization strategies take into account the likely range of social contacts during different phases of a gradual reopening process and consider the uncertainties of these contact rates due to variations of individual behaviors and compliance. …"
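
Several of the snippets above mention selecting hyperparameters with the particle swarm optimization (PSO) algorithm. As a rough illustration of that idea only, not the cited papers' implementation, a minimal PSO minimizing a toy stand-in for validation loss over two hypothetical hyperparameters (log10 learning rate and hidden-unit count, both treated as continuous) might look like:

```python
import random

def pso(objective, bounds, n_particles=20, n_iters=50,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `objective` over the box `bounds` with a basic PSO."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Particles start uniformly inside the box with zero velocity.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [objective(p) for p in pos]     # and its objective value
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity = inertia + pull toward personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move, then clamp back into the search box.
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for "validation loss as a function of hyperparameters":
# best at learning rate 1e-2 (log10 = -2) and 128 hidden units.
def val_loss(hp):
    log_lr, hidden = hp
    return (log_lr + 2.0) ** 2 + ((hidden - 128.0) / 64.0) ** 2

best, loss = pso(val_loss, bounds=[(-4.0, 0.0), (16.0, 256.0)])
```

In a real hyperparameter search the objective would train and validate the actual network at each candidate setting, which is why PSO's small swarm and iteration counts matter for cost.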