Showing results 141 - 160 of 1,196 for search '(( significant decrease decrease ) OR ( significantly advanced decrease ))~', query time: 0.44s
  7. 147

    Study methods flow chart. By Luis Polo-Ferrero (20513578)

    Published in 2025
  12. 152

    S1 File - By Ahmed Akib Jawad Karim (20427740)

    Published in 2025
    "…The study emphasizes the possibilities of knowledge distillation to produce effective models fit for use in resource-limited conditions, hence advancing NLP and mental health diagnosis. Furthermore underlined by the considerable decrease in model size without appreciable performance loss is the lower computational resources needed for training and deployment, hence facilitating greater applicability. …"
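The excerpt above credits knowledge distillation for the considerable decrease in model size without appreciable performance loss. As a rough illustration only (this is not the study's code; the logits and temperature below are hypothetical), the standard soft-target distillation loss can be sketched in plain Python:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; a larger T yields a softer distribution."""
    scaled = [z / T for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student) on temperature-softened outputs, scaled by T^2
    (the usual Hinton-style soft-target term)."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return (T ** 2) * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Hypothetical logits for one example: the student tracks the teacher
# closely, so the soft-target loss is small but nonzero.
teacher = [4.0, 1.0, -2.0]
student = [3.5, 1.2, -1.8]
print(f"{distillation_loss(teacher, student):.4f}")
```

In practice this term is combined with ordinary cross-entropy on the hard labels; minimizing it pushes the smaller student to reproduce the teacher's full output distribution, which is what permits the size reduction the excerpt describes.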
  13. 153

    Confusion matrix for ClinicalBERT model. By Ahmed Akib Jawad Karim (20427740)

    Published in 2025
  14. 154

    Confusion matrix for LastBERT model. By Ahmed Akib Jawad Karim (20427740)

    Published in 2025
  15. 155

    Student model architecture. By Ahmed Akib Jawad Karim (20427740)

    Published in 2025
  16. 156

    Configuration of the LastBERT model. By Ahmed Akib Jawad Karim (20427740)

    Published in 2025
  17. 157

    Confusion matrix for DistilBERT model. By Ahmed Akib Jawad Karim (20427740)

    Published in 2025
  18. 158

    ROC curve for LastBERT model. By Ahmed Akib Jawad Karim (20427740)

    Published in 2025
  19. 159

    Sample Posts from the ADHD dataset. By Ahmed Akib Jawad Karim (20427740)

    Published in 2025
  20. 160

    Top-level overview for ADHD classification study. By Ahmed Akib Jawad Karim (20427740)

    Published in 2025