Showing 1,641 - 1,660 results of 21,342 for search '(( significant health decrease ) OR ( significant decrease decrease ))', query time: 0.47s
  1. 1641
  2. 1642
  3. 1643
  4. 1644
  5. 1645

    S1 File - by Ahmed Akib Jawad Karim (20427740)

    Published 2025
    “…The study emphasizes the potential of knowledge distillation to produce efficient models suited to resource-limited settings, thereby advancing NLP and mental health diagnosis. The considerable decrease in model size without appreciable performance loss also lowers the computational resources needed for training and deployment, broadening applicability. …” (A minimal sketch of the distillation loss appears after this result list.)
  6. 1646

    Confusion matrix for ClinicalBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
  7. 1647

    Confusion matrix for LastBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
  8. 1648

    Student model architecture. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
  9. 1649

    Configuration of the LastBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
  10. 1650

    Confusion matrix for DistilBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
  11. 1651

    ROC curve for LastBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
  12. 1652

    Sample Posts from the ADHD dataset. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
  13. 1653

    Top-level overview for ADHD classification study. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
  14. 1654
  15. 1655
  16. 1656
  17. 1657
  18. 1658
  19. 1659
  20. 1660
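
The snippet above attributes the reduction in model size to knowledge distillation. For orientation only, here is a minimal sketch of a standard Hinton-style distillation loss in PyTorch; the function name, temperature, weighting, and tensor shapes are illustrative assumptions and do not reflect the cited study's actual implementation.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels,
                          temperature=2.0, alpha=0.5):
        # Soft-target term: the student matches the teacher's tempered
        # class distribution (scaled by T^2, as in standard distillation).
        soft = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=-1),
            F.softmax(teacher_logits / temperature, dim=-1),
            reduction="batchmean",
        ) * (temperature ** 2)
        # Hard-target term: ordinary cross-entropy against the true labels.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

    # Illustrative usage with random tensors (binary task, batch of 4).
    if __name__ == "__main__":
        student_logits = torch.randn(4, 2, requires_grad=True)
        teacher_logits = torch.randn(4, 2)
        labels = torch.randint(0, 2, (4,))
        print(distillation_loss(student_logits, teacher_logits, labels))

Training the smaller student on this blended objective is what allows the teacher's behavior to be approximated at a fraction of the parameter count, which is the size-versus-performance trade-off the snippet describes.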