Showing 1,441 - 1,460 of 21,342 search results for '(( significant ((concern decrease) OR (content decreased)) ) OR ( significant decrease decrease ))', query time: 0.45s
  6. 1446

    Fig 5 - by Mísia Helena da Silva Ferro (20623666)

    Published in 2025
  14. 1454

    Sequence analysis results. by Oejin Shin (20855617)

    Published in 2025
  17. 1457

    Descriptive analysis. by Oejin Shin (20855617)

    Published in 2025
  18. 1458

    S1 File - by Ahmed Akib Jawad Karim (20427740)

    Published in 2025
    "…After the model creation, we applied the resulting model, LastBERT, to a real-world task: classifying severity levels of Attention Deficit Hyperactivity Disorder (ADHD)-related concerns from social media text data. Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …"
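The parameter reduction quoted in the snippet can be sanity-checked with simple arithmetic; a minimal sketch, using only the 110 million and 29 million figures stated in the snippet:

```python
# Sanity check of the parameter reduction quoted in the snippet:
# BERT base (~110 M parameters) distilled down to LastBERT (~29 M).
base_params = 110_000_000    # BERT base parameter count (from the snippet)
student_params = 29_000_000  # LastBERT parameter count (from the snippet)

# Relative reduction: fraction of parameters removed from the base model.
reduction = (base_params - student_params) / base_params
print(f"Model is approximately {reduction:.2%} smaller")  # ~73.64%
```

This matches the "approximately 73.64% smaller" figure given in the abstract.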
  19. 1459

    Confusion matrix for ClinicalBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published in 2025
  20. 1460

    Confusion matrix for LastBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published in 2025