Showing results 321–340 of 18,176 for search 'significantly ((((we decrease) OR (a decrease))) OR (greater decrease))' (query time: 0.64 s).
324. MiR-129-5p levels were decreased in mouse depression models. by Qiaozhen Qin (13159201)

    Published 2025
    “…The KEGG pathway analysis identifies the number of target genes within each pathway. Those with a p-value < 0.05 are considered significant.…”
326. NgR1 KO mice exhibited an increase in excitatory synapses and a decrease in inhibitory synapses, indicating an imbalance of synaptic transmission. by Jinwei Zhang (462455)

    Published 2025
    “…The inhibitory synaptic density of NgR1 mice showed a significant decrease when compared to WT mice (***P < 0.001). …”
330. Advancing Circular Bioeconomy through a Systems-Level Assessment of Food Waste and Industrial Sludge Codigestion by Md. Nizam Uddin (21632518)

    Published 2025
    “…Overall, codigesting FW with PPMS is revealed to be a sustainable waste management option to decrease landfill disposal of valuable organic waste.…”
331. S1 File by Ahmed Akib Jawad Karim (20427740)

    Published 2025
    “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
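    The quoted reduction can be checked with simple arithmetic (the parameter counts are taken from the snippet itself; the variable names below are illustrative):

    ```python
    # Check the parameter-reduction claim from the LastBERT snippet:
    # 110 million parameters (BERT base) distilled down to 29 million.
    base_params = 110e6     # BERT base parameter count, per the snippet
    student_params = 29e6   # LastBERT student model parameter count, per the snippet

    # Fractional reduction in parameter count
    reduction = (base_params - student_params) / base_params
    print(f"{reduction:.2%}")  # → 73.64%
    ```

    The result matches the "approximately 73.64% smaller" figure quoted in the snippet.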
332. Cohort characteristics. by Vincent Pey (21433304)

    Published 2025
    “…Results: Upon initiation of CPB we observed a significant decrease in arterial whole blood redox potential (101.90 mV ± 11.52 vs. 41.80 mV ± 10.26; p < 0.0001). …”
333. Confusion matrix for ClinicalBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
334. Confusion matrix for LastBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
335. Student model architecture. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
336. Configuration of the LastBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
337. Confusion matrix for DistilBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
338. ROC curve for LastBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
339. Sample Posts from the ADHD dataset. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
340. Top-level overview for ADHD classification study. by Ahmed Akib Jawad Karim (20427740)

    Published 2025