Showing 461 - 480 results of 5,845 for search '(( significant decrease decrease ) OR ( significant ((we decrease) OR (a decrease)) ))~', query time: 0.50s
  1. 461

    Flow Chart of Study Participant Selection. by Zhi Jin (3742471)

    Published 2025
    “…Notably, individuals with long sleep duration (>9 hours) had a significantly decreased risk of CVD (OR: 0.36, 95% CI: 0.15–0.85, P = 0.02) compared to those with shorter sleep durations.…”
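    The entry above quotes an odds ratio with a 95% confidence interval (OR: 0.36, 95% CI: 0.15–0.85). For readers unfamiliar with the format, a minimal Python sketch is given below showing how an odds ratio and its Wald-type 95% CI are conventionally computed from a 2×2 table; the counts used here are purely hypothetical and are not data from the cited study.

        import math

        # Hypothetical 2x2 table (illustration only, NOT data from the cited study):
        # rows = exposure group, columns = outcome (event / no event)
        a, b = 15, 85   # exposed:   events, non-events
        c, d = 30, 70   # unexposed: events, non-events

        odds_ratio = (a / b) / (c / d)
        se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)          # standard error of ln(OR)
        lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
        upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
        print(f"OR = {odds_ratio:.2f}, 95% CI {lower:.2f}-{upper:.2f}")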
  2. 462

    Dynamic Covalent Chemistry Enabled Closed-Loop Recycling of Thermally Modified Polymer Membrane by Ching Yoong Loh (17863097)

    Published 2025
    “…Thermal and mechanical characterizations confirmed the excellent stability of the membranes, with the Diels–Alder reaction enabling depolymerization and reformation of the network without causing significant degradation. Additionally, the RFMs were recycled a third time, maintaining the fluxes of the previous generation (752 to 823 LMH), with a slight decrease in separation efficiency in dichloromethane-water emulsion separation (98.3% to 97%). …”
  3. 463

    Dynamic Covalent Chemistry Enabled Closed-Loop Recycling of Thermally Modified Polymer Membrane by Ching Yoong Loh (17863097)

    Published 2025
    “…Thermal and mechanical characterizations confirmed the excellent stability of the membranes, with the Diels–Alder reaction enabling depolymerization and reformation of the network without causing significant degradation. Additionally, the RFMs were recycled a third time, maintaining the fluxes of the previous generation (752 to 823 LMH), with a slight decrease in separation efficiency in dichloromethane-water emulsion separation (98.3% to 97%). …”
  4. 464

    Dynamic Covalent Chemistry Enabled Closed-Loop Recycling of Thermally Modified Polymer Membrane by Ching Yoong Loh (17863097)

    Published 2025
    “…Thermal and mechanical characterizations confirmed the excellent stability of the membranes, with the Diels–Alder reaction enabling depolymerization and reformation of the network without causing significant degradation. Additionally, the RFMs were recycled a third time, maintaining the fluxes of the previous generation (752 to 823 LMH), with a slight decrease in separation efficiency in dichloromethane-water emulsion separation (98.3% to 97%). …”
  5. 465

    Dynamic Covalent Chemistry Enabled Closed-Loop Recycling of Thermally Modified Polymer Membrane by Ching Yoong Loh (17863097)

    Published 2025
    “…Thermal and mechanical characterizations confirmed the excellent stability of the membranes, with the Diels–Alder reaction enabling depolymerization and reformation of the network without causing significant degradation. Additionally, the RFMs were recycled a third time, maintaining the fluxes of the previous generation (752 to 823 LMH), with a slight decrease in separation efficiency in dichloromethane-water emulsion separation (98.3% to 97%). …”
  6. 466

    Dynamic Covalent Chemistry Enabled Closed-Loop Recycling of Thermally Modified Polymer Membrane by Ching Yoong Loh (17863097)

    Published 2025
    “…Thermal and mechanical characterizations confirmed the excellent stability of the membranes, with the Diels–Alder reaction enabling depolymerization and reformation of the network without causing significant degradation. Additionally, the RFMs were recycled a third time, maintaining the fluxes of the previous generation (752 to 823 LMH), with a slight decrease in separation efficiency in dichloromethane-water emulsion separation (98.3% to 97%). …”
  7. 467

    Characteristics of the included studies. by Xin Liu (43569)

    Published 2025
    “…The combined findings showed that high-dose TXA was associated with a significant reduction in intraoperative blood loss [weighted mean difference (WMD) = -215.48, 95% confidence interval (CI) (-367.58, -63.37), P < 0.001], as well as a decreased likelihood of transfusion [risk ratio (RR) = 0.40, 95% CI (0.30, 0.53), P < 0.001]. …”
  8. 468

    Summary of results. by Xin Liu (43569)

    Published 2025
    “…The combined findings showed that high-dose TXA was associated with a significant reduction in intraoperative blood loss [weighted mean difference (WMD) = -215.48, 95% confidence interval (CI) (-367.58, -63.37), P < 0.001], as well as a decreased likelihood of transfusion [risk ratio (RR) = 0.40, 95% CI (0.30, 0.53), P < 0.001]. …”
  9. 469

    Bias risk assessments for each RCT study. by Xin Liu (43569)

    Published 2025
    “…The combined findings showed that high-dose TXA was associated with a significant reduction in intraoperative blood loss [weighted mean difference (WMD) = -215.48, 95% confidence interval (CI) (-367.58, -63.37), P < 0.001], as well as a decreased likelihood of transfusion [risk ratio (RR) = 0.40, 95% CI (0.30, 0.53), P < 0.001]. …”
  10. 470

    The flow chart of studies selecting. by Xin Liu (43569)

    Published 2025
    “…The combined findings showed that high-dose TXA was associated with a significant reduction in intraoperative blood loss [weighted mean difference (WMD) = -215.48, 95% confidence interval (CI) (-367.58, -63.37), P < 0.001], as well as a decreased likelihood of transfusion [risk ratio (RR) = 0.40, 95% CI (0.30, 0.53), P < 0.001]. …”
  11. 471

    Funnel plots for intraoperative blood loss. by Xin Liu (43569)

    Published 2025
    “…The combined findings showed that high-dose TXA was associated with a significant reduction in intraoperative blood loss [weighted mean difference (WMD) = -215.48, 95% confidence interval (CI) (-367.58, -63.37), P < 0.001], as well as a decreased likelihood of transfusion [risk ratio (RR) = 0.40, 95% CI (0.30, 0.53), P < 0.001]. …”
  12. 472

    Bias risk assessments for each RCS study. by Xin Liu (43569)

    Published 2025
    “…The combined findings showed that high-dose TXA was associated with a significant reduction in intraoperative blood loss [weighted mean difference (WMD) = -215.48, 95% confidence interval (CI) (-367.58, -63.37), P < 0.001], as well as a decreased likelihood of transfusion [risk ratio (RR) = 0.40, 95% CI (0.30, 0.53), P < 0.001]. …”
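    The six entries above quote a pooled meta-analytic result (WMD = -215.48 with a 95% CI, plus a pooled risk ratio). As a hedged illustration of how such a pooled weighted mean difference is typically obtained, the sketch below applies fixed-effect inverse-variance pooling to invented per-study values; the numbers are not taken from the cited review.

        import math

        # Hypothetical (mean difference, standard error) pairs for three studies
        # (illustration only, NOT the studies in the cited meta-analysis)
        studies = [(-180.0, 60.0), (-250.0, 80.0), (-210.0, 45.0)]

        weights = [1 / se**2 for _, se in studies]             # inverse-variance weights
        pooled = sum(w * md for (md, _), w in zip(studies, weights)) / sum(weights)
        se_pooled = math.sqrt(1 / sum(weights))
        lower, upper = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
        print(f"WMD = {pooled:.2f}, 95% CI ({lower:.2f}, {upper:.2f})")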
  13. 473

    S1 File - by Ahmed Akib Jawad Karim (20427740)

    Published 2025
    “…Referring to LastBERT, a customized student BERT model, we significantly reduced the model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
  14. 474

    Confusion matrix for ClinicalBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
    “…Referring to LastBERT, a customized student BERT model, we significantly reduced the model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
  15. 475

    Confusion matrix for LastBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
    “…Referring to LastBERT, a customized student BERT model, we significantly reduced the model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
  16. 476

    Student model architecture. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
    “…Referring to LastBERT, a customized student BERT model, we significantly reduced the model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
  17. 477

    Configuration of the LastBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
    “…Referring to LastBERT, a customized student BERT model, we significantly reduced the model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
  18. 478

    Confusion matrix for DistilBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
    “…Referring to LastBERT, a customized student BERT model, we significantly reduced the model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
  19. 479

    ROC curve for LastBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
    “…Referring to LastBERT, a customized student BERT model, we significantly reduced the model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
  20. 480

    Sample Posts from the ADHD dataset. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
    “…Referring to LastBERT, a customized student BERT model, we significantly reduced the model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
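    The last eight entries quote a parameter reduction from 110 million (BERT base) to 29 million, described as approximately 73.64% smaller. A one-line check of that arithmetic:

        # Relative reduction in parameter count: (110M - 29M) / 110M
        reduction = (110e6 - 29e6) / 110e6
        print(f"{reduction:.2%}")   # prints 73.64%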