Showing 2,181 - 2,200 results of 8,689 for search 'significant ((((((we decrease) OR (teer decrease))) OR (nn decrease))) OR (mean decrease))', query time: 0.57s
  1.–9. 2181–2189 (title and preview text not captured for these results)
  10.–14. 2190–2194

    Dynamic Covalent Chemistry Enabled Closed-Loop Recycling of Thermally Modified Polymer Membrane by Ching Yoong Loh (17863097)

    Published 2025
    “…Thermal and mechanical characterizations confirmed the great stability of the membranes, with the Diels–Alder reaction enabling depolymerization and reformation of the network without causing significant degradation. Additionally, the RFMs were recycled the third time, maintaining the fluxes (752 to 823 LMH) from the previous generation with a slight decrease in separation efficiency in dichloromethane-water emulsion separation (98.3 to 97%). …”
    (These five results show identical title, author, and preview text; the quoted flux and efficiency figures are checked numerically after the results list.)
  15.–17. 2195–2197 (title and preview text not captured for these results)
  18. 2198

    S1 File - by Ahmed Akib Jawad Karim (20427740)

    Published 2025
    “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
  19. 2199

    Confusion matrix for ClinicalBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
    “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
  20. 2200

    Confusion matrix for LastBERT model. by Ahmed Akib Jawad Karim (20427740)

    Published 2025
    “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
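The flux and separation-efficiency figures quoted for results 2190–2194 can be sanity-checked with simple arithmetic. The sketch below is a minimal back-of-envelope check, assuming the quoted pairs (752 and 823 LMH; 98.3% and 97%) are before-and-after values across the third recycling generation; the variable names are illustrative and not taken from the source.

```python
# Back-of-envelope check of the figures quoted for results 2190-2194.
# Assumption (not stated explicitly in the snippet): 752 -> 823 LMH and
# 98.3% -> 97% are before/after values for the third recycling generation.

flux_before_lmh = 752.0
flux_after_lmh = 823.0
efficiency_before_pct = 98.3
efficiency_after_pct = 97.0

# Relative flux change across the recycling step.
flux_change_pct = (flux_after_lmh - flux_before_lmh) / flux_before_lmh * 100

# Efficiency loss, expressed in percentage points rather than a relative change.
efficiency_drop_pts = efficiency_before_pct - efficiency_after_pct

print(f"Flux change: {flux_change_pct:+.1f}%")                          # about +9.4%
print(f"Efficiency drop: {efficiency_drop_pts:.1f} percentage points")  # about 1.3
```

On these assumptions the flux rises by roughly 9% while separation efficiency falls by about 1.3 percentage points, consistent with the snippet's description of a "slight decrease".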
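The parameter-reduction figure quoted for results 2198–2200 (110 million parameters reduced to 29 million, described as "approximately 73.64% smaller") can be recomputed directly from the two quoted counts; the short check below does only that.

```python
# Verify the quoted LastBERT size reduction: 110 million (BERT base) -> 29 million.
bert_base_params = 110_000_000
lastbert_params = 29_000_000

relative_reduction = (bert_base_params - lastbert_params) / bert_base_params
print(f"Relative reduction: {relative_reduction:.2%}")  # 73.64%, matching the quote
```

The recomputed value, (110 - 29) / 110 ≈ 73.64%, matches the figure quoted in the preview text.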