4401
Dynamic Covalent Chemistry Enabled Closed-Loop Recycling of Thermally Modified Polymer Membrane
Published 2025. “…Thermal and mechanical characterizations confirmed the high stability of the membranes, with the Diels–Alder reaction enabling depolymerization and reformation of the network without significant degradation. Additionally, the RFMs were recycled a third time, maintaining the fluxes (752 to 823 LMH) of the previous generation with a slight decrease in separation efficiency in dichloromethane–water emulsion separation (98.3% to 97%). …”
4412
S1 File
Published 2025. “…Referring to LastBERT, a customized student BERT model, we significantly lowered the parameter count from the 110 million of BERT base to 29 million, resulting in a model approximately 73.64% smaller. …”
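The size reduction quoted in the snippet can be checked directly from its own figures (110 million parameters for BERT base, 29 million for LastBERT); a minimal sketch of that arithmetic:

```python
# Parameter counts taken from the snippet above.
bert_base_params = 110_000_000  # BERT base
lastbert_params = 29_000_000    # LastBERT student model

# Relative reduction: fraction of parameters removed.
reduction = 1 - lastbert_params / bert_base_params
print(f"{reduction:.2%}")  # → 73.64%
```

This confirms the quoted figure: 1 − 29/110 ≈ 0.7364, i.e. roughly 73.64% fewer parameters.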
4413
Confusion matrix for ClinicalBERT model.
Published 2025.
4414
Confusion matrix for LastBERT model.
Published 2025.
4415
Student model architecture.
Published 2025.
4416
Configuration of the LastBERT model.
Published 2025.
4417
Confusion matrix for DistilBERT model.
Published 2025.
4418
ROC curve for LastBERT model.
Published 2025.
4419
Sample Posts from the ADHD dataset.
Published 2025.
4420
Top-level overview for ADHD classification study.
Published 2025.