-
1601
Dynamic Covalent Chemistry Enabled Closed-Loop Recycling of Thermally Modified Polymer Membrane
Published 2025 “…Thermal and mechanical characterizations confirmed the great stability of the membranes, with the Diels–Alder reaction enabling depolymerization and reformation of the network without causing significant degradation. Additionally, the RFMs were recycled the third time, maintaining the fluxes (752 to 823 LMH) from the previous generation with a slight decrease in separation efficiency in dichloromethane-water emulsion separation (98.3 to 97%). …”
-
1603
S1 File -
Published 2025 “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
-
1604
Confusion matrix for ClinicalBERT model.
Published 2025 “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
-
1605
Confusion matrix for LastBERT model.
Published 2025 “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
-
1606
Student model architecture.
Published 2025 “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
-
1607
Configuration of the LastBERT model.
Published 2025 “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
-
1608
Confusion matrix for DistilBERT model.
Published 2025 “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
-
1609
ROC curve for LastBERT model.
Published 2025 “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
-
1610
Sample Posts from the ADHD dataset.
Published 2025 “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
-
1611
Top-level overview for ADHD classification study.
Published 2025 “…Referring to LastBERT, a customized student BERT model, we significantly lowered model parameters from 110 million (BERT base) to 29 million, resulting in a model approximately 73.64% smaller. …”
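The “approximately 73.64% smaller” figure quoted in the snippets above is plain arithmetic on the two parameter counts; a minimal check, taking the 110 million and 29 million figures from the snippets as exact:

```python
# Parameter counts as quoted in the search-result snippets.
bert_base_params = 110_000_000  # BERT base (teacher)
lastbert_params = 29_000_000    # LastBERT (student)

# Fractional reduction in parameter count.
reduction = 1 - lastbert_params / bert_base_params
print(f"{reduction:.2%}")  # 73.64%
```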