-
81
Room-Temperature Self-Healable Glassy Semicrystalline Polymers via Ionic Aggregations
Published 2024. “…Semicrystalline polymers constitute the largest fraction of industrial and engineering plastics but are difficult to automatically self-heal in their glassy state due to the frozen molecular chains. …”
-
84
R project including metadata update.
Published 2025. “…Without such amendments, the accumulation of contradictory species assignments within BINs will continue to rise and the reliability of specimen identification by BOLD will decrease. …”
-
85
Classes of errors and gaps in BOLD metadata.
Published 2025.
-
92
Qualitative Examples of Corrected Sentences.
Published 2025. “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
-
93
Ablation Experiment Results.
Published 2025.
-
94
Meaning of Each Model.
Published 2025.
-
95
Overview of Related Work.
Published 2025.
-
96
Parameter Settings.
Published 2025.
-
97
Comparison with Advanced Models.
Published 2025.
-
98
Overall Model Architecture.
Published 2025.
-
99
Baseline Transformer model configuration.
Published 2025.
-
100
Comparison Results on CoNLL-2014 Dataset.
Published 2025.