Search alternatives:
aromatic » somatic
decrease » decreased, increase
automatic » automated
All of the following results point to sections of the same publication (Published 2025), which carries the shared excerpt: “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”

Qualitative Examples of Corrected Sentences.
Ablation Experiment Results.
Meaning of Each Model.
Overview of Related Work.
Parameter Settings.
Comparison with Advanced Models.
Overall Model Architecture.
Baseline Transformer model configuration.
Comparison Results on CoNLL-2014 Dataset.
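The excerpt above only names the approach (BERT combined with a dependency self-attention mechanism) and gives no implementation details. As a purely illustrative aid, here is a minimal NumPy sketch of one way a dependency-derived bias could be folded into self-attention over BERT-style hidden states; the bias scheme, the heads-array encoding, and every function and variable name are assumptions of this sketch, not the cited paper's actual design.

```python
# Illustrative sketch only: a toy "dependency-biased self-attention" layer.
# Every shape, bias value, and name here is a hypothetical stand-in.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def dependency_bias(heads, n_tokens, bias=2.0):
    """Additive attention bias that rewards attending to a token's syntactic
    head and its direct dependents (assumed scheme, not from the paper)."""
    B = np.zeros((n_tokens, n_tokens))
    for child, head in enumerate(heads):
        if head >= 0:                # -1 marks the root token
            B[child, head] += bias   # child attends to its head
            B[head, child] += bias   # head attends to its child
    return B

def dependency_self_attention(H, heads, bias=2.0):
    """Single-head scaled dot-product attention over contextual token
    vectors H (stand-ins for BERT hidden states), with a dependency-derived
    additive bias on the attention logits."""
    n, d = H.shape
    scores = H @ H.T / np.sqrt(d)             # (n, n) attention logits
    scores += dependency_bias(heads, n, bias)
    weights = softmax(scores, axis=-1)
    return weights @ H                         # re-weighted representations

# Toy example: 4 tokens, heads[i] = index of token i's syntactic head.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))        # pretend these came from a BERT encoder
heads = np.array([1, -1, 1, 2])    # token 1 is the root
out = dependency_self_attention(H, heads)
print(out.shape)                   # (4, 8)
```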
Appendix figures.
Published 2025: “…Many techniques for automatic cell segmentation exist, but these methods often require annotated datasets, model retraining, and associated technical expertise.…”