Overall Model Architecture.

Improving translation quality and efficiency is one of the key challenges in the field of Natural Language Processing (NLP). This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism …
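
The record's abstract names the technique but not its formulation, so the following is a minimal sketch assuming one common way of combining the two parts: a dependency-adjacency matrix from an external parser is used as an additive bias on the self-attention scores computed over BERT's contextual embeddings. The class name DependencySelfAttention and the parameters dep_adj and bias_weight are illustrative assumptions, not taken from the paper.

import math
import torch
import torch.nn as nn

class DependencySelfAttention(nn.Module):
    """Single-head self-attention biased toward syntactically related
    tokens (a hypothetical formulation; the paper's exact design is
    not given in this record)."""

    def __init__(self, hidden_size: int, bias_weight: float = 1.0):
        super().__init__()
        self.query = nn.Linear(hidden_size, hidden_size)
        self.key = nn.Linear(hidden_size, hidden_size)
        self.value = nn.Linear(hidden_size, hidden_size)
        self.bias_weight = bias_weight      # strength of the dependency bias
        self.scale = math.sqrt(hidden_size)

    def forward(self, hidden_states: torch.Tensor, dep_adj: torch.Tensor) -> torch.Tensor:
        # hidden_states: (batch, seq_len, hidden), e.g. BERT's last_hidden_state
        # dep_adj:       (batch, seq_len, seq_len), 1.0 where two tokens share a
        #                dependency arc, 0.0 elsewhere
        q = self.query(hidden_states)
        k = self.key(hidden_states)
        v = self.value(hidden_states)
        scores = torch.matmul(q, k.transpose(-2, -1)) / self.scale
        # Additive bias: dependency-linked token pairs get a higher pre-softmax score.
        scores = scores + self.bias_weight * dep_adj
        attn = torch.softmax(scores, dim=-1)
        return torch.matmul(attn, v)

In this sketch the adjacency matrix would come from an external dependency parser (for example spaCy) aligned to BERT's subword tokens, and the layer would sit on top of the encoder's last hidden states; how the authors actually integrate the dependency information is not recoverable from this record.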

Bibliographic Details
Main Author: Yutong Liu
Other Authors: Shile Zhang
Published: 2025