Showing results 81–100 of 193 for search '((automatic decrease) OR (aromatic decrease))' (query time: 0.29 s)
  81. Room-Temperature Self-Healable Glassy Semicrystalline Polymers via Ionic Aggregations, by Pengxiang Si (5676260)

      Published 2024
      “…Semicrystalline polymers constitute the largest fraction of industrial and engineering plastics but are difficult to automatically self-heal in their glassy state due to the frozen molecular chains. …”
  82. Room-Temperature Self-Healable Glassy Semicrystalline Polymers via Ionic Aggregations, by Pengxiang Si (5676260)

      Published 2024
      “…Semicrystalline polymers constitute the largest fraction of industrial and engineering plastics but are difficult to automatically self-heal in their glassy state due to the frozen molecular chains. …”
  83. Room-Temperature Self-Healable Glassy Semicrystalline Polymers via Ionic Aggregations, by Pengxiang Si (5676260)

      Published 2024
      “…Semicrystalline polymers constitute the largest fraction of industrial and engineering plastics but are difficult to automatically self-heal in their glassy state due to the frozen molecular chains. …”
  84. R project including metadata update, by Frederik Stein (22146203)

      Published 2025
      “…Without such amendments, the accumulation of contradictory species assignments within BINs will continue to rise and the reliability of specimen identification by BOLD will decrease. …”
  85. Classes of errors and gaps in BOLD metadata, by Frederik Stein (22146203)

      Published 2025
      “…Without such amendments, the accumulation of contradictory species assignments within BINs will continue to rise and the reliability of specimen identification by BOLD will decrease. …”
  86.
  87.
  88.
  89.
  90.
  91.
  92. Qualitative Examples of Corrected Sentences, by Yutong Liu (4027994)

      Published 2025
      “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  93. Ablation Experiment Results, by Yutong Liu (4027994)

      Published 2025
      “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  94. Meaning of Each Model, by Yutong Liu (4027994)

      Published 2025
      “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  95. Overview of Related Work, by Yutong Liu (4027994)

      Published 2025
      “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  96. Parameter Settings, by Yutong Liu (4027994)

      Published 2025
      “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  97. Comparison with Advanced Models, by Yutong Liu (4027994)

      Published 2025
      “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  98. Overall Model Architecture, by Yutong Liu (4027994)

      Published 2025
      “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  99. Baseline Transformer model configuration, by Yutong Liu (4027994)

      Published 2025
      “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  100. Comparison Results on CoNLL-2014 Dataset, by Yutong Liu (4027994)

      Published 2025
      “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”