Showing 261 - 280 results of 609 for search '(( significant decrease decrease ) OR ( significant attention decrease ))~', query time: 0.31s
  1. 261
  2. 262

    Data Sheet 1_Visual attention during non-immersive virtual reality balance training in older adults with mild to moderate cognitive impairment: an eye-tracking study.pdf by Marcos Maldonado-Díaz (11221102)

    Published 2025
    “…Eye-tracking data indicated increased fixation stability and decreased pupil diameter, suggesting more efficient attention allocation during motor tasks.…”
  3. 263

    Screening flowchart. by Xuechun Fan (5439470)

    Published 2024
    “…Introduction: Diabetic peripheral neuropathy (DPN), a prevalent complication among individuals diagnosed with type 2 diabetes, has a significant impact on both the well-being of patients and their financial situation. …”
  4. 264

    Search strategy in PubMed. by Xuechun Fan (5439470)

    Published 2024
    “…Introduction: Diabetic peripheral neuropathy (DPN), a prevalent complication among individuals diagnosed with type 2 diabetes, has a significant impact on both the well-being of patients and their financial situation. …”
  5. 265

    Qualitative Examples of Corrected Sentences. by Yutong Liu (4027994)

    Published 2025
    “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  6. 266

    Ablation Experiment Results. by Yutong Liu (4027994)

    Published 2025
    “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  7. 267

    Meaning of Each Model. by Yutong Liu (4027994)

    Published 2025
    “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  8. 268

    Overview of Related Work. by Yutong Liu (4027994)

    Published 2025
    “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  9. 269

    Parameter Settings. by Yutong Liu (4027994)

    Published 2025
    “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  10. 270

    Comparison with Advanced Models. by Yutong Liu (4027994)

    Published 2025
    “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  11. 271

    Overall Model Architecture. by Yutong Liu (4027994)

    Published 2025
    “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  12. 272

    Baseline Transformer model configuration. by Yutong Liu (4027994)

    Published 2025
    “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  13. 273

    Comparison Results on CoNLL-2014 Dataset. by Yutong Liu (4027994)

    Published 2025
    “…This study proposes an enhanced model based on Bidirectional Encoder Representations from Transformers (BERT), combined with a dependency self-attention mechanism, to automatically detect and correct textual errors in the translation process. …”
  14. 274
  15. 275

    Table 2_An 8-week 24-form Tai Chi intervention on cognition in Chinese college students overusing short videos: a randomized controlled trial.xlsx by Yu-fan Li (22162804)

    Published 2025
    “…Post-intervention, TCG reaction times significantly decreased (p < 0.05), aligning with LCG and differing significantly from OCG. …”
  16. 276

    Data Sheet 1_Effects of methylphenidate on children with attention deficit hyperactivity disorder: a study using clinical and multimodal approaches including go/no-go task and func... by Fang Shen (495200)

    Published 2025
    “…Concurrently, they showed significantly increased accuracy on both go and no-go trials, along with significantly decreased reaction times on go trials. …”
  17. 277

    Theoretical framework. by Dinku Mechal (21273002)

    Published 2025
    “…Assumptions of linear multivariate regression were checked and the level of significance determined at a 95% CI and p-value <0.05. …”
  18. 278

    Supplementary file survey questioner annex. by Dinku Mechal (21273002)

    Published 2025
    “…Assumptions of linear multivariate regression were checked and the level of significance determined at a 95% CI and p-value <0.05. …”
  19. 279
  20. 280

    Timeline for study enrollment. by Saeun Park (20410160)

    Published 2024
    “…Urgent attention and action are needed to support the mental well-being of this vulnerable population. …”