Showing results 1–20 of 597 for search '(( learning ((we decrease) OR (nn decrease)) ) OR ( ai ((large decrease) OR (marked decrease)) ))', query time: 0.50s
  1. Data Sheet 1_Emotional prompting amplifies disinformation generation in AI large language models.docx by Rasita Vinay (21006911)

    Published 2025
    “…The emergence of artificial intelligence (AI) large language models (LLMs), which can produce text that closely resembles human-written content, presents both opportunities and risks. …”
  5. Overview of the WeARTolerance program. by Ana Beato (20489933)

    Published 2024
    “…The quantitative results from Phase 1 demonstrated a decreasing trend in all primary outcomes. In phase 2, participants acknowledged the activities’ relevance, reported overall satisfaction with the program, and showed great enthusiasm and willingness to learn more. …”
  10. Machine Learning Models for High Explosive Crystal Density and Performance by Jack V. Davis (9175514)

    Published 2024
    “…Using these values, we trained machine learning models for the prediction of density, detonation velocity and detonation pressure. …”
  13. Data from: Colony losses of stingless bees increase in agricultural areas, but decrease in forested areas by Malena Sibaja Leyton (18400983)

    Published 2025
    “…On average, meliponiculturists lost 43.4 % of their stingless bee colonies annually, 33.3 % during the rainy season, and 22.0 % during the dry season. We found that colony losses during the rainy season decreased with higher abundance of forested areas and increased with higher abundance of agricultural area around meliponaries. …”
  14. Effect of active learning. by Jia Li (160557)

    Published 2025
    “…This paper introduces a novel classification algorithm, ASGBC, intended to tackle related challenges in diagnosing gallbladder cancer using B-ultrasound images. Firstly, we combine active learning with self-supervised learning to decrease the reliance on labeled data. …”
  15. Model and learning rule. by Janis Keck (21587252)

    Published 2025
    “…(C), (D) Invariance of learning rules with respect to temporal order. We plot the synaptic weight change of a single synapse in a setup with a single pre- and postsynaptic neuron, respectively. …”
  16. A novel RNN architecture to improve the precision of ship trajectory predictions by Martha Dais Ferreira (18704596)

    Published 2025
    “…This type of monitoring often relies on the use of navigation systems such as the Automatic Identification System (AIS). AIS data has been used to support the defense teams when identifying equipment defects, locating suspicious activity, ensuring ship collision avoidance, and detecting hazardous events. …”
  18. Evaluation of the effectiveness of double task. by Fan Yang (1413)

    Published 2025
    “…The Spatial Attention Based Dual-Branch Information Fusion Block links these branches, enabling mutual benefit. Furthermore, we present a structured pruning method grounded in channel attention to decrease parameter count, mitigate overfitting, and uphold segmentation accuracy. …”
  19. Evaluation of the effectiveness of pruning. by Fan Yang (1413)

    Published 2025
    “…The Spatial Attention Based Dual-Branch Information Fusion Block links these branches, enabling mutual benefit. Furthermore, we present a structured pruning method grounded in channel attention to decrease parameter count, mitigate overfitting, and uphold segmentation accuracy. …”
  20. The summary of ablation experiment. by Fan Yang (1413)

    Published 2025
    “…The Spatial Attention Based Dual-Branch Information Fusion Block links these branches, enabling mutual benefit. Furthermore, we present a structured pruning method grounded in channel attention to decrease parameter count, mitigate overfitting, and uphold segmentation accuracy. …”