Showing 1 - 20 results of 204 for search '(( learning leads decrease ) OR ( ct ((largest decrease) OR (marked decrease)) ))', query time: 0.47s
  17.

    Model and learning rule. by Janis Keck (21587252)

    Published 2025
    “…The right column has the same pre- and postsynaptic activities as the left column, only in reverse order. In <b>(C)</b>, the learning rule with parameters is used, while in <b>(D)</b> … Only in the latter are the synaptic weight changes preserved (in reverse order), while in <b>(C)</b>, postsynaptic activity before presynaptic activity leads to a net weight decrease. …”
  18.

    Evaluation of the effectiveness of double task. by Fan Yang (1413)

    Published 2025
    “…Yet, these methods still encounter two primary challenges. Firstly, deep learning methods are sensitive to weak edges. Secondly, the high cost of annotating medical image data results in a lack of labeled data, leading to overfitting during model training. …”
  19.

    Evaluation of the effectiveness of pruning. by Fan Yang (1413)

    Published 2025
    “…Yet, these methods still encounter two primary challenges. Firstly, deep learning methods are sensitive to weak edges. Secondly, the high cost of annotating medical image data results in a lack of labeled data, leading to overfitting during model training. …”
  20.

    The summary of ablation experiment. by Fan Yang (1413)

    Published 2025
    “…Yet, these methods still encounter two primary challenges. Firstly, deep learning methods are sensitive to weak edges. Secondly, the high cost of annotating medical image data results in a lack of labeled data, leading to overfitting during model training. …”