Showing 1 - 20 results of 1,119 for search '(( learning sizes decrease ) OR ( b ((large decrease) OR (larger decrease)) ))', query time: 0.41s
  2. The introduction of mutualisms into assembled communities increases their connectance and complexity while decreasing their richness. by Gui Araujo (22170819)

    Published 2025
    “…When they stop being introduced in further assembly events (i.e. introduced species do not carry any mutualistic interactions), their proportion slowly decreases with successive invasions. (B) Even though higher proportions of mutualism promote higher richness, introducing this type of interaction into already assembled large communities promotes a sudden drop in richness, while stopping mutualism promotes a slight boost in richness increase. …”
  5. Biases in larger populations. by Sander W. Keemink (21253563)

    Published 2025
    “…(A) Maximum absolute bias vs the number of neurons in the population for the Bayesian decoder. …”
  14. Geographical distribution of large cities and small cities. by Saul Estrin (8629173)

    Published 2024
    “…The Figure reveals two patterns: 1) the maximum level of innovation is higher in large cities (2.53) than in small cities (2.02); 2) among large cities in (a), innovation levels in general decrease with nightlight density. …”
  17. Strategy parameters across development for female mice in set size = 2 and set size = 4 from winning computational model. by Juliana Chase (20469427)

    Published 2024
    “…RL parameter α+ learning rate and decision noise parameter softmax β were stable across development in both set size = 2 (A-B) and set size = 4 (F-G). …”
  18. Strategy parameters across development for male mice in set size = 2 and set size = 4 from winning computational model. by Juliana Chase (20469427)

    Published 2024
    “…RL parameter α+ learning rate and decision noise parameter softmax β were stable across development in both set size = 2 (A-B) and set size = 4 (F-G). …” (See the sketch after this result list for an illustration of these two parameters.)
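Results 17 and 18 name two parameters of the winning computational model: a learning rate α+ and a softmax decision-noise parameter β. The snippets do not state the update rule, so the following is only a minimal sketch of how such parameters typically enter a reinforcement-learning model of choice: a Q-value update applied with α+ after rewarded trials and a softmax choice rule whose inverse temperature is β. The function names, reward rule, and parameter values are illustrative assumptions, not the authors' code.

    import numpy as np

    # Minimal sketch (assumed, not the authors' implementation) of a
    # positive-outcome learning rate alpha_plus and a softmax decision
    # parameter beta, as commonly used in RL models of set-size tasks.

    def softmax_choice(q_values, beta, rng):
        """Choose an action with probability proportional to exp(beta * Q)."""
        logits = beta * np.asarray(q_values, dtype=float)
        logits -= logits.max()                       # numerical stability
        probs = np.exp(logits) / np.exp(logits).sum()
        return rng.choice(len(q_values), p=probs)

    def update_q(q_values, action, reward, alpha_plus):
        """Update the chosen action's value only after rewarded outcomes."""
        if reward > 0:
            q_values[action] += alpha_plus * (reward - q_values[action])
        return q_values

    # Example run: set size = 2 (two actions), arbitrary parameter values,
    # and a hypothetical reward rule in which action 0 is always correct.
    rng = np.random.default_rng(0)
    q = np.zeros(2)
    for trial in range(100):
        a = softmax_choice(q, beta=3.0, rng=rng)
        r = float(a == 0)
        q = update_q(q, a, r, alpha_plus=0.2)
    print(q)  # Q-value for action 0 approaches 1; action 1 stays near 0

A larger β makes choices more deterministic (less decision noise), while a larger α+ makes values track recent rewarded outcomes more quickly; the stability of both parameters across development is what panels A-B and F-G of those results report.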