Showing 21 - 40 results of 1,036 for search '(( algorithm brain function ) OR ( algorithm both function ))*', query time: 0.31s
25. Efficient Algorithms for GPU Accelerated Evaluation of the DFT Exchange-Correlation Functional
    by Ryan Stocks (16867476)
    Published 2025
    “…Kohn–Sham density functional theory (KS-DFT) has become a cornerstone for studying the electronic structure of molecules and materials. …”

27. RFAConv working principle.
    by Pingping Yan (462509)
    Published 2025
    “…Using YOLOv8n as the baseline algorithm, the activation function SiLU in the CBS at the backbone network’s SPPF is replaced with ReLU, which reduces interdependencies among parameters. …”

28. PConv working principle.
    by Pingping Yan (462509)
    Published 2025
    “…Using YOLOv8n as the baseline algorithm, the activation function SiLU in the CBS at the backbone network’s SPPF is replaced with ReLU, which reduces interdependencies among parameters. …”

29. Improvement of SPPF to SPPF-R process.
    by Pingping Yan (462509)
    Published 2025
    “…Using YOLOv8n as the baseline algorithm, the activation function SiLU in the CBS at the backbone network’s SPPF is replaced with ReLU, which reduces interdependencies among parameters. …”

30. PR comparison on RSOD dataset.
    by Pingping Yan (462509)
    Published 2025
    “…Using YOLOv8n as the baseline algorithm, the activation function SiLU in the CBS at the backbone network’s SPPF is replaced with ReLU, which reduces interdependencies among parameters. …”

31. Ablation study on the RSOD dataset.
    by Pingping Yan (462509)
    Published 2025
    “…Using YOLOv8n as the baseline algorithm, the activation function SiLU in the CBS at the backbone network’s SPPF is replaced with ReLU, which reduces interdependencies among parameters. …”

32. Structure and working principle of LI-YOLOv8.
    by Pingping Yan (462509)
    Published 2025
    “…Using YOLOv8n as the baseline algorithm, the activation function SiLU in the CBS at the backbone network’s SPPF is replaced with ReLU, which reduces interdependencies among parameters. …”

33. C2f-E improvement process.
    by Pingping Yan (462509)
    Published 2025
    “…Using YOLOv8n as the baseline algorithm, the activation function SiLU in the CBS at the backbone network’s SPPF is replaced with ReLU, which reduces interdependencies among parameters. …”

34. Structure of Detect and GP-Detect.
    by Pingping Yan (462509)
    Published 2025
    “…Using YOLOv8n as the baseline algorithm, the activation function SiLU in the CBS at the backbone network’s SPPF is replaced with ReLU, which reduces interdependencies among parameters. …”

35. YOLOv8 structure and working principle.
    by Pingping Yan (462509)
    Published 2025
    “…Using YOLOv8n as the baseline algorithm, the activation function SiLU in the CBS at the backbone network’s SPPF is replaced with ReLU, which reduces interdependencies among parameters. …”

36. Improvement of CBS to CBR process.
    by Pingping Yan (462509)
    Published 2025
    “…Using YOLOv8n as the baseline algorithm, the activation function SiLU in the CBS at the backbone network’s SPPF is replaced with ReLU, which reduces interdependencies among parameters. …”

37. EMA attention mechanism working principle.
    by Pingping Yan (462509)
    Published 2025
    “…Using YOLOv8n as the baseline algorithm, the activation function SiLU in the CBS at the backbone network’s SPPF is replaced with ReLU, which reduces interdependencies among parameters. …”

38. Ablation study on the NWPU VHR-10 dataset.
    by Pingping Yan (462509)
    Published 2025
    “…Using YOLOv8n as the baseline algorithm, the activation function SiLU in the CBS at the backbone network’s SPPF is replaced with ReLU, which reduces interdependencies among parameters. …”

39. GSConv working principle.
    by Pingping Yan (462509)
    Published 2025
    “…Using YOLOv8n as the baseline algorithm, the activation function SiLU in the CBS at the backbone network’s SPPF is replaced with ReLU, which reduces interdependencies among parameters. …”

40. PR comparison on NWPU VHR-10 dataset.
    by Pingping Yan (462509)
    Published 2025
    “…Using YOLOv8n as the baseline algorithm, the activation function SiLU in the CBS at the backbone network’s SPPF is replaced with ReLU, which reduces interdependencies among parameters. …”

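The Pingping Yan entries above (figures from a single LI-YOLOv8 article) all quote the same modification: in the CBS (Conv + BatchNorm + SiLU) block feeding the backbone's SPPF of YOLOv8n, the SiLU activation is replaced with ReLU, giving the CBR block named in the "Improvement of CBS to CBR process" entry. The snippet contains no code, so the PyTorch sketch below only illustrates what such a swap could look like; the module definitions and the kernel-size and stride defaults are assumptions, not the article's implementation.

# Minimal sketch (not the article's code) of the CBS -> CBR change described in
# the snippets above: a Conv + BatchNorm + SiLU block whose activation is
# replaced by ReLU. Kernel size, stride, and padding are illustrative defaults.
import torch
import torch.nn as nn


class CBS(nn.Module):
    """Conv + BatchNorm + SiLU, as used in stock YOLOv8-style backbones."""

    def __init__(self, c_in: int, c_out: int, k: int = 3, s: int = 1):
        super().__init__()
        self.conv = nn.Conv2d(c_in, c_out, k, s, padding=k // 2, bias=False)
        self.bn = nn.BatchNorm2d(c_out)
        self.act = nn.SiLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.conv(x)))


class CBR(CBS):
    """The same block with SiLU swapped for ReLU (the CBS -> CBR modification)."""

    def __init__(self, c_in: int, c_out: int, k: int = 3, s: int = 1):
        super().__init__(c_in, c_out, k, s)
        self.act = nn.ReLU(inplace=True)


if __name__ == "__main__":
    x = torch.randn(1, 64, 40, 40)   # dummy feature map
    print(CBS(64, 64)(x).shape)      # baseline Conv-BN-SiLU block
    print(CBR(64, 64)(x).shape)      # ReLU variant, identical output shape

Defining CBR as a subclass that only overrides the activation keeps the swap localized to one line; the article may implement the change differently.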