-
61
Simulation results for the average reward rate function using the UCB algorithm, where A = 100, ℓ = 20, μ = [0.75, …(×50), 0.5, …(×50)], and T = 1000.
Published 2022.
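For orientation, a minimal sketch of the kind of UCB simulation the caption describes, assuming Bernoulli rewards and the standard UCB1 index; the role of ℓ = 20 is not stated in the snippet, so it is omitted, while the arm means, number of arms, and horizon follow the caption.

# Minimal UCB1 simulation sketch (assumptions: Bernoulli rewards, standard UCB1 index).
import numpy as np

rng = np.random.default_rng(0)

A, T = 100, 1000                          # number of arms, horizon (from the caption)
mu = np.array([0.75] * 50 + [0.5] * 50)   # arm means [0.75 x50, 0.5 x50] (from the caption)

counts = np.zeros(A)                      # pulls per arm
sums = np.zeros(A)                        # cumulative reward per arm
total_reward = 0.0

for t in range(1, T + 1):
    if t <= A:
        arm = t - 1                       # pull each arm once to initialize
    else:
        means = sums / counts
        bonus = np.sqrt(2.0 * np.log(t) / counts)
        arm = int(np.argmax(means + bonus))   # UCB1 index: empirical mean + exploration bonus
    reward = rng.binomial(1, mu[arm])     # Bernoulli reward draw
    counts[arm] += 1
    sums[arm] += reward
    total_reward += reward

print("average reward rate:", total_reward / T)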
-
65
Brief sketch of the quasi-attraction/alignment algorithm.
Published 2023. “…The focal agent selects its next direction randomly based on … . (D) A brief sketch of the avoidance algorithm. Upper: Each direction is extended to the repulsion area = {r : |r| = R}, where … is the minimal sphere cap that covers all points on … .”
-
67
Rosenbrock function losses for .
Published 2025. “…This approach bridges the gap between model accuracy and optimization efficiency, offering a practical solution for optimizing non-differentiable machine learning models that can be extended to other tree-based ensemble algorithms. …”
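For reference, a minimal sketch of the standard d-dimensional Rosenbrock benchmark the caption refers to; the dimension and evaluation setup used in the paper are not given in the snippet.

# Standard Rosenbrock benchmark: f(x) = sum(100*(x[i+1] - x[i]^2)^2 + (1 - x[i])^2),
# with global minimum f = 0 at x = (1, ..., 1).
import numpy as np

def rosenbrock(x):
    x = np.asarray(x, dtype=float)
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

print(rosenbrock([1.0, 1.0, 1.0]))  # 0.0 at the global minimum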
-
68
Rosenbrock function losses for .
Published 2025. “…This approach bridges the gap between model accuracy and optimization efficiency, offering a practical solution for optimizing non-differentiable machine learning models that can be extended to other tree-based ensemble algorithms. …”
-
69
Levy function losses for .
Published 2025. “…This approach bridges the gap between model accuracy and optimization efficiency, offering a practical solution for optimizing non-differentiable machine learning models that can be extended to other tree-based ensemble algorithms. …”
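For reference, a minimal sketch of the standard d-dimensional Levy benchmark named in the caption; the exact configuration used in the paper is not given in the snippet.

# Standard Levy benchmark with w_i = 1 + (x_i - 1)/4; global minimum f = 0 at x = (1, ..., 1).
import numpy as np

def levy(x):
    x = np.asarray(x, dtype=float)
    w = 1.0 + (x - 1.0) / 4.0
    term1 = np.sin(np.pi * w[0]) ** 2
    term2 = np.sum((w[:-1] - 1.0) ** 2 * (1.0 + 10.0 * np.sin(np.pi * w[:-1] + 1.0) ** 2))
    term3 = (w[-1] - 1.0) ** 2 * (1.0 + np.sin(2.0 * np.pi * w[-1]) ** 2)
    return term1 + term2 + term3

print(levy([1.0, 1.0, 1.0]))  # ~0.0 at the global minimum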
-
70
Rastrigin function losses for .
Published 2025. “…This approach bridges the gap between model accuracy and optimization efficiency, offering a practical solution for optimizing non-differentiable machine learning models that can be extended to other tree-based ensemble algorithms. …”
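For reference, a minimal sketch of the standard d-dimensional Rastrigin benchmark named in the caption; again, the paper's exact configuration is not given in the snippet.

# Standard Rastrigin benchmark: f(x) = 10*d + sum(x_i^2 - 10*cos(2*pi*x_i)),
# with global minimum f = 0 at x = (0, ..., 0).
import numpy as np

def rastrigin(x):
    x = np.asarray(x, dtype=float)
    return 10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x))

print(rastrigin([0.0, 0.0, 0.0]))  # 0.0 at the global minimum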
-
71
Levy function losses for .
Published 2025. “…This approach bridges the gap between model accuracy and optimization efficiency, offering a practical solution for optimizing non-differentiable machine learning models that can be extended to other tree-based ensemble algorithms. …”
-
72
Rastrigin function losses for .
Published 2025. “…This approach bridges the gap between model accuracy and optimization efficiency, offering a practical solution for optimizing non-differentiable machine learning models that can be extended to other tree-based ensemble algorithms. …”
-
73
Levy function losses for .
Published 2025. “…This approach bridges the gap between model accuracy and optimization efficiency, offering a practical solution for optimizing non-differentiable machine learning models that can be extended to other tree-based ensemble algorithms. …”
-
74
Levy function losses for .
Published 2025. “…This approach bridges the gap between model accuracy and optimization efficiency, offering a practical solution for optimizing non-differentiable machine learning models that can be extended to other tree-based ensemble algorithms. …”
-
75
Rastrigin function losses for .
Published 2025. “…This approach bridges the gap between model accuracy and optimization efficiency, offering a practical solution for optimizing non-differentiable machine learning models that can be extended to other tree-based ensemble algorithms. …”
-
76
Rastrigin function losses for .
Published 2025. “…This approach bridges the gap between model accuracy and optimization efficiency, offering a practical solution for optimizing non-differentiable machine learning models that can be extended to other tree-based ensemble algorithms. …”
-
77
Rosenbrock function losses for .
Published 2025. “…This approach bridges the gap between model accuracy and optimization efficiency, offering a practical solution for optimizing non-differentiable machine learning models that can be extended to other tree-based ensemble algorithms. …”
-
78
NLDock: A Fast Nucleic Acid–Ligand Docking Algorithm for Modeling RNA/DNA–Ligand Complexes
Published 2021. “…Here, we have developed a fast nucleic acid–ligand docking algorithm, named NLDock, by implementing our intrinsic scoring function ITScoreNL for nucleic acid–ligand interactions into a modified version of the MDock program. …”