-
161
Parameter settings for metaheuristic algorithms.
Published 2025: “…In the experimental section, we validate the efficiency and superiority of LSWOA by comparing it with outstanding metaheuristic algorithms and excellent WOA variants. The experimental results show that LSWOA exhibits significant optimization performance on the benchmark functions with various dimensions. …”
-
162
Mean training time of different algorithms.
Published 2023: “…The results show: (1) the global convergence probability of SGWO was 1, and its process was a finite homogeneous Markov chain with an absorption state; (2) SGWO not only has better optimization performance when solving complex functions of different dimensions, but also when applied to Elman for parameter optimization, SGWO can significantly optimize the network structure and SGWO-Elman has accurate prediction performance. …”
-
163
Algorithm ranking under different dimensions.
Published 2023: “…The results show: (1) the global convergence probability of SGWO was 1, and its process was a finite homogeneous Markov chain with an absorption state; (2) SGWO not only has better optimization performance when solving complex functions of different dimensions, but also when applied to Elman for parameter optimization, SGWO can significantly optimize the network structure and SGWO-Elman has accurate prediction performance. …”
-
166
Parameter sets of the chosen algorithms.
Published 2024: “…The IERWHO algorithm is an improved Wild Horse Optimization (WHO) algorithm that combines the concepts of a chaotic sequence factor, a nonlinear factor, and an inertia weight factor. …”
-
167
The flow chart of IERWHO algorithm.
Published 2024: “…The IERWHO algorithm is an improved Wild Horse Optimization (WHO) algorithm that combines the concepts of a chaotic sequence factor, a nonlinear factor, and an inertia weight factor. …”
-
168
The flow chart of WHO algorithm.
Published 2024: “…The IERWHO algorithm is an improved Wild Horse Optimization (WHO) algorithm that combines the concepts of a chaotic sequence factor, a nonlinear factor, and an inertia weight factor. …”
-
169
Parameter settings of the compared algorithms.
Published 2025: “…Experimental validation shows that on 23 benchmark functions and the CEC2022 test suite, MESBOA significantly outperforms the original Secretary Bird Optimization Algorithm (SBOA) and other comparative algorithms (such as GWO, WOA, PSO, etc.) in terms of convergence speed, solution accuracy, and stability. …”
-
170
CEC2017 test function test results.
Published 2025: “…The optimal individual’s position is updated by randomly selecting from these factors, enhancing the algorithm’s ability to attain the global optimum and increasing its overall robustness. …”
-
172
Flowchart of the specific incarnation of the BO algorithm used in the experiments.
Published 2020: “…To choose the next pipeline configuration to evaluate, the BO algorithm uses an Expected Improvement function to trade off maximisation of QS with the need to fully learn the GP. …”
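The Expected Improvement acquisition mentioned in the snippet above has a standard closed form under a Gaussian-process posterior. The sketch below is a generic illustration, not the cited paper's implementation; the exploration margin `xi` and the maximisation setting are assumptions for the example.

```python
import math

def expected_improvement(mu, sigma, best, xi=0.01):
    """Closed-form Expected Improvement for a maximisation problem.

    mu, sigma: GP posterior mean and standard deviation at a candidate.
    best: best objective value observed so far.
    xi: exploration margin (hypothetical default, not from the paper).
    """
    if sigma <= 0.0:
        return 0.0  # a noiseless, fully-known point offers no improvement
    z = (mu - best - xi) / sigma
    # Standard normal pdf and cdf via math.erf (no SciPy required).
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (mu - best - xi) * cdf + sigma * pdf

# A candidate whose posterior mean beats the incumbent scores high...
ei_good = expected_improvement(mu=1.2, sigma=0.3, best=1.0)
# ...while a confident posterior below the incumbent scores near zero.
ei_bad = expected_improvement(mu=0.5, sigma=0.01, best=1.0)
```

The two terms make the exploitation/exploration trade-off explicit: the first rewards a mean above the incumbent, the second rewards posterior uncertainty, which is why BO also evaluates points it merely knows little about.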
-
176
Description of unimodal benchmark functions.
Published 2024: “…This paper proposes the Modulated Whale Optimization Algorithm (MWOA), an innovative metaheuristic algorithm derived from the classic WOA and tailored for bionics-inspired optimization. …”
-
177
Description of multimodal benchmark functions.
Published 2024: “…This paper proposes the Modulated Whale Optimization Algorithm (MWOA), an innovative metaheuristic algorithm derived from the classic WOA and tailored for bionics-inspired optimization. …”
-
179
Statistical results of various algorithms.
Published 2025: “…Secondly, the WOA’s position update formula was modified by incorporating inertia weight ω and enhancing convergence factor α, thus improving its capability for local search. Furthermore, inspired by the grey wolf optimization algorithm, the three best particles guide the encircling strategy instead of the original randomly selected particles. …”
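The inertia-weighted position update described above can be sketched from the classic WOA encircling move, X(t+1) = X* − A·|C·X* − X|, with A = 2a·r − a and the convergence factor a decreasing from 2 to 0. The cited variant's exact formula is not given in the snippet, so placing ω on the leader term here is an assumption for illustration.

```python
import random

def woa_encircle_step(x, x_best, t, t_max, omega=0.9):
    """One shrinking-encircling WOA update with an inertia weight.

    x: current position (list of floats); x_best: leader position.
    t, t_max: current and maximum iteration counts.
    omega: inertia weight on the leader term (assumed placement).
    """
    a = 2.0 * (1.0 - t / t_max)          # convergence factor: 2 -> 0
    new_x = []
    for xi, bi in zip(x, x_best):
        r = random.random()
        A = 2.0 * a * r - a              # |A| < a; small a pulls inward
        C = 2.0 * random.random()
        D = abs(C * bi - xi)             # distance to the leader
        new_x.append(omega * bi - A * D) # inertia-weighted leader term
    return new_x
```

Late in the run a is small, so |A| shrinks and the step contracts toward the leader, which is the local-search behaviour the inertia weight is meant to strengthen; a full variant would repeat this per whale per iteration and, as the snippet suggests, blend the three best particles rather than a single leader.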
-
180