-
41
IRBMO vs. meta-heuristic algorithms boxplot.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …”
-
42
IRBMO vs. feature selection algorithm boxplot.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …”
-
43
-
44
Best optimizer results of LightGBM.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
45
Best optimizer results of Adaboost.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
46
Best optimizer results of LightGBM.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
47
Random forest with hyperparameter optimization.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
48
Best optimizer results of KNN.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
49
Best optimizer results of KNN.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
50
-
51
Supplementary file 1_Encodings of the weighted MAX k-CUT problem on qubit systems.pdf
Published 2025: “…This study explores encoding methods for MAX k-CUT on qubit systems by utilizing quantum approximate optimization algorithms (QAOA) and addressing the challenge of encoding integer values on quantum devices with binary variables. …”
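The entry above (no. 51) concerns encoding integer vertex labels with binary variables for QAOA. Purely to illustrate the bookkeeping involved, and not as the paper's actual encoding scheme, the following sketch packs each vertex's color into ceil(log2 k) bits and evaluates the weighted MAX k-CUT objective classically from one flat bitstring; the toy graph, weights, and helper names are assumptions.

# Classical sketch of a compact binary encoding for weighted MAX k-CUT.
# Assumption-based illustration only, not the paper's encoding.
from math import ceil, log2

def bits_per_vertex(k: int) -> int:
    """Number of binary variables needed to label one vertex with k colors."""
    return max(1, ceil(log2(k)))

def decode_colors(bitstring: str, n_vertices: int, k: int) -> list[int]:
    """Read vertex colors back out of a flat bitstring (invalid codes wrap mod k)."""
    m = bits_per_vertex(k)
    return [int(bitstring[v * m:(v + 1) * m], 2) % k for v in range(n_vertices)]

def weighted_max_kcut_value(bitstring: str, edges: dict, n_vertices: int, k: int) -> float:
    """Total weight of edges whose endpoints receive different colors."""
    colors = decode_colors(bitstring, n_vertices, k)
    return sum(w for (u, v), w in edges.items() if colors[u] != colors[v])

# Toy example: a weighted triangle with k = 3, so each vertex uses 2 bits.
edges = {(0, 1): 1.0, (1, 2): 2.0, (0, 2): 0.5}
assignment = "00" + "01" + "10"   # vertex 0 -> color 0, vertex 1 -> color 1, vertex 2 -> color 2
print(weighted_max_kcut_value(assignment, edges, n_vertices=3, k=3))  # 3.5

In an actual QAOA workflow this objective would be promoted to a cost Hamiltonian over the same binary variables; the classical function above only serves to check assignments.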
-
52
Parameter settings of the comparison algorithms.
Published 2024: “…In this paper, we present an improved mountain gazelle optimizer (IMGO) based on the newly proposed mountain gazelle optimizer (MGO) and design a binary version of IMGO (BIMGO) to solve the feature selection problem for medical data. …”
-
53
Best optimizer results of Decision tree.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
54
Best optimizer result for Adaboost classifier.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
55
Best optimizer results for random forest.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
56
Best optimizer results of Decision tree.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
57
Best optimizer results of Extra tree.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
58
Best optimizer results of Random Forest.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
-
59
Best optimizer result for Extra tree.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
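Entries 44 through 59 above all excerpt the same 2024 study, in which feature subsets are chosen with a binary grey wolf optimizer (BGWO) and model hyperparameters are tuned with a genetic algorithm. As a rough illustration of the BGWO half only (the genetic-algorithm step is not shown), here is a minimal sketch on synthetic data; the KNN-based fitness, the sigmoid transfer function, and all parameter values are assumptions rather than the study's implementation.

# Minimal binary grey wolf optimization (BGWO) sketch for feature selection.
# Assumption-based illustration, not the cited study's code.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, n_informative=6, random_state=0)
n_feat = X.shape[1]

def fitness(mask):
    """Lower is better: cross-validated KNN error plus a small penalty on subset size."""
    if mask.sum() == 0:
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(), X[:, mask.astype(bool)], y, cv=3).mean()
    return (1.0 - acc) + 0.01 * mask.sum() / n_feat

def transfer(x):
    """Sigmoid transfer function mapping a continuous position to a bit probability (assumed)."""
    return 1.0 / (1.0 + np.exp(-10.0 * (x - 0.5)))

n_wolves, n_iter = 10, 30
pos = rng.random((n_wolves, n_feat))                    # continuous wolf positions in [0, 1]
masks = (rng.random((n_wolves, n_feat)) < transfer(pos)).astype(int)
scores = np.array([fitness(m) for m in masks])
best_mask, best_score = masks[scores.argmin()].copy(), scores.min()

for t in range(n_iter):
    alpha, beta, delta = pos[np.argsort(scores)[:3]]    # three best wolves guide the pack
    a = 2.0 * (1.0 - t / n_iter)                        # exploration factor shrinks to 0
    for i in range(n_wolves):
        new = np.zeros(n_feat)
        for leader in (alpha, beta, delta):
            A = 2.0 * a * rng.random(n_feat) - a
            C = 2.0 * rng.random(n_feat)
            new += leader - A * np.abs(C * leader - pos[i])
        pos[i] = np.clip(new / 3.0, 0.0, 1.0)
        masks[i] = (rng.random(n_feat) < transfer(pos[i])).astype(int)
        scores[i] = fitness(masks[i])
    if scores.min() < best_score:
        best_score, best_mask = scores.min(), masks[scores.argmin()].copy()

print("selected features:", np.flatnonzero(best_mask), "fitness:", round(best_score, 4))

The fitness here trades classification accuracy against subset size, which is why a sparse mask can beat a slightly more accurate but larger one; the genetic-algorithm hyperparameter search mentioned in the excerpts would act on the classifier's own parameters and is deliberately left out.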
-
60
An Example of a WPT-MEC Network.
Published 2025: “…Numerical results validate the superiority of the proposed approach, demonstrating significant gains in computation performance and time efficiency compared to conventional techniques, making real-time and optimal offloading design truly viable even in a fast-fading environment. …”