Search alternatives:
resource optimization » resource utilization, resource utilisation, resource limitations
feature optimization » feature elimination, structure optimization, iterative optimization
task resource » a resource
also feature » a feature, all features, wise feature
binary task » binary mask
11. IRBMO vs. feature selection algorithm boxplot.
Published 2025. “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
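Several entries here describe the same binarization step: a metaheuristic's continuous position vector is mapped to a binary feature mask through a transfer function. A minimal sketch of that standard technique, assuming an S-shaped (sigmoid) transfer function; the function and variable names below are illustrative, not taken from the cited papers:

```python
import numpy as np

def sigmoid_transfer(x):
    # S-shaped transfer function: maps a continuous position to (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def binarize(position, rng):
    # A dimension (feature) is selected (bit = 1) when a uniform random
    # draw falls below the transfer probability for that dimension.
    probs = sigmoid_transfer(position)
    return (rng.random(position.shape) < probs).astype(int)

rng = np.random.default_rng(0)
continuous_position = rng.normal(size=10)   # e.g. one search agent
feature_mask = binarize(continuous_position, rng)
print(feature_mask)  # binary vector: 1 = feature kept, 0 = feature dropped
```

V-shaped transfer functions (e.g. |tanh(x)|) are a common alternative; which family works better is problem-dependent.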
12. CDF of task latency, approximated as the inverse of the achieved computation rate.
Published 2025.
17. The Pseudo-Code of the IRBMO Algorithm.
Published 2025. “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
18. Feature selection results.
Published 2025. “…Further integrate the binary variant of OcOA (bOcOA) for effective feature selection, which reduces the average classification error to 0.4237 and increases CNN accuracy to 93.48%. …”
19. Algorithm for generating hyperparameters.
Published 2024. “…Motivated by the above, in this proposal, we design an improved model to predict the existence of respiratory disease among patients by incorporating hyperparameter optimization and feature selection. To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …”
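Entry 19 pairs two ideas: an evolutionary search tunes model hyperparameters while a binary optimizer prunes the feature set. A mutation-only evolutionary loop can sketch the hyperparameter side; the search space, bounds, and toy fitness function below are assumptions standing in for cross-validated model accuracy, not the cited paper's setup:

```python
import random

random.seed(0)

# Toy search space: (learning_rate, max_depth); fitness peaks near (0.1, 6).
def fitness(lr, depth):
    # Stand-in for the validation accuracy of a real model.
    return -((lr - 0.1) ** 2) - 0.01 * (depth - 6) ** 2

def random_individual():
    return (random.uniform(0.001, 1.0), random.randint(1, 12))

def mutate(ind):
    # Perturb each hyperparameter and clamp it back into its bounds.
    lr, depth = ind
    return (min(1.0, max(0.001, lr + random.gauss(0, 0.05))),
            min(12, max(1, depth + random.choice([-1, 0, 1]))))

population = [random_individual() for _ in range(20)]
for generation in range(30):
    # Rank by fitness, keep the better half, refill by mutating survivors.
    population.sort(key=lambda ind: fitness(*ind), reverse=True)
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

best_lr, best_depth = max(population, key=lambda ind: fitness(*ind))
print(best_lr, best_depth)  # typically converges near lr ≈ 0.1, depth ≈ 6
```

A full genetic algorithm would add crossover between parents; this (μ+λ)-style loop keeps the sketch short while preserving the select-then-vary structure.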
20. IRBMO vs. meta-heuristic algorithms boxplot.
Published 2025. “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”