Search alternatives:
design optimization » bayesian optimization
where optimization » whale optimization, phase optimization, other optimization
binary based » library based, linac based, binary mask
binary data » primary data, dietary data
data where » data were, dataset where
- 1. Proposed Algorithm. Published 2025: “…EHRL integrates Reinforcement Learning (RL) with Deep Neural Networks (DNNs) to dynamically optimize binary offloading decisions, which in turn obviates the requirement for manually labeled training data and thus avoids the need for solving complex optimization problems repeatedly. …” (See the offloading sketch after this list.)
- 2. Comparisons between ADAM and NADAM optimizers. Published 2025: same article and abstract as item 1. (See the Adam/Nadam update sketch after this list.)
- 3.
- 4.
- 5. DE algorithm flow. Published 2025: “…To solve the problems of insufficient global optimization ability and easy loss of population diversity in building interior layout design, this study proposes a novel layout optimization model integrating interactive genetic algorithm and improved differential evolutionary algorithm to improve the global optimization ability and maintain population diversity in building layout design. …” (See the differential evolution sketch after this list.)
- 6. Test results of different algorithms. Published 2025: same article and abstract as item 5.
- 7. MSE for ILSTM algorithm in binary classification. Published 2023: “…In this paper, a novel, and improved version of the Long Short-Term Memory (ILSTM) algorithm was proposed. The ILSTM is based on the novel integration of the chaotic butterfly optimization algorithm (CBOA) and particle swarm optimization (PSO) to improve the accuracy of the LSTM algorithm. …” (See the PSO sketch after this list.)
- 8. An Example of a WPT-MEC Network. Published 2025: same article and abstract as item 1.
- 9. Related Work Summary. Published 2025: same article and abstract as item 1.
- 10. Simulation parameters. Published 2025: same article and abstract as item 1.
- 11. Training losses for N = 10. Published 2025: same article and abstract as item 1.
- 12. Normalized computation rate for N = 10. Published 2025: same article and abstract as item 1.
- 13. Summary of Notations Used in this paper. Published 2025: same article and abstract as item 1.
- 14. Algorithm for generating hyperparameter. Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …” (See the binary grey wolf feature-selection sketch after this list.)
- 15.
- 16. Results of machine learning algorithm. Published 2024: same article and abstract as item 14.
- 17. ROC comparison of machine learning algorithm. Published 2024: same article and abstract as item 14. (See the ROC comparison sketch after this list.)
- 18.
- 19. Best optimizer results of LightGBM. Published 2024: same article and abstract as item 14.
- 20. Best optimizer results of AdaBoost. Published 2024: same article and abstract as item 14.
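
Item 1's snippet describes the EHRL approach only at a high level: a DNN proposes binary offloading decisions and reinforcement-style feedback replaces manually labeled training data. The Python sketch below illustrates that general pattern (relaxed DNN output, quantization into candidate binary actions, replay-based retraining), not the paper's actual EHRL algorithm; the network shape, channel model, quantization rule, and the compute_rate reward stub are all illustrative assumptions.

```python
# Hypothetical sketch of DNN-driven binary offloading in the spirit of item 1's snippet.
# A small network maps channel gains to a relaxed offloading decision for N devices;
# the decision is quantized into candidate binary actions, the action scoring best
# under a placeholder reward is kept, and the (state, best action) pair is replayed
# to retrain the network. Shapes, constants and compute_rate are assumptions.
import numpy as np
import torch
import torch.nn as nn

N = 10  # number of devices (assumed; echoes the "N = 10" figure captions in items 11-12)

policy = nn.Sequential(nn.Linear(N, 64), nn.ReLU(), nn.Linear(64, N), nn.Sigmoid())
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()
replay = []  # stores (channel gains, best binary offloading action)

def compute_rate(h, x):
    """Placeholder reward: crude stand-in for a weighted-sum computation rate."""
    return float(np.sum(h * x) - 0.1 * np.sum(x))

def quantize(relaxed, k=5):
    """Candidate binary actions: threshold at 0.5, then flip the k most uncertain entries."""
    base = (relaxed > 0.5).astype(int)
    cands = [base]
    for i in np.argsort(np.abs(relaxed - 0.5))[:k]:
        c = base.copy()
        c[i] ^= 1
        cands.append(c)
    return cands

for step in range(1000):
    h = np.random.exponential(1.0, N)  # assumed channel-gain model
    relaxed = policy(torch.tensor(h, dtype=torch.float32)).detach().numpy()
    best = max(quantize(relaxed), key=lambda x: compute_rate(h, x))
    replay.append((h, best))
    if len(replay) >= 32 and step % 10 == 0:  # periodic retraining from the replay memory
        idx = np.random.choice(len(replay), 32)
        hs = torch.tensor(np.array([replay[i][0] for i in idx]), dtype=torch.float32)
        xs = torch.tensor(np.array([replay[i][1] for i in idx]), dtype=torch.float32)
        opt.zero_grad()
        loss_fn(policy(hs), xs).backward()
        opt.step()
```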
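
Item 2 compares the ADAM and NADAM optimizers. As general background (not code from the cited article), the sketch below writes out the textbook single-step update rules side by side; the Nadam form is the simplified variant without Dozat's momentum-decay schedule, and the default hyperparameters are the usual illustrative values.

```python
# Textbook Adam and Nadam parameter updates (one step each), written with numpy so the
# difference is visible: Nadam swaps the bias-corrected momentum for a Nesterov-style
# look-ahead mix of momentum and the current gradient. General background, not the paper's code.
import numpy as np

def adam_step(theta, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g ** 2
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

def nadam_step(theta, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g ** 2
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    m_nes = b1 * m_hat + (1 - b1) * g / (1 - b1 ** t)  # Nesterov look-ahead term
    return theta - lr * m_nes / (np.sqrt(v_hat) + eps), m, v
```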
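
Item 5's "DE algorithm flow" figure refers to differential evolution. To make that flow concrete, the sketch below is a minimal standard DE/rand/1/bin loop; the sphere objective, bounds, population size, F and CR are assumptions, and the paper's improved DE variant and its coupling with an interactive genetic algorithm are not reproduced.

```python
# Minimal DE/rand/1/bin loop: mutation from three random peers, binomial crossover,
# greedy selection. Objective, bounds, F and CR are illustrative assumptions.
import numpy as np

def de(objective, bounds, pop_size=30, F=0.5, CR=0.9, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([objective(x) for x in pop])
    for _ in range(iters):
        for i in range(pop_size):
            peers = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(peers, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)       # mutation
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True                 # guarantee at least one crossed gene
            trial = np.where(cross, mutant, pop[i])         # binomial crossover
            f = objective(trial)
            if f < fit[i]:                                  # greedy selection
                pop[i], fit[i] = trial, f
    return pop[fit.argmin()], fit.min()

best_x, best_f = de(lambda x: float(np.sum(x ** 2)), bounds=[(-5, 5)] * 4)
print(best_x, best_f)
```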
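
Item 7's ILSTM combines an LSTM with a chaotic butterfly optimizer and PSO. The sketch below shows only a bare PSO loop, tuning two hypothetical LSTM hyperparameters (learning rate and hidden units) against a quadratic surrogate that stands in for "train the LSTM, return validation MSE"; the CBOA hybridization and the real LSTM objective are not shown.

```python
# Bare particle swarm optimization over two hypothetical hyperparameters.
# surrogate_mse is a stand-in for training an LSTM and returning its validation MSE.
import numpy as np

def surrogate_mse(lr, hidden):
    # Hypothetical surrogate with a minimum near lr ~ 3e-3 and ~64 hidden units.
    return (np.log10(lr) + 2.5) ** 2 + ((hidden - 64) / 64) ** 2

rng = np.random.default_rng(1)
n, iters, w, c1, c2 = 20, 50, 0.7, 1.5, 1.5        # swarm size, iterations, inertia, pulls
lo, hi = np.array([1e-4, 8.0]), np.array([1e-1, 256.0])
pos = rng.uniform(lo, hi, (n, 2))
vel = np.zeros((n, 2))
pbest, pbest_f = pos.copy(), np.array([surrogate_mse(*p) for p in pos])
gbest = pbest[pbest_f.argmin()]

for _ in range(iters):
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([surrogate_mse(*p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmin()]

print("best (learning rate, hidden units):", gbest, "surrogate MSE:", pbest_f.min())
```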
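
Item 14's snippet mentions feature selection with a binary grey wolf optimization algorithm. The sketch below is one common way such a BGWO is set up: each wolf is a 0/1 feature mask, the alpha/beta/delta leaders pull candidate positions, and a sigmoid transfer turns the continuous update back into a mask. The synthetic dataset, k-NN scorer, penalty weights and swarm settings are illustrative assumptions, and the paper's genetic-algorithm hyperparameter search is omitted.

```python
# Sketch of binary grey wolf optimization for feature selection: wolves encode feature
# masks, fitness is cross-validated error plus a small penalty on mask size.
# Dataset, classifier and all constants are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=20, n_informative=6, random_state=0)
rng = np.random.default_rng(0)
n_wolves, n_iter, dim = 12, 30, X.shape[1]

def fitness(mask):
    if mask.sum() == 0:
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(), X[:, mask.astype(bool)], y, cv=3).mean()
    return 0.99 * (1 - acc) + 0.01 * mask.sum() / dim   # error + feature-count penalty

wolves = rng.integers(0, 2, (n_wolves, dim))
scores = np.array([fitness(w) for w in wolves])

for t in range(n_iter):
    order = scores.argsort()
    alpha, beta, delta = wolves[order[:3]].astype(float)
    a = 2 - 2 * t / n_iter                               # a decreases linearly from 2 to 0
    for i in range(n_wolves):
        x_new = np.zeros(dim)
        for leader in (alpha, beta, delta):
            A = 2 * a * rng.random(dim) - a
            C = 2 * rng.random(dim)
            D = np.abs(C * leader - wolves[i])
            x_new += (leader - A * D) / 3.0
        prob = 1 / (1 + np.exp(-10 * (x_new - 0.5)))     # sigmoid transfer to [0, 1]
        cand = (rng.random(dim) < prob).astype(int)
        f = fitness(cand)
        if f < scores[i]:
            wolves[i], scores[i] = cand, f

best = wolves[scores.argmin()]
print("selected features:", np.flatnonzero(best), "fitness:", scores.min())
```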
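
Item 17 is an ROC comparison figure. For context only, the sketch below shows how such a comparison is commonly produced with scikit-learn and matplotlib; the synthetic data and the two classifiers (logistic regression plus AdaBoost, the latter echoing item 20's caption) are assumptions, not the cited paper's models.

```python
# Minimal ROC comparison of two classifiers; data and model choices are assumptions.
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

for model in (LogisticRegression(max_iter=1000), AdaBoostClassifier()):
    probs = model.fit(Xtr, ytr).predict_proba(Xte)[:, 1]
    fpr, tpr, _ = roc_curve(yte, probs)
    plt.plot(fpr, tpr, label=f"{model.__class__.__name__} (AUC={roc_auc_score(yte, probs):.2f})")

plt.plot([0, 1], [0, 1], "k--")          # chance diagonal
plt.xlabel("False positive rate")
plt.ylabel("True positive rate")
plt.legend()
plt.show()
```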