21. The Pseudo-Code of the IRBMO Algorithm.
Published 2025: “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. The experiments are based on 12 medical datasets, and the results show that IRBMO achieves optimal overall performance in key metrics such as fitness value, classification accuracy and specificity. …”

22. IRBMO vs. meta-heuristic algorithms boxplot.
Published 2025; same abstract excerpt as no. 21.

23. IRBMO vs. feature selection algorithm boxplot.
Published 2025; same abstract excerpt as no. 21.
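The IRBMO entries above (nos. 21-23) describe wrapper-style binary feature selection judged by fitness value, classification accuracy and specificity. As a loose illustration only, the sketch below shows how such a fitness function is commonly assembled: a binary mask selects feature columns and the score trades classification error against the fraction of selected features. This is a generic sketch, not the IRBMO algorithm; the k-NN classifier, the 0.99/0.01 weighting and the synthetic data are assumptions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def feature_selection_fitness(mask, X, y, alpha=0.99):
    """Generic wrapper fitness: weighted sum of classification error
    and selected-feature ratio (smaller is better)."""
    if mask.sum() == 0:               # penalize empty feature subsets outright
        return 1.0
    X_sub = X[:, mask.astype(bool)]   # keep only the columns the mask selects
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X_sub, y, cv=5).mean()
    error = 1.0 - acc
    ratio = mask.sum() / mask.size
    return alpha * error + (1.0 - alpha) * ratio

# Toy usage with synthetic data standing in for a medical dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))
y = (X[:, 0] + X[:, 3] > 0).astype(int)   # only two informative features
mask = rng.integers(0, 2, size=30)        # one candidate binary solution
print(feature_selection_fitness(mask, X, y))
```

Any binary optimizer in such a comparison would repeatedly evaluate candidate masks with a function of this shape and keep the lowest-fitness one.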
24. Summary of LITNET-2020 dataset.
Published 2023: “…In this paper, a novel, and improved version of the Long Short-Term Memory (ILSTM) algorithm was proposed. The ILSTM is based on the novel integration of the chaotic butterfly optimization algorithm (CBOA) and particle swarm optimization (PSO) to improve the accuracy of the LSTM algorithm. …”

25. SHAP analysis for LITNET-2020 dataset.
Published 2023; same abstract excerpt as no. 24.

26. Comparison of intrusion detection systems.
Published 2023; same abstract excerpt as no. 24.

27. Parameter setting for CBOA and PSO.
Published 2023; same abstract excerpt as no. 24.

28. NSL-KDD dataset description.
Published 2023; same abstract excerpt as no. 24.

29. The architecture of LSTM cell.
Published 2023; same abstract excerpt as no. 24.

30. The architecture of ILSTM.
Published 2023; same abstract excerpt as no. 24.

31. Parameter setting for LSTM.
Published 2023; same abstract excerpt as no. 24.

32. LITNET-2020 data splitting approach.
Published 2023; same abstract excerpt as no. 24.

33. Transformation of symbolic features in NSL-KDD.
Published 2023; same abstract excerpt as no. 24.
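The ILSTM entries (nos. 24-33) all point to a 2023 paper that couples CBOA and PSO to tune an LSTM, but the snippets do not describe the coupling itself. The sketch below only illustrates the PSO half of the idea on a stand-in objective over two hypothetical LSTM hyperparameters (hidden units and learning rate); in the real setting the objective would train and validate an LSTM, which is exactly the expensive step the metaheuristic wraps. The search ranges, coefficients and objective are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def validation_loss(units, lr):
    """Stand-in for 'train an LSTM with these hyperparameters and return
    its validation loss'; a real objective would fit the model here."""
    return (units / 128.0 - 0.6) ** 2 + (np.log10(lr) + 2.5) ** 2

# Search space: hidden units in [16, 256], learning rate in [1e-4, 1e-1].
low, high = np.array([16.0, 1e-4]), np.array([256.0, 1e-1])
n_particles, n_iter, w, c1, c2 = 20, 50, 0.7, 1.5, 1.5

pos = rng.uniform(low, high, size=(n_particles, 2))   # particle positions
vel = np.zeros_like(pos)                               # particle velocities
pbest = pos.copy()                                     # personal bests
pbest_val = np.array([validation_loss(*p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()               # global best

for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, low, high)
    vals = np.array([validation_loss(*p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best (units, lr):", gbest)
```

A hybrid such as the one the snippet names would typically alternate or blend a second metaheuristic (here, CBOA) into the same loop to balance exploration and exploitation.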
34. An Example of a WPT-MEC Network.
Published 2025: “…EHRL integrates Reinforcement Learning (RL) with Deep Neural Networks (DNNs) to dynamically optimize binary offloading decisions, which in turn obviates the requirement for manually labeled training data and thus avoids the need for solving complex optimization problems repeatedly. …”

35. Related Work Summary.
Published 2025; same abstract excerpt as no. 34.

36. Simulation parameters.
Published 2025; same abstract excerpt as no. 34.

37. Training losses for N = 10.
Published 2025; same abstract excerpt as no. 34.

38. Normalized computation rate for N = 10.
Published 2025; same abstract excerpt as no. 34.

39. Summary of Notations Used in this paper.
Published 2025; same abstract excerpt as no. 34.
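The EHRL entries (nos. 34-39) describe a 2025 WPT-MEC paper in which an RL-trained DNN emits binary offloading decisions instead of solving the underlying optimization repeatedly. The snippets give no implementation details, so the sketch below only shows one commonly used step in this family of methods: turning a relaxed (continuous) network output into a few candidate binary decisions and keeping the one with the best reward. The tiny random network, the reward surrogate and the quantization rule are all assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 10                                  # number of wireless devices

# Hypothetical stand-in for a trained policy network (one hidden layer).
W1, W2 = rng.normal(size=(N, 32)), rng.normal(size=(32, N))

def relaxed_decision(h):
    """Map channel gains to per-device offloading probabilities."""
    hidden = np.maximum(h @ W1, 0.0)                 # ReLU layer
    return 1.0 / (1.0 + np.exp(-(hidden @ W2)))      # sigmoid output

def reward(h, x):
    """Toy surrogate for the weighted computation rate of decision x:
    offloaded devices (x=1) earn their full channel gain, others less."""
    return float(np.sum(np.where(x == 1, h, 0.2 * h)))

def quantize(p, k=5):
    """Produce k candidate binary decisions: a 0.5 threshold, plus
    single-bit flips of the most uncertain entries."""
    candidates = [(p > 0.5).astype(int)]
    order = np.argsort(np.abs(p - 0.5))              # most uncertain first
    for i in order[: k - 1]:
        c = candidates[0].copy()
        c[i] = 1 - c[i]
        candidates.append(c)
    return candidates

h = rng.rayleigh(scale=1.0, size=N)                  # channel gains
p = relaxed_decision(h)
best = max(quantize(p), key=lambda x: reward(h, x))
print("offloading decision:", best)
```

In an RL setup of the kind the snippet describes, the chosen decision and its reward would then be stored and replayed to update the network, so no manually labeled decisions are ever needed.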
40. Triplet Matching for Estimating Causal Effects With Three Treatment Arms: A Comparative Study of Mortality by Trauma Center Level
Published 2021: “…Few studies, however, have used matching designs with more than two groups, due to the complexity of matching algorithms. We fill the gap by developing an iterative matching algorithm for the three-group setting. …”
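The last entry summarizes a 2021 paper on an iterative matching algorithm for three treatment arms, but the snippet does not describe the algorithm itself. The sketch below only illustrates the naive baseline such work improves on: greedily forming triplets, one unit per arm, that are close on a single scalar score. The score, the greedy rule and the synthetic data are assumptions, not the authors' method.

```python
import numpy as np

def greedy_triplet_match(scores_a, scores_b, scores_c):
    """Greedily pair each unit in arm A with its nearest unmatched
    neighbours in arms B and C on a scalar score (e.g. a propensity-like
    score); returns index triplets (i_a, i_b, i_c)."""
    free_b = set(range(len(scores_b)))
    free_c = set(range(len(scores_c)))
    triplets = []
    for i_a, s in enumerate(scores_a):
        if not free_b or not free_c:
            break
        i_b = min(free_b, key=lambda j: abs(scores_b[j] - s))
        i_c = min(free_c, key=lambda j: abs(scores_c[j] - s))
        free_b.discard(i_b)
        free_c.discard(i_c)
        triplets.append((i_a, i_b, i_c))
    return triplets

# Toy usage: three arms with slightly shifted score distributions.
rng = np.random.default_rng(3)
a = rng.normal(0.0, 1.0, 30)
b = rng.normal(0.2, 1.0, 50)
c = rng.normal(0.4, 1.0, 40)
print(greedy_triplet_match(a, b, c)[:5])
```

Greedy matching of this kind is order-dependent and can leave poor triplets late in the pass, which is the sort of weakness an iterative three-group matching design is meant to address.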