-
41
Classification performance after optimization.
Published 2025: “…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …”
-
42
ANOVA test for optimization results.
Published 2025: “…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …”
-
43
Wilcoxon test results for optimization.
Published 2025: “…The proposed approach integrates binary feature selection and metaheuristic optimization into a unified optimization process, effectively balancing exploration and exploitation to handle complex, high-dimensional datasets. …”
-
45
The flowchart of the proposed algorithm.
Published 2024: “…To overcome this limitation, recent advancements have introduced multi-objective evolutionary algorithms for ATS. This study proposes an enhancement to the performance of ATS through the utilization of an improved version of the Binary Multi-Objective Grey Wolf Optimizer (BMOGWO), incorporating mutation. …”
-
46
datasheet1_Graph Neural Networks for Maximum Constraint Satisfaction.pdf
Published 2021: “…We introduce a graph neural network architecture for solving such optimization problems. The architecture is generic; it works for all binary constraint satisfaction problems. …”
-
47
Datasets and their properties.
Published 2023: “…The approach used in this study designed a sub-population selective mechanism that dynamically assigns individuals to a 2-level optimization process. …”
-
48
Parameter settings.
Published 2023: “…The approach used in this study designed a sub-population selective mechanism that dynamically assigns individuals to a 2-level optimization process. …”
-
54
Secure MANET routing with blockchain-enhanced latent encoder coupled GANs and BEPO optimization
Published 2025: “…The performance of the proposed LEGAN-BEPO-BCMANET technique attains 29.786%, 19.25%, 22.93%, 27.21%, 31.02%, 26.91%, and 25.61% greater throughput, compared to existing methods like Blockchain-based BATMAN protocol utilizing MANET with an ensemble algorithm (BATMAN-MANET), Blockchain-based trusted distributed routing scheme with optimized dropout ensemble extreme learning neural network in MANET (DEELNN-MANET), A secured trusted routing utilizing structure of a new directed acyclic graph-blockchain in MANET internet of things environment (DAG-MANET), An Optimized Link State Routing Protocol with Blockchain Framework for Efficient Video-Packet Transmission and Security over MANET (OLSRP-MANET), Auto-metric Graph Neural Network based Blockchain Technology for Protected Dynamic Optimum Routing in MANET (AGNN-MANET), and Data security-based routing in MANETs under key management process (DSR-MANET), respectively. …”
-
55
Results of KNN.
Published 2024: “…The results obtained show that the proposed model has superior prediction accuracy in comparison to its counterparts. Moreover, among all the hyperparameter-optimized algorithms, adaboost algorithm outperformed all the other hyperparameter-optimized algorithms. …”
-
56
Comparison of key techniques in their literature.
Published 2024: “…The results obtained show that the proposed model has superior prediction accuracy in comparison to its counterparts. Moreover, among all the hyperparameter-optimized algorithms, adaboost algorithm outperformed all the other hyperparameter-optimized algorithms. …”
-
57
Ensemble model architecture.
Published 2024: “…The results obtained show that the proposed model has superior prediction accuracy in comparison to its counterparts. Moreover, among all the hyperparameter-optimized algorithms, adaboost algorithm outperformed all the other hyperparameter-optimized algorithms. …”
-
58
SHAP analysis mean value.
Published 2024: “…The results obtained show that the proposed model has superior prediction accuracy in comparison to its counterparts. Moreover, among all the hyperparameter-optimized algorithms, adaboost algorithm outperformed all the other hyperparameter-optimized algorithms. …”
-
59
Proposed methodology.
Published 2024: “…The results obtained show that the proposed model has superior prediction accuracy in comparison to its counterparts. Moreover, among all the hyperparameter-optimized algorithms, adaboost algorithm outperformed all the other hyperparameter-optimized algorithms. …”
-
60
Comparison table of the proposed model.
Published 2024: “…The results obtained show that the proposed model has superior prediction accuracy in comparison to its counterparts. Moreover, among all the hyperparameter-optimized algorithms, adaboost algorithm outperformed all the other hyperparameter-optimized algorithms. …”