Search alternatives:
models optimization » model optimization, process optimization, wolf optimization
based optimization » whale optimization
primary data » primary care
61. Data used in this study.
Published 2024. “…In the hybrid model of this paper, the choice was made to use the Densenet architecture of CNN models with LightGBM as the primary model. …”
62. Accuracy comparison of proposed and other models.
Published 2024. “…Next, we employ batch normalization to smooth and enhance the collected data, followed by feature extraction using the AlexNet model. …”
66. A portfolio selection model based on the knapsack problem under uncertainty
Published 2019. “…The resulted model is converted into a parametric linear programming model in which the decision maker is able to determine the optimism threshold. …”
67. DEM error verified by airborne data.
Published 2024. “…In the hybrid model of this paper, the choice was made to use the Densenet architecture of CNN models with LightGBM as the primary model. …”
68. Error of models.
Published 2024. “…In the hybrid model of this paper, the choice was made to use the Densenet architecture of CNN models with LightGBM as the primary model. …”
70. Iteration curve of the optimization process.
Published 2025. “…The load-bearing mechanism of the proposed steel platform was analyzed theoretically, and finite element analysis (FEA) was employed to evaluate the stresses and deflections of key members. A particle swarm optimization (PSO) algorithm was integrated with the FEA model to optimize the cross-sectional dimensions of the primary beams, secondary beams, and foundation boxes, achieving a balance between load-bearing capacity and cost efficiency. …”
71. Testing results for classifying AD, MCI and NC.
Published 2024. “…The study introduced a scheme for enhancing images to improve the quality of the datasets. Specifically, an image enhancement algorithm based on histogram equalization and bilateral filtering techniques was deployed to reduce noise and enhance the quality of the images. …”
72. Summary of existing CNN models.
Published 2024. “…The study introduced a scheme for enhancing images to improve the quality of the datasets. Specifically, an image enhancement algorithm based on histogram equalization and bilateral filtering techniques was deployed to reduce noise and enhance the quality of the images. …”
73. The prediction error of each model.
Published 2025. “…The model is developed and validated using data from 159 debris flow-prone gullies, integrating deep convolutional, recurrent, and attention-based architectures, with hyperparameters autonomously optimized by IKOA. …”
74. Results for model hyperparameter values.
Published 2025. “…The model is developed and validated using data from 159 debris flow-prone gullies, integrating deep convolutional, recurrent, and attention-based architectures, with hyperparameters autonomously optimized by IKOA. …”
75. Stability analysis of each model.
Published 2025. “…The model is developed and validated using data from 159 debris flow-prone gullies, integrating deep convolutional, recurrent, and attention-based architectures, with hyperparameters autonomously optimized by IKOA. …”
76. Robustness analysis of each model.
Published 2025. “…The model is developed and validated using data from 159 debris flow-prone gullies, integrating deep convolutional, recurrent, and attention-based architectures, with hyperparameters autonomously optimized by IKOA. …”
77. Error of ICESat-2 with respect to airborne data.
Published 2024. “…In the hybrid model of this paper, the choice was made to use the Densenet architecture of CNN models with LightGBM as the primary model. …”
79. The workflow of the proposed model.
Published 2024. “…Next, we employ batch normalization to smooth and enhance the collected data, followed by feature extraction using the AlexNet model. …”