Search alternatives:
processing optimization » process optimization, process optimisation, routing optimization
codon optimization » wolf optimization
sample processing » image processing, waste processing, pre processing
data sample » data samples
binary b » binary _
b codon » _ codon, b common
-
141
Stacking schematic diagram.
Published 2024: “…The results show: (1) Random oversampling, ADASYN, SMOTE, and SMOTEENN were used for data balance processing, among which SMOTEENN showed better efficiency and effect in dealing with data imbalance. (2) The GA-XGBoost model optimized the hyperparameters of the XGBoost model through a genetic algorithm to improve the model’s predictive accuracy. …”
-
142
Flowchart of stacking model integration.
Published 2024: “…The results show: (1) Random oversampling, ADASYN, SMOTE, and SMOTEENN were used for data balance processing, among which SMOTEENN showed better efficiency and effect in dealing with data imbalance. (2) The GA-XGBoost model optimized the hyperparameters of the XGBoost model through a genetic algorithm to improve the model’s predictive accuracy. …”
-
143
Prediction results of the stacking model.
Published 2024: “…The results show: (1) Random oversampling, ADASYN, SMOTE, and SMOTEENN were used for data balance processing, among which SMOTEENN showed better efficiency and effect in dealing with data imbalance. (2) The GA-XGBoost model optimized the hyperparameters of the XGBoost model through a genetic algorithm to improve the model’s predictive accuracy. …”
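The GA-XGBoost step this snippet describes can be illustrated with a minimal genetic-algorithm sketch. Here a toy one-dimensional objective stands in for the XGBoost cross-validation score, and the chosen operators (truncation selection, averaging crossover, Gaussian mutation) are illustrative assumptions, not the paper's exact setup:

```python
import random

random.seed(0)

def fitness(x):
    # Toy stand-in for a model's validation score as a function of one
    # hyperparameter x; the maximum is at x = 3.0.
    return -(x - 3.0) ** 2

def evolve(pop_size=20, generations=30, lo=0.0, hi=10.0):
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]              # selection: keep the fitter half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)        # crossover: average two parents
            child = (a + b) / 2 + random.gauss(0, 0.1)  # plus Gaussian mutation
            children.append(min(max(child, lo), hi))
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()  # converges toward the optimum near 3.0
```

In a real GA-XGBoost pipeline, `fitness` would train an XGBoost model with the candidate hyperparameters and return its cross-validated score.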
-
144
Optimal Subsampling for Functional Quasi-Mode Regression with Big Data
Published 2024: “…These optimal probabilities rely on the full data estimate, prompting the development of a two-step algorithm to approximate the optimal subsampling procedure. …”
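The two-step idea in this snippet can be sketched as follows: a uniform pilot subsample first yields a rough estimate, which is then used to weight a second, larger subsample. The influence-style weights below are an illustrative assumption, not the paper's optimal probabilities:

```python
import random

random.seed(1)

# Synthetic "big data" sample (an assumption; the paper treats functional data).
data = [random.gauss(0.0, 1.0) for _ in range(10_000)]

# Step 1: uniform pilot subsample gives a rough full-data estimate.
pilot = random.sample(data, 200)
pilot_mean = sum(pilot) / len(pilot)

# Step 2: subsample again with probabilities proportional to an influence-style
# score computed from the pilot estimate.
weights = [abs(x - pilot_mean) + 1e-6 for x in data]
subsample = random.choices(data, weights=weights, k=500)
```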
-
149
Table2_A Gray Wolf Optimization-Based Improved Probabilistic Neural Network Algorithm for Surrounding Rock Squeezing Classification in Tunnel Engineering.DOCX
Published 2022: “…The spread coefficient was the critical hyper-parameter in the PNN, and the improved gray wolf optimization (IGWO) algorithm was used to realize its efficient automatic optimization. …”
-
150
Table1_A Gray Wolf Optimization-Based Improved Probabilistic Neural Network Algorithm for Surrounding Rock Squeezing Classification in Tunnel Engineering.DOCX
Published 2022: “…The spread coefficient was the critical hyper-parameter in the PNN, and the improved gray wolf optimization (IGWO) algorithm was used to realize its efficient automatic optimization. …”
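The gray wolf optimization step in these entries can be sketched with a standard GWO loop (not the paper's improved variant); the one-dimensional objective below is an assumed stand-in for the PNN spread-coefficient tuning criterion:

```python
import random

random.seed(0)

def f(x):
    # Toy objective standing in for PNN cross-validation error;
    # the minimum is at x = 2.0.
    return (x - 2.0) ** 2

def gwo(n_wolves=10, iters=50, lo=-10.0, hi=10.0):
    wolves = [random.uniform(lo, hi) for _ in range(n_wolves)]
    for t in range(iters):
        wolves.sort(key=f)
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        a = 2 - 2 * t / iters                # control parameter decays 2 -> 0
        new = []
        for w in wolves:
            estimates = []
            for leader in (alpha, beta, delta):
                r1, r2 = random.random(), random.random()
                A = 2 * a * r1 - a           # exploration/exploitation coefficient
                C = 2 * r2
                D = abs(C * leader - w)      # distance to the leader
                estimates.append(leader - A * D)
            new.append(min(max(sum(estimates) / 3, lo), hi))
        wolves = new
    return min(wolves, key=f)

best_spread = gwo()  # converges toward the minimum near 2.0
```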
-
154
Overall flowchart of the proposed model.
Published 2025: “…BOHB merges Bayesian optimization and Hyperband, significantly speeding up the optimization process. …”
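The Hyperband half of BOHB rests on successive halving: evaluate many configurations cheaply, then repeatedly triple the budget for the surviving top fraction. A minimal sketch, where the toy `score` function is an assumption standing in for real model evaluations (BOHB would additionally sample configurations from a Bayesian model):

```python
import random

random.seed(0)

def score(config, budget):
    # Toy stand-in for validation accuracy: evaluations are noisier
    # at small budgets, so early rounds are cheap but unreliable.
    return config - random.random() / budget

# 27 random configurations; each round keeps the top third and triples
# the training budget for the survivors (27 -> 9 -> 3 -> 1).
configs = [random.random() for _ in range(27)]
budget = 1
while len(configs) > 1:
    configs = sorted(configs, key=lambda c: score(c, budget), reverse=True)
    configs = configs[: max(1, len(configs) // 3)]
    budget *= 3
best_config = configs[0]
```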
-
157
Similarity indices between relative web search popularity and Covid-19 time-series.
Published 2023
-
158
Web-search relative popularity trends as a proxy of public interest for Covid-19.
Published 2023
-
159
Data-Quality-Navigated Machine Learning Strategy with Chemical Intuition to Improve Generalization
Published 2024: “…Most ML works focused on improvements in algorithms and feature representations. However, the data quality, as the foundation of ML, has been largely overlooked, also leading to the absence of data evaluation and processing methods in ML fields. …”
-
160
ANFIS MODELING IN PROJECTION WELDING OF NUTS TO SHEETS
Published 2022: “…After all good agreements and validations, Genetic Algorithm (GA) was used to obtain some intermediate values not found on the actual experimental data. …”