Showing 121 - 140 results of 453 for search '(( data sample processing optimization algorithm ) OR ( binary d while optimization algorithm ))', query time: 0.52s
  1. 121

    Data_Sheet_1_Posiform planting: generating QUBO instances for benchmarking.pdf by Georg Hahn (12530469)

    Published 2023
    “…We are interested in benchmarking both quantum annealing and classical algorithms for minimizing quadratic unconstrained binary optimization (QUBO) problems. …”
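    For the QUBO result above, the problem being benchmarked is the minimization of x^T Q x over binary vectors x. A minimal brute-force sketch in Python; the matrix Q here is a random toy instance, not one of the paper's posiform-planted benchmark instances:

    ```python
    import itertools
    import numpy as np

    def solve_qubo_bruteforce(Q):
        """Exhaustively minimize x^T Q x over binary x (feasible only for small n)."""
        n = Q.shape[0]
        best_x, best_val = None, np.inf
        for bits in itertools.product([0, 1], repeat=n):
            x = np.array(bits)
            val = x @ Q @ x
            if val < best_val:
                best_x, best_val = x, val
        return best_x, best_val

    # Toy 4-variable QUBO instance (illustrative values only)
    rng = np.random.default_rng(0)
    Q = rng.normal(size=(4, 4))
    Q = (Q + Q.T) / 2  # symmetrize
    x_opt, f_opt = solve_qubo_bruteforce(Q)
    print(x_opt, f_opt)
    ```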
  2. 122
  3. 123
  4. 124

    Modeling CO₂ solubility in polyethylene glycol polymer using data driven methods by YunLi Lei (21458056)

    Published 2025
    “…In this research, a Random Forest (RF) machine learning model is meticulously tuned through four sophisticated optimization algorithms: Batch Bayesian Optimization (BBO), Self-Adaptive Differential Evolution (SADE), Bayesian Probability Improvement (BPI), and Gaussian Processes Optimization (GPO). …”
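    To illustrate the general pattern of tuning a Random Forest with a population-based optimizer, here is a sketch using SciPy's plain differential evolution (not the self-adaptive SADE variant from the paper); the dataset, hyperparameter bounds, and scoring are synthetic placeholders:

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in for the CO2-solubility dataset used in the paper
    X, y = make_regression(n_samples=300, n_features=6, noise=0.1, random_state=0)

    def neg_cv_score(params):
        """Objective: negative cross-validated R^2 for one hyperparameter vector."""
        n_estimators, max_depth = int(params[0]), int(params[1])
        model = RandomForestRegressor(n_estimators=n_estimators,
                                      max_depth=max_depth,
                                      random_state=0)
        return -cross_val_score(model, X, y, cv=3, scoring="r2").mean()

    # Search over (n_estimators, max_depth); bounds are illustrative
    result = differential_evolution(neg_cv_score, bounds=[(50, 300), (2, 20)],
                                    maxiter=10, seed=0)
    print(result.x, -result.fun)
    ```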
  5. 125

    Parameter ranges and optimal values. by Zhen Zhao (159931)

    Published 2025
    “…Additionally, considering the imbalance in the population's spatial distribution, we used the K-means++ clustering algorithm to cluster the optimal feature subset, and we used the bootstrap sampling method to extract the same amount of data from each cluster and fuse it with the training subset to build an improved random forest model. …”
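    The cluster-then-bootstrap idea described above can be sketched roughly as follows; the data, number of clusters, and per-cluster sample size are all hypothetical, and the paper's exact fusion procedure may differ:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.utils import resample

    # Hypothetical features/labels standing in for the paper's optimal feature subset
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 8))
    y = (X[:, 0] + rng.normal(scale=0.5, size=500) > 0).astype(int)

    # 1. Cluster the samples with K-means++ initialization
    k = 5
    clusters = KMeans(n_clusters=k, init="k-means++", n_init=10,
                      random_state=0).fit_predict(X)

    # 2. Bootstrap the same number of samples from each cluster
    per_cluster = 60  # illustrative
    X_parts, y_parts = [], []
    for c in range(k):
        idx = np.where(clusters == c)[0]
        boot = resample(idx, replace=True, n_samples=per_cluster, random_state=c)
        X_parts.append(X[boot])
        y_parts.append(y[boot])
    X_bal, y_bal = np.vstack(X_parts), np.concatenate(y_parts)

    # 3. Fuse the balanced sample with the training data and fit a random forest
    X_train = np.vstack([X, X_bal])
    y_train = np.concatenate([y, y_bal])
    rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    ```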
  6. 126

    Thesis-RAMIS-Figs_Slides by Felipe Santibañez-Leal (10967991)

    Published 2024
    “…In addition, the practical benefits for MPS in the context of simulating channelized facies models are demonstrated using synthetic data and real geological facies. Importantly, this strategy locates samples adaptively on the transitions between facies, which improves the performance of conventional MPS algorithms. …”
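    The idea of concentrating samples on facies transitions can be sketched as below; this is not the thesis's algorithm, and the toy grid, 4-neighbour test, and 80/20 split between transition and uniform samples are all illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy categorical facies grid (0/1 labels); stand-in for a channelized training image
    facies = (rng.normal(size=(64, 64)) > 0).astype(int)

    # Mark cells lying on a facies transition: the label differs from a 4-neighbour
    transition = np.zeros_like(facies, dtype=bool)
    transition[:-1, :] |= facies[:-1, :] != facies[1:, :]
    transition[1:, :] |= facies[1:, :] != facies[:-1, :]
    transition[:, :-1] |= facies[:, :-1] != facies[:, 1:]
    transition[:, 1:] |= facies[:, 1:] != facies[:, :-1]

    # Draw most conditioning samples from transition cells, the rest uniformly
    n_samples = 200
    trans_idx = np.argwhere(transition)
    all_idx = np.argwhere(np.ones_like(facies, dtype=bool))
    n_trans = int(0.8 * n_samples)
    samples = np.vstack([
        trans_idx[rng.choice(len(trans_idx), n_trans, replace=False)],
        all_idx[rng.choice(len(all_idx), n_samples - n_trans, replace=False)],
    ])
    ```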
  7. 127
  8. 128
  9. 129

    MLP vs classification algorithms. by Mohd Mustaqeem (19106494)

    Published 2024
    “…We propose SPAM-XAI, a hybrid model integrating novel sampling, feature selection, and eXplainable-AI (XAI) algorithms to address these challenges. …”
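    As context for the sampling + feature-selection + XAI combination mentioned above, here is a generic pipeline sketch (not the SPAM-XAI model itself), assuming the imbalanced-learn and shap packages are available; the dataset and parameter choices are placeholders:

    ```python
    import shap
    from imblearn.over_sampling import SMOTE
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_selection import SelectKBest, mutual_info_classif
    from sklearn.model_selection import train_test_split

    # Synthetic imbalanced spam-like dataset (placeholder for the paper's data)
    X, y = make_classification(n_samples=2000, n_features=30,
                               weights=[0.9, 0.1], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    # 1. Sampling: rebalance the minority class with SMOTE
    X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)

    # 2. Feature selection: keep the 15 most informative features
    selector = SelectKBest(mutual_info_classif, k=15).fit(X_bal, y_bal)
    X_bal_sel, X_te_sel = selector.transform(X_bal), selector.transform(X_te)

    # 3. Classifier + explainability: SHAP values for the fitted model
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_bal_sel, y_bal)
    shap_values = shap.TreeExplainer(clf).shap_values(X_te_sel)
    print("accuracy:", clf.score(X_te_sel, y_te))
    ```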
  10. 130
  11. 131

    Multimodal Mass Spectrometry Imaging of Rat Brain Using IR-MALDESI and NanoPOTS-LC-MS/MS by Crystal L. Pace (9105558)

    Published 2021
    “…The aim of this work was to create a multimodal MSI approach that measures metabolomic and proteomic data from a single biological organ by combining infrared matrix-assisted laser desorption electrospray ionization (IR-MALDESI) for metabolomic MSI and nanodroplet processing in one pot for trace samples (nanoPOTS) LC-MS/MS for spatially resolved proteome profiling. …”
  12. 132

    Multimodal Mass Spectrometry Imaging of Rat Brain Using IR-MALDESI and NanoPOTS-LC-MS/MS by Crystal L. Pace (9105558)

    Published 2021
    “…The aim of this work was to create a multimodal MSI approach that measures metabolomic and proteomic data from a single biological organ by combining infrared matrix-assisted laser desorption electrospray ionization (IR-MALDESI) for metabolomic MSI and nanodroplet processing in one pot for trace samples (nanoPOTS) LC-MS/MS for spatially resolved proteome profiling. …”
  13. 133

    REDUCTION OF SAMPLE SIZE IN THE ANALYSIS OF SPATIAL VARIABILITY OF NONSTATIONARY SOIL CHEMICAL ATTRIBUTES by Tamara C. Maltauro (7366898)

    Published 2019
    “…ABSTRACT: In the study of spatial variability of soil attributes, it is essential to define a sampling plan with an adequate sample size. This study aimed to evaluate, through simulated data, the influence of geostatistical model parameters and sampling configuration on the optimization process, and to resize and reduce the sample size of a sampling configuration of a commercial area composed of 102 points. …”
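    One way to judge a reduced sampling configuration against the full 102-point design is to compare leave-one-out prediction error under a Gaussian-process (kriging-like) model; the sketch below uses made-up coordinates, attribute values, kernel, and candidate subset, and the paper's actual optimization criterion may differ:

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)

    # Simulated soil attribute at 102 locations (placeholder for the real study area)
    coords = rng.uniform(0, 1000, size=(102, 2))
    values = np.sin(coords[:, 0] / 150) + 0.3 * rng.normal(size=102)

    def loo_mse(idx):
        """Leave-one-out prediction error when only the points in `idx` are retained."""
        errs = []
        for i in range(len(idx)):
            train = np.delete(idx, i)
            gp = GaussianProcessRegressor(kernel=RBF(200.0) + WhiteKernel(0.1),
                                          normalize_y=True)
            gp.fit(coords[train], values[train])
            errs.append((gp.predict(coords[[idx[i]]])[0] - values[idx[i]]) ** 2)
        return float(np.mean(errs))

    full = np.arange(102)
    reduced = rng.choice(102, size=60, replace=False)  # one candidate reduced design
    print("full LOO MSE:   ", loo_mse(full))
    print("reduced LOO MSE:", loo_mse(reduced))
    ```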
  14. 134
  15. 135

    Technical approach. by Wenguang Li (6528113)

    Published 2024
    “…The results show: (1) Random oversampling, ADASYN, SMOTE, and SMOTEENN were used to balance the data; among them, SMOTEENN was the most efficient and effective at handling the class imbalance. (2) The GA-XGBoost model tuned the hyperparameters of the XGBoost model with a genetic algorithm to improve the model’s predictive accuracy. …”
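    A minimal sketch of tuning XGBoost hyperparameters with a genetic algorithm, in the spirit of the GA-XGBoost result above; the hand-rolled GA (selection, uniform crossover, Gaussian mutation), the dataset, and the search bounds are all illustrative, and this is not the paper's implementation:

    ```python
    import numpy as np
    import xgboost as xgb
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=1000, n_features=20,
                               weights=[0.85, 0.15], random_state=0)

    def fitness(genome):
        """Cross-validated accuracy of XGBoost for one hyperparameter genome."""
        model = xgb.XGBClassifier(n_estimators=int(genome[0]),
                                  max_depth=int(genome[1]),
                                  learning_rate=float(genome[2]),
                                  eval_metric="logloss")
        return cross_val_score(model, X, y, cv=3).mean()

    # Bounds for (n_estimators, max_depth, learning_rate); purely illustrative
    lo, hi = np.array([50, 2, 0.01]), np.array([400, 10, 0.3])
    pop = rng.uniform(lo, hi, size=(8, 3))

    for gen in range(5):
        scores = np.array([fitness(g) for g in pop])
        parents = pop[np.argsort(scores)[-4:]]            # keep the best half
        kids = []
        for _ in range(len(pop) - len(parents)):
            a, b = parents[rng.choice(len(parents), 2, replace=False)]
            mask = rng.random(3) < 0.5                    # uniform crossover
            child = np.where(mask, a, b) + rng.normal(0, 0.05, 3) * (hi - lo)
            kids.append(np.clip(child, lo, hi))           # Gaussian mutation + clip
        pop = np.vstack([parents, np.array(kids)])

    best = pop[np.argmax([fitness(g) for g in pop])]
    print("best (n_estimators, max_depth, learning_rate):", best)
    ```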
  16. 136

    Pearson correlation coefficient matrix plot. by Wenguang Li (6528113)

    Published 2024
    “…The results show: (1) Random oversampling, ADASYN, SMOTE, and SMOTEENN were used to balance the data; among them, SMOTEENN was the most efficient and effective at handling the class imbalance. (2) The GA-XGBoost model tuned the hyperparameters of the XGBoost model with a genetic algorithm to improve the model’s predictive accuracy. …”
  17. 137

    SHAP of stacking. by Wenguang Li (6528113)

    Published 2024
    “…The results show: (1) Random oversampling, ADASYN, SMOTE, and SMOTEENN were used to balance the data; among them, SMOTEENN was the most efficient and effective at handling the class imbalance. (2) The GA-XGBoost model tuned the hyperparameters of the XGBoost model with a genetic algorithm to improve the model’s predictive accuracy. …”
  18. 138

    Stacking ROC curve chart. by Wenguang Li (6528113)

    Published 2024
    “…The results show: (1) Random oversampling, ADASYN, SMOTE, and SMOTEENN were used to balance the data; among them, SMOTEENN was the most efficient and effective at handling the class imbalance. (2) The GA-XGBoost model tuned the hyperparameters of the XGBoost model with a genetic algorithm to improve the model’s predictive accuracy. …”
  19. 139

    Confusion matrix. by Wenguang Li (6528113)

    Published 2024
    “…The results show: (1) Random oversampling, ADASYN, SMOTE, and SMOTEENN were used to balance the data; among them, SMOTEENN was the most efficient and effective at handling the class imbalance. (2) The GA-XGBoost model tuned the hyperparameters of the XGBoost model with a genetic algorithm to improve the model’s predictive accuracy. …”
  20. 140

    GA-XGBoost feature importances. by Wenguang Li (6528113)

    Published 2024
    “…The results show: (1) Random oversampling, ADASYN, SMOTE, and SMOTEENN were used to balance the data; among them, SMOTEENN was the most efficient and effective at handling the class imbalance. (2) The GA-XGBoost model tuned the hyperparameters of the XGBoost model with a genetic algorithm to improve the model’s predictive accuracy. …”