Search alternatives:
guided optimization » model optimization
based optimization » whale optimization
sample based » samples based, scale based
binary task » binary mask
Showing 1 - 20 results of 188 for search '(( binary task guided optimization algorithm ) OR ( total sample based optimization algorithm ))*', query time: 0.53s
  3. The Pseudo-Code of the IRBMO Algorithm. by Chenyi Zhu (9383370)
     Published 2025
     “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
     (A minimal sketch of this transfer-function binarization idea follows the results list.)
  4. IRBMO vs. meta-heuristic algorithms boxplot. by Chenyi Zhu (9383370)
     Published 2025
     “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
  5. IRBMO vs. feature selection algorithm boxplot. by Chenyi Zhu (9383370)
     Published 2025
     “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
  7. Plots of steady-state frequency control. by Salisu Mohammed (772274)
     Published 2023
     “…The optimization problem was formulated based on the network power flow and the discrete-time sampling of the constrained control parameters. …”
     (A minimal constrained-control optimization sketch follows the results list.)
  8. Plots of steady-state voltage control. by Salisu Mohammed (772274)
     Published 2023
     “…The optimization problem was formulated based on the network power flow and the discrete-time sampling of the constrained control parameters. …”
  9. Plots of steady-state input trajectory. by Salisu Mohammed (772274)
     Published 2023
     “…The optimization problem was formulated based on the network power flow and the discrete-time sampling of the constrained control parameters. …”
  10. Two-dimensional benchmark test-functions. by Salisu Mohammed (772274)
      Published 2023
      “…The optimization problem was formulated based on the network power flow and the discrete-time sampling of the constrained control parameters. …”
  11. Block diagram of autonomous microgrid. by Salisu Mohammed (772274)
      Published 2023
      “…The optimization problem was formulated based on the network power flow and the discrete-time sampling of the constrained control parameters. …”
  12. Thirty-dimensional benchmark test-functions. by Salisu Mohammed (772274)
      Published 2023
      “…The optimization problem was formulated based on the network power flow and the discrete-time sampling of the constrained control parameters. …”
  14. Iteration diagram of genetic algorithm. by Ke Peng (2220973)
      Published 2023
      “…The results show that: (1) The applied SMOTEENN is more effective than SMOTE and ADASYN in dealing with the imbalance of banking data. (2) The F1 and AUC values of the model improved and optimized by XGBoost using genetic algorithm can reach 90% and 99%, respectively, which are optimal compared to other six machine learning models. …”
      (A minimal resampling-and-tuning pipeline sketch follows the results list.)
  15. Genetic algorithm flow chart. by Ke Peng (2220973)
      Published 2023
      “…The results show that: (1) The applied SMOTEENN is more effective than SMOTE and ADASYN in dealing with the imbalance of banking data. (2) The F1 and AUC values of the model improved and optimized by XGBoost using genetic algorithm can reach 90% and 99%, respectively, which are optimal compared to other six machine learning models. …”
  17. Results of genetic algorithm tuning parameters. by Ke Peng (2220973)
      Published 2023
      “…The results show that: (1) The applied SMOTEENN is more effective than SMOTE and ADASYN in dealing with the imbalance of banking data. (2) The F1 and AUC values of the model improved and optimized by XGBoost using genetic algorithm can reach 90% and 99%, respectively, which are optimal compared to other six machine learning models. …”
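Items 3-5 describe converting a continuous optimizer (IRBMO) to binary form via a transfer function so it can search over feature subsets. The records do not include the actual code, so the following is only a minimal Python sketch of the general transfer-function binarization idea; the sigmoid transfer function, the toy fitness, and the dimensions are illustrative assumptions, not the authors' IRBMO implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hidden "relevant" subset used only to give the toy fitness something to score.
relevant = np.zeros(10, dtype=int)
relevant[:3] = 1

def sigmoid_transfer(x):
    """S-shaped transfer function mapping continuous positions into [0, 1]."""
    return 1.0 / (1.0 + np.exp(-x))

def binarize(position, rng):
    """Convert a continuous candidate into a binary feature mask."""
    probs = sigmoid_transfer(position)
    return (rng.random(position.shape) < probs).astype(int)

def fitness(mask):
    """Toy objective: mismatch with the hidden subset plus a size penalty.
    A real feature-selection run would use a classifier's validation error here."""
    if mask.sum() == 0:
        return np.inf
    return np.sum(mask != relevant) + 0.01 * mask.sum()

# One binarization-and-evaluation step for a small population of continuous candidates.
population = rng.normal(size=(5, 10))                  # 5 candidates, 10 features
masks = np.array([binarize(p, rng) for p in population])
scores = np.array([fitness(m) for m in masks])
print("best mask:", masks[scores.argmin()])
```

In a full run, each iteration of the continuous optimizer would update the positions and this binarization step would be repeated, with the mask scored by an actual classifier rather than the toy objective above.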
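Items 7-12 share one abstract that formulates an optimization over discrete-time samples of constrained control parameters, based on the network power flow. The abstract gives no formulation details, so the sketch below only illustrates the generic shape of such a problem: a projected-gradient loop over a sampled control trajectory with box constraints. The quadratic cost is a stand-in assumption for an actual power-flow model, and the bounds, step size, and targets are illustrative.

```python
import numpy as np

T, m = 20, 2                       # discrete-time samples and control parameters per sample
lower, upper = -1.0, 1.0           # box constraints on every sampled control parameter
target = 0.5 * np.ones((T, m))     # placeholder setpoints (stand-in for power-flow targets)

def cost(u):
    """Quadratic surrogate for the power-flow objective (assumption, not the paper's model)."""
    return float(np.sum((u - target) ** 2))

def grad(u):
    return 2.0 * (u - target)

u = np.zeros((T, m))               # control trajectory, one row per sample instant
step = 0.1
for _ in range(200):
    u -= step * grad(u)
    u = np.clip(u, lower, upper)   # project back onto the constraint box each iteration
print("final cost:", cost(u))
```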
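Items 14-17 report rebalancing banking data with SMOTEENN and tuning an XGBoost model with a genetic algorithm. Below is a compact sketch of that pipeline shape, assuming imbalanced-learn, xgboost, and scikit-learn are installed; the synthetic dataset and the tiny selection-and-mutation loop over two hyperparameters are illustrative assumptions, not the study's actual GA or data.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from imblearn.combine import SMOTEENN
from xgboost import XGBClassifier

rng = np.random.default_rng(0)

# Synthetic imbalanced data standing in for the banking dataset (assumption).
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1], random_state=0)

# Rebalance with SMOTEENN (SMOTE oversampling followed by ENN cleaning).
X_res, y_res = SMOTEENN(random_state=0).fit_resample(X, y)

def fitness(max_depth, learning_rate):
    """Cross-validated F1 of an XGBoost model with the candidate hyperparameters."""
    model = XGBClassifier(max_depth=int(max_depth), learning_rate=float(learning_rate),
                          n_estimators=100, eval_metric="logloss")
    return cross_val_score(model, X_res, y_res, cv=3, scoring="f1").mean()

# Tiny GA-style selection-and-mutation loop over (max_depth, learning_rate).
pop = np.column_stack([rng.integers(2, 10, size=6), rng.uniform(0.01, 0.3, size=6)])
for _ in range(5):
    scores = np.array([fitness(d, lr) for d, lr in pop])
    parents = pop[np.argsort(scores)[-3:]]                  # keep the fittest half
    children = parents.copy()
    children[:, 1] *= rng.uniform(0.8, 1.2, size=3)         # mutate learning rate
    children[:, 0] = np.clip(children[:, 0] + rng.integers(-1, 2, size=3), 2, 10)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(d, lr) for d, lr in pop])]
print("best max_depth, learning_rate:", best)
```

The F1 and AUC figures quoted in the snippet come from the study's full setup; this sketch only shows where SMOTEENN and a GA-driven hyperparameter search sit in such a pipeline.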