Showing 61 - 80 of 151 results for the search '(( binary a robust optimization algorithm ) OR ( binary based models optimization algorithm ))*', query time: 0.53s
  61. Results of Decision tree. by Balraj Preet Kaur (20370832)
    Published in 2024
    "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"
    (A hedged sketch of binary grey-wolf feature selection along these lines appears after the results list.)
  62. Adaboost classifier results. by Balraj Preet Kaur (20370832)
    Published in 2024
    "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"
  63. Results of LightGBM. by Balraj Preet Kaur (20370832)
    Published in 2024
    "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"

  64. Results of LightGBM. by Balraj Preet Kaur (20370832)
    Published in 2024
    "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"
  65. Feature selection process. by Balraj Preet Kaur (20370832)
    Published in 2024
    "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"

  66. Results of KNN. by Balraj Preet Kaur (20370832)
    Published in 2024
    "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"

  67. After upsampling. by Balraj Preet Kaur (20370832)
    Published in 2024
    "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"

  68. Results of Extra tree. by Balraj Preet Kaur (20370832)
    Published in 2024
    "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"

  69. Gradient boosting classifier results. by Balraj Preet Kaur (20370832)
    Published in 2024
    "…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed and to reduce the size of the feature set, feature selection is performed using binary grey wolf optimization algorithm. …"
  70. Pseudo Code of RBMO. by Chenyi Zhu (9383370)
    Published in 2025
    "…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …"
  71. P-value on CEC-2017 (Dim = 30). by Chenyi Zhu (9383370)
    Published in 2025
    "…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …"

  72. Memory storage behavior. by Chenyi Zhu (9383370)
    Published in 2025
    "…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …"

  73. Elite search behavior. by Chenyi Zhu (9383370)
    Published in 2025
    "…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …"

  74. Description of the datasets. by Chenyi Zhu (9383370)
    Published in 2025
    "…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …"
  75. S- and V-shaped transfer functions. by Chenyi Zhu (9383370)
    Published in 2025
    "…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …"

  76. S- and V-Type transfer function diagrams. by Chenyi Zhu (9383370)
    Published in 2025
    "…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …"
    (The standard S- and V-shaped transfer functions are sketched after the results list.)
  77. Collaborative hunting behavior. by Chenyi Zhu (9383370)
    Published in 2025
    "…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …"

  78. Friedman average rank sum test results. by Chenyi Zhu (9383370)
    Published in 2025
    "…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …"

  79. IRBMO vs. variant comparison adaptation data. by Chenyi Zhu (9383370)
    Published in 2025
    "…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. …"
  80. hiPRS algorithm process flow. by Michela C. Massi (14599915)
    Published in 2023
    "…From this dataset we can compute the MI between each interaction and the outcome and (D) obtain a ranked list (I_δ) based on this metric. (E) Starting from the interaction at the top of I_δ, hiPRS constructs I_K, selecting K (where K is user-specified) terms through the greedy optimization of the ratio between MI (relevance) and a suitable measure of similarity for interactions (redundancy) (cf. …"
    (A hedged sketch of this greedy relevance/redundancy selection step appears after the results list.)