Showing 1 - 20 results of 23 for search '(( binary task guided optimization algorithm ) OR ( binary samples when optimization algorithm ))*', query time: 0.54s
  1.

    The Pseudo-Code of the IRBMO Algorithm. by Chenyi Zhu (9383370)

    Published 2025
    “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
  2.

    IRBMO vs. meta-heuristic algorithms boxplot. by Chenyi Zhu (9383370)

    Published 2025
    “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
  3.

    IRBMO vs. feature selection algorithm boxplot. by Chenyi Zhu (9383370)

    Published 2025
    “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
  4.

    ROC curve for binary classification. by Nicodemus Songose Awarayi (18414494)

    Published 2024
    “…The proposed model yielded notable results, such as an accuracy of 93.45% and an area under the curve value of 0.99 when trained on the three classes. The model further showed superior results on binary classification compared with existing methods. …”
  5.

    Confusion matrix for binary classification. by Nicodemus Songose Awarayi (18414494)

    Published 2024
    “…The proposed model yielded notable results, such as an accuracy of 93.45% and an area under the curve value of 0.99 when trained on the three classes. The model further showed superior results on binary classification compared with existing methods. …”
  6.
  7.

    Pseudo Code of RBMO. by Chenyi Zhu (9383370)

    Published 2025
    “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
  8.

    P-value on CEC-2017(Dim = 30). by Chenyi Zhu (9383370)

    Published 2025
    “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
  9.

    Memory storage behavior. by Chenyi Zhu (9383370)

    Published 2025
    “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
  10.

    Elite search behavior. by Chenyi Zhu (9383370)

    Published 2025
    “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
  11.

    Description of the datasets. by Chenyi Zhu (9383370)

    Published 2025
    “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
  12.

    S and V shaped transfer functions. by Chenyi Zhu (9383370)

    Published 2025
    “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
  13.

    S- and V-Type transfer function diagrams. by Chenyi Zhu (9383370)

    Published 2025
    “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
  14.

    Collaborative hunting behavior. by Chenyi Zhu (9383370)

    Published 2025
    “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
  15.

    Friedman average rank sum test results. by Chenyi Zhu (9383370)

    Published 2025
    “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
  16.

    IRBMO vs. variant comparison adaptation data. by Chenyi Zhu (9383370)

    Published 2025
    “…To adapt to the feature selection problem, we convert the continuous optimization algorithm to binary form via transfer function, which further enhances the applicability of the algorithm. …”
  17.

    Thesis-RAMIS-Figs_Slides by Felipe Santibañez-Leal (10967991)

    Published 2024
    “…Finally, although the developed concepts, ideas and algorithms have been developed for inverse problems in geostatistics, the results are applicable to a wide range of disciplines where similar sampling problems need to be faced, including but not limited to design of communication networks, optimal integration and communication of swarms of robots and drones, and remote sensing.…”
  18.

    Testing results for classifying AD, MCI and NC. by Nicodemus Songose Awarayi (18414494)

    Published 2024
    “…The proposed model yielded notable results, such as an accuracy of 93.45% and an area under the curve value of 0.99 when trained on the three classes. The model further showed superior results on binary classification compared with existing methods. …”
  19.

    Summary of existing CNN models. by Nicodemus Songose Awarayi (18414494)

    Published 2024
    “…The proposed model yielded notable results, such as an accuracy of 93.45% and an area under the curve value of 0.99 when trained on the three classes. The model further showed superior results on binary classification compared with existing methods. …”
  20.

    Supplementary file 1_Comparative evaluation of fast-learning classification algorithms for urban forest tree species identification using EO-1 hyperion hyperspectral imagery.docx by Veera Narayana Balabathina (22518524)

    Published 2025
    “…Methods: Thirteen supervised classification algorithms were comparatively evaluated, encompassing traditional spectral/statistical classifiers—Maximum Likelihood, Mahalanobis Distance, Minimum Distance, Parallelepiped, Spectral Angle Mapper (SAM), Spectral Information Divergence (SID), and Binary Encoding—and machine learning algorithms including Decision Tree (DT), K-Nearest Neighbor (KNN), Support Vector Machine (SVM), Random Forest (RF), and Artificial Neural Network (ANN). …”