Showing 1 - 20 of 38 results for search '(( binary based guided optimization algorithm ) OR ( binary based quality optimization algorithm ))', query time: 1.90s
  1.
  2.
  3.

    QSAR model for predicting neuraminidase inhibitors of influenza A viruses (H1N1) based on adaptive grasshopper optimization algorithm by Z.Y. Algamal (5547620)

    Published 2020
    “…Obtaining a reliable QSAR model with few descriptors is an essential procedure in chemometrics. The binary grasshopper optimization algorithm (BGOA) is a new meta-heuristic optimization algorithm, which has been used successfully to perform feature selection. …”
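    The snippet frames feature selection as a binary optimization problem: each candidate solution is a 0/1 mask over the descriptors, scored by a fitness function that trades predictive value against subset size. BGOA itself is not reproduced here; as a point of reference only, the sketch below shows the simplest possible binary optimizer — blind random search over masks — which guided meta-heuristics like BGOA improve on. The toy fitness function and all names are hypothetical.

    ```python
    import random

    def random_binary_search(fitness, n_features, iters=2000, seed=0):
        """Baseline binary optimizer: sample 0/1 masks uniformly at random
        and keep the best one. Meta-heuristics such as BGOA replace this
        blind sampling with population-based, guided updates."""
        rng = random.Random(seed)
        best_mask, best_fit = None, float("-inf")
        for _ in range(iters):
            mask = [rng.randint(0, 1) for _ in range(n_features)]
            f = fitness(mask)
            if f > best_fit:
                best_mask, best_fit = mask, f
        return best_mask, best_fit

    # Hypothetical toy fitness: features 0, 3, 5 are "informative";
    # every selected feature costs 0.1, so smaller subsets are preferred.
    INFORMATIVE = {0, 3, 5}

    def toy_fitness(mask):
        hits = sum(mask[i] for i in INFORMATIVE)
        return hits - 0.1 * sum(mask)

    mask, fit = random_binary_search(toy_fitness, n_features=8)
    ```

    The best achievable fitness here is 2.7 (all three informative features, nothing else); a guided optimizer is expected to reach it in far fewer evaluations than random sampling.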
  4.

    ROC curve for binary classification. by Nicodemus Songose Awarayi (18414494)

    Published 2024
    “…The study introduced a scheme for enhancing images to improve the quality of the datasets. Specifically, an image enhancement algorithm based on histogram equalization and bilateral filtering techniques was deployed to reduce noise and enhance the quality of the images. …”
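    The enhancement scheme in this snippet combines histogram equalization (contrast stretching) with bilateral filtering (edge-preserving denoising). The paper's implementation is not shown here; as an illustration of the first step only, a minimal histogram-equalization sketch on a flat list of grayscale values (the toy patch is hypothetical, and the bilateral-filtering stage is omitted):

    ```python
    def equalize_histogram(pixels, levels=256):
        """Classic histogram equalization: remap intensities through the
        normalized cumulative distribution so the output histogram is
        approximately uniform across the full [0, levels) range."""
        hist = [0] * levels
        for p in pixels:
            hist[p] += 1
        # Cumulative distribution function over intensity levels.
        cdf, total = [], 0
        for h in hist:
            total += h
            cdf.append(total)
        cdf_min = next(c for c in cdf if c > 0)
        n = len(pixels)
        # Standard remapping: scale the CDF into the full intensity range.
        lut = [round((c - cdf_min) / max(n - cdf_min, 1) * (levels - 1))
               for c in cdf]
        return [lut[p] for p in pixels]

    # A dark, low-contrast 3x3 patch: values 50-54 get stretched to 0-255.
    patch = [50, 50, 51, 51, 52, 52, 53, 53, 54]
    out = equalize_histogram(patch)
    ```

    Because the lookup table is built from a monotone CDF, the relative ordering of intensities is preserved while the dynamic range expands.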
  5.

    Confusion matrix for binary classification. by Nicodemus Songose Awarayi (18414494)

    Published 2024
    “…The study introduced a scheme for enhancing images to improve the quality of the datasets. Specifically, an image enhancement algorithm based on histogram equalization and bilateral filtering techniques was deployed to reduce noise and enhance the quality of the images. …”
  6.

    The Pseudo-Code of the IRBMO Algorithm. by Chenyi Zhu (9383370)

    Published 2025
    “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. The experiments are based on 12 medical datasets, and the results show that IRBMO achieves optimal overall performance in key metrics such as fitness value, classification accuracy and specificity. …”
  7.

    Datasets and their properties. by Olaide N. Oyelade (14047002)

    Published 2023
    “…However, the underlying deficiency of the single binary optimizer is transferred to the quality of the features selected. …”
  8.

    Parameter settings. by Olaide N. Oyelade (14047002)

    Published 2023
    “…However, the underlying deficiency of the single binary optimizer is transferred to the quality of the features selected. …”
  9.

    IRBMO vs. meta-heuristic algorithms boxplot. by Chenyi Zhu (9383370)

    Published 2025
    “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. The experiments are based on 12 medical datasets, and the results show that IRBMO achieves optimal overall performance in key metrics such as fitness value, classification accuracy and specificity. …”
  10.

    IRBMO vs. feature selection algorithm boxplot. by Chenyi Zhu (9383370)

    Published 2025
    “…In order to comprehensively verify the performance of IRBMO, this paper designs a series of experiments to compare it with nine mainstream binary optimization algorithms. The experiments are based on 12 medical datasets, and the results show that IRBMO achieves optimal overall performance in key metrics such as fitness value, classification accuracy and specificity. …”
  11.

    Identification and quantitation of clinically relevant microbes in patient samples: Comparison of three k-mer based classifiers for speed, accuracy, and sensitivity by George S. Watts (7962206)

    Published 2019
    “…We tested the accuracy, sensitivity, and resource requirements of three top metagenomic taxonomic classifiers that use fast k-mer based algorithms: Centrifuge, CLARK, and KrakenUniq. …”
  12.

    SHAP bar plot. by Meng Cao (105914)

    Published 2025
    “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
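    The snippet compares four classifiers by their AUC on the test set. As a reminder of what that metric computes, a minimal sketch of AUC as the Mann-Whitney rank statistic, which equals the area under the ROC curve (the toy labels and scores below are hypothetical, not the paper's data):

    ```python
    def auc(labels, scores):
        """AUC as the probability that a randomly chosen positive example
        outranks a randomly chosen negative one (ties count half).
        Equivalent to the area under the ROC curve."""
        pos = [s for y, s in zip(labels, scores) if y == 1]
        neg = [s for y, s in zip(labels, scores) if y == 0]
        wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
                   for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    # Toy example: higher scores should indicate the positive class.
    labels = [1, 1, 1, 0, 0, 0]
    scores = [0.9, 0.8, 0.4, 0.6, 0.3, 0.2]
    value = auc(labels, scores)  # → 8/9 ≈ 0.889: one of nine pairs is misranked
    ```

    A perfect ranker scores 1.0 and a random one about 0.5, which is why values like 0.918 and 0.760 are directly comparable across model families.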
  13.

    Sample screening flowchart. by Meng Cao (105914)

    Published 2025
    “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
  14.

    Descriptive statistics for variables. by Meng Cao (105914)

    Published 2025
    “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
  15.

    SHAP summary plot. by Meng Cao (105914)

    Published 2025
    “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
  16.

    ROC curves for the test set of four models. by Meng Cao (105914)

    Published 2025
    “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
  17.

    Display of the web prediction interface. by Meng Cao (105914)

    Published 2025
    “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
  18.
  19.

    Flowchart scheme of the ML-based model. by Noshaba Qasmi (20405009)

    Published 2024
    “…I) Testing data consisting of 20% of the entire dataset. J) Optimization of hyperparameter tuning. K) Algorithm selection from all models. …”
  20.