Showing 21 - 29 results of 29 for search '(( binary data robust classification algorithm ) OR ( binary data codon optimization algorithm ))*', query time: 0.44s
  1. 21

    The architecture of the BI-LSTM model. by Arshad Hashmi (13835488)

    Published 2024
    “…The model’s binary and multi-class classification accuracies on the UNSW-NB15 dataset are 99.56% and 99.45%, respectively. …”
  2. 22

    Comparison of accuracy and DR on UNSW-NB15. by Arshad Hashmi (13835488)

    Published 2024
    “…The model’s binary and multi-class classification accuracies on the UNSW-NB15 dataset are 99.56% and 99.45%, respectively. …”
  3. 23

    Comparison of DR and FPR of UNSW-NB15. by Arshad Hashmi (13835488)

    Published 2024
    “…The model’s binary and multi-class classification accuracies on the UNSW-NB15 dataset are 99.56% and 99.45%, respectively. …”
  4. 24

    Table_1_Near infrared spectroscopy for cooking time classification of cassava genotypes.docx by Massaine Bandeira e Sousa (7866242)

    Published 2024
    “…Two NIRs devices, the portable QualitySpec® Trek (QST) and the benchtop NIRFlex N-500 were used to collect spectral data. Classification of genotypes was carried out using the K-nearest neighbor algorithm (KNN) and partial least squares (PLS) models. …”
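The KNN step named in this result can be sketched as follows. This is a hedged, illustrative example on synthetic "spectra" (the data, shapes, and class structure are invented, not taken from the cassava dataset); the PLS model mentioned alongside it is omitted for brevity.

```python
# Illustrative KNN classification of fake NIR spectra (all data synthetic).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Fake spectra: 60 samples x 50 wavelengths, two cooking-time classes
# separated by a mean shift (purely for demonstration).
X = np.vstack([rng.normal(0.0, 1.0, (30, 50)),
               rng.normal(1.5, 1.0, (30, 50))])
y = np.array([0] * 30 + [1] * 30)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
acc = knn.score(X_test, y_test)
```

On well-separated synthetic classes like these, KNN's held-out accuracy should be high; with real spectra, preprocessing (e.g., scaling) matters considerably more.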
  5. 25
  6. 26

    Models and Dataset by M RN (9866504)

    Published 2025
    “…Its simplicity and lack of algorithm-specific parameters make it computationally efficient and easy to apply in high-dimensional problems such as gene selection for cancer classification.…”
  7. 27

    Supplementary Material 8 by Nishitha R Kumar (19750617)

    Published 2025
    “…Radial basis function kernel-support vector machine (RBF-SVM): A more flexible version of SVM that uses a non-linear kernel to capture complex relationships in genomic data, improving classification accuracy. Extra trees classifier: This tree-based ensemble method enhances classification by randomly selecting features and thresholds, improving robustness in E. coli strain differentiation.…”
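The two classifiers named in this snippet, an RBF-kernel SVM and an extra-trees ensemble, can be sketched with scikit-learn. The feature matrix below is synthetic and purely illustrative, not the E. coli genomic data referenced above.

```python
# Illustrative RBF-SVM and extra-trees classification (synthetic data).
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Fake "genomic" features: 120 samples x 30 features, two strains.
X = np.vstack([rng.normal(0.0, 1.0, (60, 30)),
               rng.normal(1.2, 1.0, (60, 30))])
y = np.array([0] * 60 + [1] * 60)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=1, stratify=y)

# RBF kernel lets the SVM draw non-linear decision boundaries.
rbf_svm = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
# Extra trees randomize both feature and threshold selection per split.
extra_trees = ExtraTreesClassifier(n_estimators=200, random_state=1).fit(X_tr, y_tr)

svm_acc = rbf_svm.score(X_te, y_te)
et_acc = extra_trees.score(X_te, y_te)
```

The randomized splits in extra trees trade a little per-tree accuracy for lower variance across the ensemble, which is the robustness property the snippet alludes to.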
  8. 28

    iNCog-EEG (ideal vs. Noisy Cognitive EEG for Workload Assessment) Dataset by Fariya Bintay Shafi (21692408)

    Published 2025
    “…Inside each folder, four .EDF files represent the workload conditions:

        subxx_nw.EDF → No Workload (resting state)
        subxx_lw.EDF → Low Workload (easy multitasking)
        subxx_mw.EDF → Moderate Workload (medium multitasking)
        subxx_hw.EDF → High Workload (hard multitasking)

    Subjects 01–30: Clean EEG recordings. Subjects 31–40: Noisy EEG recordings with real-world artifacts. This structure ensures straightforward differentiation between clean vs. noisy data and across workload levels.

    Applications: this dataset can be applied to a wide range of research areas, including EEG signal denoising and artifact rejection; binary and hierarchical cognitive workload classification; development of robust Brain–Computer Interfaces (BCIs); benchmarking algorithms under ideal and noisy conditions; and multitasking and mental workload assessment in real-world scenarios.

    By combining controlled multitasking protocols with deliberately introduced environmental noise, iNCog-EEG provides a comprehensive benchmark for advancing EEG-based workload recognition systems in both clean and challenging conditions.…”
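The file-naming scheme described in this result maps cleanly to labels. The small helper below is illustrative (it is not part of the dataset's own tooling); it parses the documented `subxx_<condition>.EDF` pattern and applies the stated clean/noisy subject split.

```python
# Hypothetical helper for the iNCog-EEG naming scheme described above.
import re

WORKLOAD_LABELS = {
    "nw": "No Workload",
    "lw": "Low Workload",
    "mw": "Moderate Workload",
    "hw": "High Workload",
}

def parse_edf_name(filename):
    """Return (subject_id, workload_label, quality) for names like 'sub07_mw.EDF'."""
    m = re.fullmatch(r"sub(\d{2})_(nw|lw|mw|hw)\.EDF", filename)
    if m is None:
        raise ValueError(f"unrecognized file name: {filename}")
    subject = int(m.group(1))
    # Per the dataset description: subjects 01-30 are clean, 31-40 noisy.
    quality = "clean" if subject <= 30 else "noisy"
    return subject, WORKLOAD_LABELS[m.group(2)], quality

print(parse_edf_name("sub07_mw.EDF"))  # → (7, 'Moderate Workload', 'clean')
```

Such a parser makes it easy to build a labeled index of all recordings before loading any EDF data.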
  9. 29

    Image_1_Validation of miRNA signatures for ovarian cancer earlier detection in the pre-diagnosis setting using machine learning approaches.pdf by Konrad Stawiski (4753380)

    Published 2024
    “…We employed the extreme gradient boosting (XGBoost) algorithm to train a binary classification model using 70% of the available data, while the model was tested on the remaining 30% of the dataset.…”
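The 70/30 train/test setup in this last result is a standard pattern. The sketch below illustrates it on synthetic data; because XGBoost may not be installed everywhere, scikit-learn's `GradientBoostingClassifier` is used as a stand-in for `XGBClassifier` (a deliberate, clearly-named substitution; the split-and-evaluate pattern is identical).

```python
# Illustrative 70/30 binary classification with gradient boosting
# (GradientBoostingClassifier standing in for XGBoost; data is synthetic).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
# Fake expression matrix: 200 samples x 20 features, two classes.
X = np.vstack([rng.normal(0.0, 1.0, (100, 20)),
               rng.normal(1.0, 1.0, (100, 20))])
y = np.array([0] * 100 + [1] * 100)

# Train on 70% of the data, test on the held-out 30%, as in the snippet.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)
test_acc = model.score(X_test, y_test)
```

With real XGBoost installed, replacing the model line with `xgboost.XGBClassifier()` leaves the rest of the workflow unchanged.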