Showing 121 - 140 results of 160 for search '(( binary data based optimization algorithm ) OR ( library based process optimization algorithm ))*', query time: 0.53s
  1. 121

    Presentation_1_Modified GAN Augmentation Algorithms for the MRI-Classification of Myocardial Scar Tissue in Ischemic Cardiomyopathy.PPTX by Umesh C. Sharma (10785063)

    Published 2021
    “…Currently, there are no optimized deep-learning algorithms for the automated classification of scarred vs. normal myocardium. …”
  2. 122

    Flowchart scheme of the ML-based model. by Noshaba Qasmi (20405009)

    Published 2024
    “…J) Optimization of hyperparameter tuning. K) Algorithm selection from all models. …”
  3. 123
  4. 124

    Algoritmo de clasificación de expresiones de odio por tipos en español (Algorithm for classifying hate expressions by type in Spanish) by Daniel Pérez Palau (11097348)

    Published 2024
    “…Model Architecture: the model is based on pysentimiento/robertuito-base-uncased with the following modifications: a dense classification layer added over the base model; input IDs and attention masks used as inputs; a multi-class output over 5 hate categories.
    Dataset (HATEMEDIA): a custom hate speech dataset categorized by type. Labels: 5 hate-type categories (0-4). Preprocessing: null values removed from text and labels; reindexing and relabeling (original labels adjusted by subtracting 1); category 2 excluded during training; category 5 converted to category 2.
    Training configuration: batch size 128; 5 epochs; learning rate 2e-5 with 10% warmup steps; early stopping with patience=2; balanced class weights to handle class imbalance.
    Custom metrics: recall for specific classes (focus on class 2); precision for specific classes (focus on class 3); weighted F1-score; AUC-PR; recall at precision=0.6 (class 3); precision at recall=0.6 (class 2).
    Evaluation: macro recall, precision, and F1-score; one-vs-rest AUC; accuracy; per-class metrics; confusion matrix; full classification report.
    Data preprocessing: tokenization to a maximum length of 128 tokens (truncation and padding); one-hot label encoding for multi-class classification; 80%/10%/10% train/validation/test split.
    Optimization: Adam optimizer with linear warmup scheduling; categorical cross-entropy loss (from_logits=True); class weights computed automatically to handle imbalance.
    Requirements: TensorFlow, Transformers, scikit-learn, pandas, datasets, matplotlib, seaborn, numpy.
    Usage: (1) Data format: CSV file or pandas DataFrame with a required text column (string type) and an optional label column (integer, 0-4) for evaluation. (2) Text preprocessing: automatic tokenization to a maximum of 128 tokens; longer texts are truncated; special characters, URLs, and emojis are handled. (3) Label encoding: the model classifies hate speech into 5 categories (0-4), e.g. 0 = political hatred, expressions directed against individuals or groups based on political orientation.…”
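    The configuration in this description maps onto a fairly standard TensorFlow/Transformers setup. The sketch below is an illustrative assumption, not the item's released code: the pooling choice, classification head, and tokenization call are inferred from the listed hyperparameters (128-token inputs, 5 classes, Adam at 2e-5, categorical cross-entropy from logits).

```python
# Rough sketch of the described setup (an assumption, not the released code):
# robertuito encoder + dense 5-way head, Adam at 2e-5, cross-entropy from logits.
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModel

MODEL_NAME = "pysentimiento/robertuito-base-uncased"
MAX_LEN, NUM_CLASSES = 128, 5

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = TFAutoModel.from_pretrained(MODEL_NAME)  # add from_pt=True if only PyTorch weights are published

input_ids = tf.keras.Input(shape=(MAX_LEN,), dtype=tf.int32, name="input_ids")
attention_mask = tf.keras.Input(shape=(MAX_LEN,), dtype=tf.int32, name="attention_mask")

# Take the first-token hidden state and classify into the 5 hate categories.
hidden = encoder(input_ids, attention_mask=attention_mask).last_hidden_state[:, 0, :]
logits = tf.keras.layers.Dense(NUM_CLASSES, name="classifier")(hidden)

model = tf.keras.Model([input_ids, attention_mask], logits)
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),
    loss=tf.keras.losses.CategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)

# Tokenize as the description states: truncation and padding to 128 tokens.
batch = tokenizer(["ejemplo de texto"], truncation=True, padding="max_length",
                  max_length=MAX_LEN, return_tensors="tf")
print(model([batch["input_ids"], batch["attention_mask"]]).shape)  # (1, 5)
```

    The class weights, linear warmup schedule, and early stopping mentioned in the description would be supplied when calling fit() and are omitted here for brevity.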
  5. 125
  6. 126
  7. 127
  8. 128

    GSE96058 information. by Sepideh Zununi Vahed (9861298)

    Published 2024
    “…Subsequently, feature selection was conducted using ANOVA and binary Particle Swarm Optimization (PSO). During the analysis phase, the discriminative power of the selected features was evaluated using machine learning classification algorithms. …”
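    As a rough illustration of the pipeline this entry and the next describe (ANOVA pre-filtering followed by binary Particle Swarm Optimization for feature selection), the sketch below wraps a k-NN classifier as the fitness function; the classifier choice, swarm size, and coefficients are assumptions, not details taken from the study.

```python
# Minimal sketch: ANOVA F-test pre-filter, then binary PSO over 0/1 feature masks,
# scored by cross-validated accuracy of a k-NN classifier on the selected columns.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=60, n_informative=8, random_state=0)

# ANOVA pre-filter keeps the top 30 features before the swarm search runs.
X = SelectKBest(f_classif, k=30).fit_transform(X, y)
n_feat, n_particles, n_iter = X.shape[1], 15, 20

def fitness(bits):
    mask = bits.astype(bool)
    if not mask.any():
        return 0.0
    return cross_val_score(KNeighborsClassifier(n_neighbors=5), X[:, mask], y, cv=3).mean()

pos = rng.integers(0, 2, size=(n_particles, n_feat))      # binary positions
vel = rng.normal(0, 0.1, size=(n_particles, n_feat))       # real-valued velocities
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(vel.shape), rng.random(vel.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    # Sigmoid transfer function turns velocities into bit-flip probabilities.
    pos = (rng.random(vel.shape) < 1.0 / (1.0 + np.exp(-vel))).astype(int)
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print("selected features:", np.flatnonzero(gbest), "CV accuracy:", round(pbest_fit.max(), 3))
```

    Squashing velocities through a sigmoid and thresholding against uniform noise is what makes the PSO update binary rather than continuous.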
  9. 129

    The performance of classifiers. by Sepideh Zununi Vahed (9861298)

    Published 2024
    “…Subsequently, feature selection was conducted using ANOVA and binary Particle Swarm Optimization (PSO). During the analysis phase, the discriminative power of the selected features was evaluated using machine learning classification algorithms. …”
  10. 130

    Data_Sheet_1_CLGBO: An Algorithm for Constructing Highly Robust Coding Sets for DNA Storage.docx by Yanfen Zheng (3814507)

    Published 2021
    “…In this study, we describe an enhanced gradient-based optimizer that includes the Cauchy and Levy mutation strategy (CLGBO) to construct DNA coding sets, which are used as primer and address libraries. …”
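    The two mutation operators named here can be sketched in isolation. The snippet below only illustrates Cauchy mutation and a Levy-flight step on a generic real-valued objective; the gradient-based optimizer itself and the DNA-storage constraints (GC content, Hamming distance, homopolymer runs) that CLGBO actually handles are not reproduced.

```python
# Illustration of Cauchy and Levy mutation as perturbation strategies; the
# sphere objective is a placeholder, not a DNA coding-set score.
import numpy as np
from math import gamma, sin, pi

rng = np.random.default_rng(1)

def cauchy_mutation(x, scale=0.1):
    # Heavy-tailed Cauchy noise occasionally makes large jumps, helping escape local optima.
    return x + scale * rng.standard_cauchy(size=x.shape)

def levy_step(x, beta=1.5, scale=0.01):
    # Mantegna's algorithm for Levy-distributed step lengths.
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, size=x.shape)
    v = rng.normal(0, 1, size=x.shape)
    return x + scale * u / np.abs(v) ** (1 / beta)

def sphere(x):
    # Placeholder objective; CLGBO would instead score candidate coding sets.
    return float(np.sum(x ** 2))

best = rng.normal(size=8)
for _ in range(200):
    for candidate in (cauchy_mutation(best), levy_step(best)):
        if sphere(candidate) < sphere(best):
            best = candidate
print("best objective:", round(sphere(best), 4))
```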
  11. 131
  12. 132

    Presentation_1_Optimization of the k-nearest-neighbors model for summer Arctic Sea ice prediction.pdf by Yongcheng Lin (776525)

    Published 2023
    “…In this study, we utilized a sea ice concentration dataset obtained from satellite remote sensing and applied the k-nearest-neighbors (Ice-kNN) machine learning model to forecast the summer Arctic sea ice concentration and extent at a 122-day prediction horizon. Based on the physical characteristics of summer sea ice, different algorithms are employed to optimize the prediction model. …”
  13. 133

    Presentation_1_Optimization of the k-nearest-neighbors model for summer Arctic Sea ice prediction.pdf by Yongcheng Lin (776525)

    Published 2023
    “…In this study, we utilized a sea ice concentration dataset obtained from satellite remote sensing and applied the k-nearest-neighbors (Ice-kNN) machine learning model to forecast the summer Arctic sea ice concentration and extent at a 122-day prediction horizon. Based on the physical characteristics of summer sea ice, different algorithms are employed to optimize the prediction model. …”
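    A minimal sketch of the k-nearest-neighbors idea behind an Ice-kNN-style forecast, under the assumption that historical early-summer states are matched to the current state and their later-season outcomes averaged; the synthetic data, feature layout, and k value below are placeholders, not the paper's configuration.

```python
# Analogue-style forecast: nearest historical years by early-summer sea-ice state,
# distance-weighted average of their later-season extents.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(2)

# Rows: historical years. Columns: early-summer sea-ice-concentration features
# (e.g. regionally averaged values). Target: extent 122 days later (million km^2).
X_hist = rng.random((40, 25))
y_hist = 4.0 + 3.0 * X_hist.mean(axis=1) + rng.normal(0, 0.1, 40)

knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
knn.fit(X_hist, y_hist)

x_now = rng.random((1, 25))  # current early-summer state
print("predicted extent:", round(knn.predict(x_now)[0], 2))
```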
  14. 134
  15. 135

    Table_1_An efficient decision support system for leukemia identification utilizing nature-inspired deep feature optimization.pdf by Muhammad Awais (263096)

    Published 2024
    “…Next, a hybrid feature extraction approach is presented leveraging transfer learning from selected deep neural network models, InceptionV3 and DenseNet201, to extract comprehensive feature sets. To optimize feature selection, a customized binary Grey Wolf Algorithm is utilized, achieving an impressive 80% reduction in feature size while preserving key discriminative information. …”
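    The hybrid feature-extraction step described here (transfer learning from InceptionV3 and DenseNet201) can be sketched as follows; the input size, pooling, and preprocessing are assumptions, and the binary Grey Wolf selection stage that follows it is not shown.

```python
# Illustrative deep-feature extraction: ImageNet-pretrained InceptionV3 and
# DenseNet201 backbones with global average pooling, features concatenated per image.
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications import InceptionV3, DenseNet201
from tensorflow.keras.applications.inception_v3 import preprocess_input as prep_inc
from tensorflow.keras.applications.densenet import preprocess_input as prep_dense

inception = InceptionV3(weights="imagenet", include_top=False, pooling="avg")
densenet = DenseNet201(weights="imagenet", include_top=False, pooling="avg")

def extract_features(images):
    """images: float array of shape (n, 299, 299, 3) with values in [0, 255]."""
    f1 = inception.predict(prep_inc(images.copy()), verbose=0)   # (n, 2048)
    f2 = densenet.predict(prep_dense(images.copy()), verbose=0)  # (n, 1920)
    return np.concatenate([f1, f2], axis=1)                      # (n, 3968)

features = extract_features(np.random.rand(4, 299, 299, 3).astype("float32") * 255)
print(features.shape)
```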
  16. 136
  17. 137
  18. 138
  19. 139

    Contextual Dynamic Pricing with Strategic Buyers by Pangpang Liu (18886419)

    Published 2024
    “…This underscores the rate optimality of our policy. Importantly, our policy is not a mere amalgamation of existing dynamic pricing policies and strategic behavior handling algorithms. …”
  20. 140

    Supplementary file 1_Comparative evaluation of fast-learning classification algorithms for urban forest tree species identification using EO-1 hyperion hyperspectral imagery.docx by Veera Narayana Balabathina (22518524)

    Published 2025
    “…Methods: Thirteen supervised classification algorithms were comparatively evaluated, encompassing traditional spectral/statistical classifiers (Maximum Likelihood, Mahalanobis Distance, Minimum Distance, Parallelepiped, Spectral Angle Mapper (SAM), Spectral Information Divergence (SID), and Binary Encoding) and machine learning algorithms including Decision Tree (DT), K-Nearest Neighbor (KNN), Support Vector Machine (SVM), Random Forest (RF), and Artificial Neural Network (ANN). …”
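    For the machine-learning subset of these classifiers (DT, KNN, SVM, RF, ANN), a comparative evaluation can be sketched with scikit-learn as below; the synthetic spectra are placeholders, and spectral classifiers such as SAM or SID would need dedicated implementations.

```python
# Cross-validated comparison of the sklearn-available classifiers from the list.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier

# Stand-in for labeled hyperspectral pixels: 150 bands, 6 tree-species classes.
X, y = make_classification(n_samples=600, n_features=150, n_informative=30,
                           n_classes=6, random_state=0)

models = {
    "DT": DecisionTreeClassifier(random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "ANN": make_pipeline(StandardScaler(), MLPClassifier(max_iter=500, random_state=0)),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```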