Showing 61 - 75 results of 75 for search '(( final batch processing optimization algorithm ) OR ( binary b model optimization algorithm ))*', query time: 0.89s
  1. 61

    Environmental DNA metabarcoding to monitor tropical reef fishes in Santa Marta by camille albouy (3800893)

    Published 2021
    “…For example, with our selected threshold of 0.001, if a sequence has a total read count of 10,000 at the P1_A1 plate position of library A, all detections of this sequence below 10 reads (10,000 * 0.001 = 10) are discarded at plate position P1_A1 for library B if libraries A and B belong to the same sequencing batch. Swarm clustering: We applied a second bioinformatics workflow, the clustering algorithm SWARM, which uses sequence similarity and abundance patterns to cluster multiple sequence variants into MOTUs (Molecular Operational Taxonomic Units; Mahé et al., 2014; Rognes et al., 2016). …”
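    The per-batch thresholding rule quoted above can be sketched in a few lines of Python. This is a minimal illustration of the stated rule, not the authors' pipeline; the 0.001 ratio and the 10,000-read example come from the snippet, and the library names are placeholders:

    ```python
    def filter_low_abundance(read_counts, threshold=0.001):
        """Discard detections of a sequence whose read count falls below
        threshold * the maximum read count observed for that sequence at
        the same plate position across libraries of one sequencing batch
        (a tag-jump / cross-contamination filter)."""
        cutoff = max(read_counts.values()) * threshold
        return {lib: n for lib, n in read_counts.items() if n >= cutoff}

    # Example from the abstract: 10,000 reads at plate position P1_A1 in
    # library A give a cutoff of 10 reads for every library in the batch,
    # so library B's 7-read detection is discarded.
    counts = {"library_A": 10_000, "library_B": 7}
    print(filter_low_abundance(counts))  # → {'library_A': 10000}
    ```
    
    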
  2. 62

    Datasheet1_A Workflow for Rapid Unbiased Quantification of Fibrillar Feature Alignment in Biological Images.zip by Stefania Marcotti (5896853)

    Published 2021
    “…Implementation is made available in both MATLAB and Python for wider accessibility, with example datasets for single images and batch processing. Additionally, we include an approach to automatically search parameters for optimum window and neighborhood sizes, as well as to measure the decay in alignment over progressively increasing length scales.…”
  3. 63

    Aluminum alloy industrial materials defect by Ying Han (20349093)

    Published 2024
    “…The settings for the other parameters are as follows: epochs: 200, patience: 50, batch: 16, imgsz: 640, pretrained: true, optimizer: SGD, close_mosaic: 10, iou: 0.7, momentum: 0.937, weight_decay: 0.0005, box: 7.5, cls: 0.5, dfl: 1.5, pose: 12.0, kobj: 1.0, save_dir: runs/train. The defeat_dataset.…”
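    The flattened parameter string in this record maps onto a plain key–value training configuration. A minimal sketch of holding it as a Python dict follows; the parameter names and values are taken verbatim from the snippet, but the training framework (the names resemble Ultralytics-YOLO-style arguments) is an assumption, not stated in the record:

    ```python
    # Training hyperparameters exactly as listed in the dataset description.
    # The framework that consumes them is not specified in the record.
    train_config = {
        "epochs": 200, "patience": 50, "batch": 16, "imgsz": 640,
        "pretrained": True, "optimizer": "SGD", "close_mosaic": 10,
        "iou": 0.7, "momentum": 0.937, "weight_decay": 0.0005,
        "box": 7.5, "cls": 0.5, "dfl": 1.5, "pose": 12.0, "kobj": 1.0,
        "save_dir": "runs/train",
    }
    ```
    
    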
  4. 64

    Table 1_Heavy metal biomarkers and their impact on hearing loss risk: a machine learning framework analysis.docx by Ali Nabavi (21097424)

    Published 2025
    “…Multiple machine learning algorithms, including Random Forest, XGBoost, Gradient Boosting, Logistic Regression, CatBoost, and MLP, were optimized and evaluated. …”
  5. 65

    Code by Baoqiang Chen (21099509)

    Published 2025
    “…After the convolution and pooling operations, batch normalization is applied to stabilize and accelerate the training process.…”
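    The batch normalization step mentioned above follows a standard formula: each activation is shifted and scaled by statistics computed over the batch. A minimal pure-Python sketch (per-feature statistics over a batch of scalars; the learnable scale/shift parameters of a full layer are omitted for brevity, and this is an illustration of the general technique, not the authors' code):

    ```python
    import math

    def batch_norm(batch, eps=1e-5):
        """Normalize a batch of scalar activations to zero mean and
        (approximately) unit variance: x_hat = (x - mean) / sqrt(var + eps)."""
        n = len(batch)
        mean = sum(batch) / n
        var = sum((x - mean) ** 2 for x in batch) / n
        return [(x - mean) / math.sqrt(var + eps) for x in batch]

    normed = batch_norm([2.0, 4.0, 6.0, 8.0])
    # The normalized activations have mean ~0 and variance ~1, which is
    # what stabilizes and accelerates training of the downstream layers.
    ```
    
    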
  6. 66

    Core data by Baoqiang Chen (21099509)

    Published 2025
    “…After the convolution and pooling operations, batch normalization is applied to stabilize and accelerate the training process.…”
  7. 67

    Machine Learning-Ready Dataset for Cytotoxicity Prediction of Metal Oxide Nanoparticles by Soham Savarkar (21811825)

    Published 2025
    “…Encoding: Categorical variables such as surface coating and cell type were grouped into logical classes and label-encoded to enable model compatibility. Applications and Model Compatibility: The dataset is optimized for use in supervised learning workflows and has been tested with algorithms such as Gradient Boosting Machines (GBM), Support Vector Machines (SVM-RBF), Random Forests, and Principal Component Analysis (PCA) for feature reduction.…”
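    Label encoding as described in this record (grouping categories into logical classes, then mapping each class to an integer) can be sketched generically. The category names below are hypothetical stand-ins for a surface-coating column, not values from the dataset:

    ```python
    def label_encode(values):
        """Map each distinct category to a stable integer code
        (sorting the classes keeps the encoding deterministic)."""
        classes = sorted(set(values))
        mapping = {c: i for i, c in enumerate(classes)}
        return [mapping[v] for v in values], mapping

    # Hypothetical surface-coating column after grouping into classes:
    codes, mapping = label_encode(["PEG", "citrate", "none", "PEG"])
    print(codes)    # → [0, 1, 2, 0]
    print(mapping)  # → {'PEG': 0, 'citrate': 1, 'none': 2}
    ```
    
    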
  8. 68

    Impact of Q on cost analysis diagram. by Guanghui Chen (316677)

    Published 2025
    “…Finally, an empirical example examining real shared-bike stations in the Yanta district of Xi’an, China, verified the effectiveness of the model and algorithm. …”
  9. 69

    Stations spatial distribution diagram. by Guanghui Chen (316677)

    Published 2025
    “…Finally, an empirical example examining real shared-bike stations in the Yanta district of Xi’an, China, verified the effectiveness of the model and algorithm. …”
  10. 70

    shared bikes distribution vehicle routes diagram. by Guanghui Chen (316677)

    Published 2025
    “…Finally, an empirical example examining real shared-bike stations in the Yanta district of Xi’an, China, verified the effectiveness of the model and algorithm. …”
  11. 71

    Stations location and demand. by Guanghui Chen (316677)

    Published 2025
    “…Finally, an empirical example examining real shared-bike stations in the Yanta district of Xi’an, China, verified the effectiveness of the model and algorithm. …”
  12. 72

    Table3_Identifying In Vitro Cultured Human Hepatocytes Markers with Machine Learning Methods Based on Single-Cell RNA-Seq Data.XLSX by ZhanDong Li (11653330)

    Published 2022
    “…Then, several classifiers were trained and evaluated to obtain optimal classifiers and optimal feature subsets, using three classification algorithms (random forest, k-nearest neighbor, and decision tree) and the incremental feature selection method. …”
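    The incremental feature selection (IFS) procedure named in this record evaluates nested subsets built from a ranked feature list and keeps the best-scoring one. A generic sketch follows; the feature ranking and the toy scorer (standing in for classifier cross-validation accuracy) are illustrative assumptions, not the authors' classifiers:

    ```python
    def incremental_feature_selection(ranked_features, score):
        """Evaluate nested subsets [f1], [f1, f2], ... taken from a ranked
        feature list and return the best-scoring subset (IFS)."""
        best_subset, best_score = [], float("-inf")
        for k in range(1, len(ranked_features) + 1):
            subset = ranked_features[:k]
            s = score(subset)
            if s > best_score:
                best_subset, best_score = subset, s
        return best_subset, best_score

    # Toy scorer: peaks when exactly the two informative features are
    # included, with a small penalty per extra feature (hypothetical
    # gene names, used here only to exercise the procedure).
    informative = {"geneA", "geneB"}
    def toy_score(subset):
        return len(informative & set(subset)) - 0.1 * len(subset)

    ranked = ["geneA", "geneB", "geneC", "geneD"]
    best, best_score = incremental_feature_selection(ranked, toy_score)
    print(best)  # → ['geneA', 'geneB']
    ```
    
    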
  13. 73

    Table4_Identifying In Vitro Cultured Human Hepatocytes Markers with Machine Learning Methods Based on Single-Cell RNA-Seq Data.XLSX by ZhanDong Li (11653330)

    Published 2022
    “…Then, several classifiers were trained and evaluated to obtain optimal classifiers and optimal feature subsets, using three classification algorithms (random forest, k-nearest neighbor, and decision tree) and the incremental feature selection method. …”
  14. 74

    Table1_Identifying In Vitro Cultured Human Hepatocytes Markers with Machine Learning Methods Based on Single-Cell RNA-Seq Data.XLSX by ZhanDong Li (11653330)

    Published 2022
    “…Then, several classifiers were trained and evaluated to obtain optimal classifiers and optimal feature subsets, using three classification algorithms (random forest, k-nearest neighbor, and decision tree) and the incremental feature selection method. …”
  15. 75

    Table2_Identifying In Vitro Cultured Human Hepatocytes Markers with Machine Learning Methods Based on Single-Cell RNA-Seq Data.XLSX by ZhanDong Li (11653330)

    Published 2022
    “…Then, several classifiers were trained and evaluated to obtain optimal classifiers and optimal feature subsets, using three classification algorithms (random forest, k-nearest neighbor, and decision tree) and the incremental feature selection method. …”