Search alternatives:
network optimization » swarm optimization, wolf optimization
model optimization » codon optimization, global optimization, based optimization
binary based » library based, linac based, binary mask
binary b » binary _
b model » _ model, a model, 2 model
-
101
Models and Dataset
Published 2025 “…TJO (Tom and Jerry Optimization): TJO is a nature-inspired metaheuristic algorithm that models the predator-prey dynamics of the cartoon characters Tom (predator) and Jerry (prey). …”
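The snippet only names the predator-prey idea behind TJO; the published update rules are not given here. As a hedged illustration of how such a metaheuristic is typically structured (a population of "prey" candidates attracted toward the best "predator" solution, with a shrinking random escape term), here is a minimal toy sketch. It is an assumption-laden stand-in, not the authors' TJO; the function name, step schedule, and 0.1 escape factor are all invented for the example.

```python
import random

def predator_prey_search(objective, dim, bounds, n_prey=20, iters=100, seed=0):
    """Toy predator-prey metaheuristic (illustrative only, NOT the published TJO).

    The best solution found so far plays the predator (Tom); the remaining
    candidates (Jerry agents) move partly toward it and partly in a random
    escape direction whose magnitude shrinks over the iterations.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    prey = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_prey)]
    best = list(min(prey, key=objective))  # copy so in-place moves don't corrupt it
    for t in range(iters):
        step = (hi - lo) * (1 - t / iters)  # escape radius decays over time
        for p in prey:
            for d in range(dim):
                # attraction toward the predator plus a small random escape move
                p[d] += rng.uniform(0, 1) * (best[d] - p[d]) + rng.uniform(-step, step) * 0.1
                p[d] = min(max(p[d], lo), hi)  # clamp to the search bounds
        cand = min(prey, key=objective)
        if objective(cand) < objective(best):
            best = list(cand)
    return best

# usage: minimize the sphere function in 3 dimensions
best = predator_prey_search(lambda x: sum(v * v for v in x), dim=3, bounds=(-5, 5))
```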
-
102
Algoritmo de clasificación de expresiones de odio por tipos en español (Algorithm for classifying hate expressions by type in Spanish)
Published 2024 “…File Structure. The code generates and saves: the weights of the trained model (.h5), the configured tokenizer, the training history in CSV, and a requirements file. Important notes: the model excludes category 2 during training; it implements transfer learning from a pre-trained model for binary hate detection; it includes early-stopping callbacks to prevent overfitting; and it uses class weighting to handle category imbalances. The process of creating this algorithm is explained in the technical report located at: Blanco-Valencia, X., De Gregorio-Vicente, O., Ruiz Iniesta, A., & Said-Hung, E. (2025). …”
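The entry above mentions class weighting for category imbalance but not the weighting scheme. A common recipe, and one that frameworks such as Keras accept directly via the `class_weight` argument of `fit`, is inverse-frequency weighting; the sketch below shows that standard choice, which may differ from what the authors actually used.

```python
from collections import Counter

def inverse_frequency_weights(labels):
    """Class weights inversely proportional to class frequency.

    weight_c = total / (n_classes * count_c), so rare classes get
    proportionally larger weights during training.
    """
    counts = Counter(labels)
    total = len(labels)
    n_classes = len(counts)
    return {c: total / (n_classes * n) for c, n in counts.items()}

# usage: a 3:1 imbalanced binary label column
weights = inverse_frequency_weights([0, 0, 0, 1])
# the minority class (1) receives three times the weight of class 0
```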
-
103
Supplementary file 1_Comparative evaluation of fast-learning classification algorithms for urban forest tree species identification using EO-1 hyperion hyperspectral imagery.docx
Published 2025 “…Methods: Thirteen supervised classification algorithms were comparatively evaluated, encompassing traditional spectral/statistical classifiers (Maximum Likelihood, Mahalanobis Distance, Minimum Distance, Parallelepiped, Spectral Angle Mapper (SAM), Spectral Information Divergence (SID), and Binary Encoding) and machine learning algorithms including Decision Tree (DT), K-Nearest Neighbor (KNN), Support Vector Machine (SVM), Random Forest (RF), and Artificial Neural Network (ANN). …”
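Of the classifiers listed above, the Spectral Angle Mapper has a particularly compact textbook definition: it measures the angle between a pixel's spectrum and each reference spectrum and assigns the class with the smallest angle. The sketch below implements that standard formulation; the function names and the toy spectral library are illustrative, not taken from the paper.

```python
import math

def spectral_angle(pixel, reference):
    """SAM angle (radians) between a pixel spectrum and a reference spectrum.

    theta = arccos( (p . r) / (|p| |r|) ); smaller angle = closer match.
    """
    dot = sum(p * r for p, r in zip(pixel, reference))
    norm_p = math.sqrt(sum(p * p for p in pixel))
    norm_r = math.sqrt(sum(r * r for r in reference))
    # clamp to [-1, 1] to guard against floating-point drift before acos
    return math.acos(max(-1.0, min(1.0, dot / (norm_p * norm_r))))

def classify_sam(pixel, library):
    """Assign the pixel to the library class with the smallest spectral angle."""
    return min(library, key=lambda name: spectral_angle(pixel, library[name]))

# usage: a pixel exactly parallel to the "oak" reference is classified as oak
label = classify_sam([2, 4, 6], {"oak": [1, 2, 3], "pine": [3, 2, 1]})
```

Because SAM compares only spectral direction, not magnitude, it is relatively insensitive to illumination differences, which is one reason it appears in hyperspectral workflows like the one described.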
-
104
Processed dataset to train and test the WGAN-GP_IMOA_DA_Ensemble model
Published 2025 “…This framework integrates a novel biologically inspired optimization algorithm, the Indian Millipede Optimization Algorithm (IMOA), for effective feature selection. …”
-
105
-
106
Data_Sheet_1_Alzheimer’s Disease Diagnosis and Biomarker Analysis Using Resting-State Functional MRI Functional Brain Network With Multi-Measures Features and Hippocampal Subfield...
Published 2022 “…Finally, we implemented and compared the different feature selection algorithms to integrate the structural features, brain networks, and voxel features to optimize the diagnostic identifications of AD using support vector machine (SVM) classifiers. …”
-
107
Supplementary Material 8
Published 2025 “…XGBoost: an optimized gradient boosting algorithm that efficiently handles large genomic datasets, commonly used for high-accuracy predictions in E. coli classification. …”
-
108
Table 1_Heavy metal biomarkers and their impact on hearing loss risk: a machine learning framework analysis.docx
Published 2025 “…Multiple machine learning algorithms, including Random Forest, XGBoost, Gradient Boosting, Logistic Regression, CatBoost, and MLP, were optimized and evaluated. …”
-
109
Machine Learning-Ready Dataset for Cytotoxicity Prediction of Metal Oxide Nanoparticles
Published 2025 “…Encoding: categorical variables such as surface coating and cell type were grouped into logical classes and label-encoded to enable model compatibility. Applications and model compatibility: the dataset is optimized for use in supervised learning workflows and has been tested with algorithms such as Gradient Boosting Machines (GBM), Support Vector Machines (SVM-RBF), Random Forests, and Principal Component Analysis (PCA) for feature reduction. …”
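The label-encoding step described above (mapping each categorical class to an integer) can be sketched in a few lines; this is the general technique only, with made-up coating values, since the dataset's actual class groupings are not reproduced in the snippet.

```python
def label_encode(values):
    """Label-encode a categorical column: map each distinct class to an integer.

    Classes are sorted first so the same data always yields the same mapping.
    Returns the encoded column and the class-to-integer mapping.
    """
    classes = sorted(set(values))
    mapping = {c: i for i, c in enumerate(classes)}
    return [mapping[v] for v in values], mapping

# usage: a hypothetical surface-coating column
encoded, mapping = label_encode(["PEG", "citrate", "none", "PEG"])
```

Note that label encoding imposes an arbitrary ordering on the classes, which tree-based models such as GBM and Random Forests tolerate well; for distance-based models it is often paired with one-hot encoding instead.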