Search alternatives:
model optimization » codon optimization, global optimization, based optimization
from optimization » fox optimization, swarm optimization, codon optimization
binary based » library based, linac based, binary mask
based from » based food, used from, based arm
binary b » binary _
b model » _ model, a model, 2 model
21.
22. Data_Sheet_1_A Global Optimizer for Nanoclusters.PDF
Published 2019 “…This method is implemented in the PyAR program (https://github.com/anooplab/pyar). The global optimization in PyAR involves two parts: generation of several trial geometries and gradient-based local optimization of the trial geometries. …”
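For context on the two-part scheme described in result 22 (generation of trial geometries followed by gradient-based local optimization), below is a minimal Python sketch of that generic pattern on a toy function. It is not PyAR's API; the energy function, search bounds, and trial counts are made up for illustration.

```python
# Minimal sketch (not PyAR's actual API): generate several random trial
# "geometries", refine each with a gradient-based local optimizer, and keep
# the lowest-energy local minimum found.
import numpy as np
from scipy.optimize import minimize

def energy(x):
    # Toy stand-in for a cluster potential-energy surface with many local minima.
    return np.sum(x**2 + 2.0 * np.sin(3.0 * x))

def global_optimize(n_trials=20, dim=3, seed=0):
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_trials):
        trial = rng.uniform(-5.0, 5.0, size=dim)         # part 1: trial geometry
        result = minimize(energy, trial, method="BFGS")  # part 2: local optimization
        if best is None or result.fun < best.fun:
            best = result
    return best

print(global_optimize().fun)
```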
23.
24. Classification baseline performance.
Published 2025 “…The contributions include developing a baseline Convolutional Neural Network (CNN) that achieves an initial accuracy of 86.29%, surpassing existing state-of-the-art deep learning models, and further integrating the binary variant of OcOA (bOcOA) for effective feature selection, which reduces the average classification error to 0.4237 and increases CNN accuracy to 93.48%. …”
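This record describes wrapper-style binary feature selection with bOcOA. As a rough illustration of the general idea only (candidate solutions are binary masks over features, each scored by the downstream classifier's cross-validated error), here is a sketch that substitutes plain random search for bOcOA and a logistic-regression scorer for the CNN; the dataset and every parameter are placeholders.

```python
# Sketch of wrapper-style binary feature selection (NOT the bOcOA algorithm):
# sample binary masks, score each by cross-validated error, keep the best mask.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(42)

def cv_error(mask):
    if not mask.any():
        return 1.0  # empty feature subset: worst possible error
    acc = cross_val_score(LogisticRegression(max_iter=5000), X[:, mask], y, cv=3).mean()
    return 1.0 - acc

best_mask, best_err = None, np.inf
for _ in range(30):                        # candidate binary masks (random search here)
    mask = rng.random(X.shape[1]) < 0.5    # 1 = keep feature, 0 = drop it
    err = cv_error(mask)
    if err < best_err:
        best_mask, best_err = mask, err

print(best_mask.sum(), "features selected, CV error", round(best_err, 4))
```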
25. Feature selection results.
Published 2025 “…The contributions include developing a baseline Convolutional Neural Network (CNN) that achieves an initial accuracy of 86.29%, surpassing existing state-of-the-art deep learning models, and further integrating the binary variant of OcOA (bOcOA) for effective feature selection, which reduces the average classification error to 0.4237 and increases CNN accuracy to 93.48%. …”
26. ANOVA test result.
Published 2025 “…The contributions include developing a baseline Convolutional Neural Network (CNN) that achieves an initial accuracy of 86.29%, surpassing existing state-of-the-art deep learning models, and further integrating the binary variant of OcOA (bOcOA) for effective feature selection, which reduces the average classification error to 0.4237 and increases CNN accuracy to 93.48%. …”
27. Summary of literature review.
Published 2025 “…The contributions include developing a baseline Convolutional Neural Network (CNN) that achieves an initial accuracy of 86.29%, surpassing existing state-of-the-art deep learning models, and further integrating the binary variant of OcOA (bOcOA) for effective feature selection, which reduces the average classification error to 0.4237 and increases CNN accuracy to 93.48%. …”
28.
29. Hierarchical clustering to infer a binary tree with K = 4 sampled populations.
Published 2023 “…After K − 2 = 2 steps, the resulting tree is binary and the algorithm stops. …”
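Result 29 infers a binary tree over K = 4 sampled populations by hierarchical clustering. Below is a generic sketch using standard agglomerative clustering from SciPy; the distance matrix is invented, and the procedure is not necessarily the paper's exact algorithm.

```python
# Minimal sketch, assuming pairwise distances between the K = 4 sampled
# populations are already available: agglomerative clustering merges the
# closest pair at each step, and the resulting dendrogram is a binary tree.
import numpy as np
from scipy.cluster.hierarchy import linkage, to_tree
from scipy.spatial.distance import squareform

# Hypothetical symmetric distance matrix between populations A, B, C, D.
D = np.array([
    [0.0, 0.2, 0.5, 0.6],
    [0.2, 0.0, 0.4, 0.5],
    [0.5, 0.4, 0.0, 0.3],
    [0.6, 0.5, 0.3, 0.0],
])

Z = linkage(squareform(D), method="average")  # merge steps recorded row by row
root = to_tree(Z)                             # binary tree: each internal node has two children
print(Z)
```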
30. Algoritmo de clasificación de expresiones de odio por tipos en español (Algorithm for classifying hate expressions by type in Spanish)
Published 2024 “…File Structure: the code generates and saves the weights of the trained model (.h5), the configured tokenizer, the training history in CSV, and a requirements file. Important Notes: the model excludes category 2 during training, implements transfer learning from a pre-trained model for binary hate detection, includes early stopping callbacks to prevent overfitting, and uses class weighting to handle category imbalances. The process of creating this algorithm is explained in the technical report located at: Blanco-Valencia, X., De Gregorio-Vicente, O., Ruiz Iniesta, A., & Said-Hung, E. (2025). …”
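Result 30 lists its training workflow: transfer learning from a binary hate-detection model, exclusion of category 2, early stopping, and class weighting. A minimal Keras sketch of that workflow follows; the data, architecture, and the stand-in "pre-trained" model are assumptions for illustration, not the released code or its .h5 weights.

```python
# Sketch of the described workflow under assumed data and layer sizes.
import numpy as np
import tensorflow as tf

# Hypothetical tokenized data: padded token-id sequences and hate-type labels 0-3.
X = np.random.randint(0, 5000, size=(1000, 64))
y = np.random.randint(0, 4, size=1000)

keep = y != 2                      # the model excludes category 2 during training
X, y = X[keep], y[keep]
classes = np.unique(y)             # remaining categories, e.g. {0, 1, 3}
y = np.searchsorted(classes, y)    # remap labels to contiguous indices 0..n-1

# Stand-in for the pre-trained binary hate-detection model (normally loaded from .h5).
inp = tf.keras.Input(shape=(64,))
h = tf.keras.layers.Embedding(5000, 32)(inp)
h = tf.keras.layers.GlobalAveragePooling1D()(h)
feats = tf.keras.layers.Dense(32, activation="relu")(h)
binary_out = tf.keras.layers.Dense(1, activation="sigmoid")(feats)
base = tf.keras.Model(inp, binary_out)   # binary hate / no-hate detector

# Transfer learning: reuse the feature extractor, attach a new multi-class "type" head.
type_out = tf.keras.layers.Dense(len(classes), activation="softmax")(feats)
model = tf.keras.Model(inp, type_out)

model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
class_weight = {i: len(y) / np.sum(y == i) for i in range(len(classes))}  # handle imbalance
model.fit(X, y, validation_split=0.2, epochs=5, class_weight=class_weight,
          callbacks=[tf.keras.callbacks.EarlyStopping(patience=2, restore_best_weights=True)])
```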
31.
32.
33.
34.
35. SHAP bar plot.
Published 2025 “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
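Results 35 to 40 come from a study comparing NNET, RF, LR, and SVM models by test-set AUC. Below is a minimal scikit-learn sketch of such a comparison on synthetic data, so the AUCs will not match the reported 0.918, 0.889, 0.872, and 0.760; MLPClassifier stands in for the NNET model.

```python
# Fit four binary classifiers on the same split and compare test-set ROC AUC.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "NNET": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "LR": LogisticRegression(max_iter=1000),
    "SVM": SVC(probability=True, random_state=0),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])  # test-set ROC AUC
    print(f"{name}: AUC = {auc:.3f}")
```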
36. Sample screening flowchart.
Published 2025 “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
37. Descriptive statistics for variables.
Published 2025 “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
38. SHAP summary plot.
Published 2025 “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
39. ROC curves for the test set of four models.
Published 2025 “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”
40. Display of the web prediction interface.
Published 2025 “…Results: Logistic regression analysis identified age, hemoglobin concentration, education level, and social participation as significant factors influencing CI. Models based on NNET, RF, LR, and SVM algorithms were developed, achieving AUC of 0.918, 0.889, 0.872, and 0.760, respectively, on the test set. …”