Search alternatives:
using optimization » joint optimization, design optimization, step optimization
model optimization » codon optimization, global optimization, based optimization
binary sample » final sample, binary people, intra sample
sample using » samples using
binary based » library based, linac based, binary mask
1. ROC curve for binary classification.
Published 2024: “…The model was trained and evaluated using a 10-fold cross-validation sampling approach with a learning rate of 0.001 and 200 training epochs at each instance. …”
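The snippet above describes a 10-fold cross-validation sampling approach. As a minimal sketch of how such folds can be constructed in plain Python (the function name `kfold_splits` and the shuffling seed are illustrative assumptions, not details from the cited paper):

```python
import random

def kfold_splits(n_samples, k=10, seed=0):
    """Yield (train_indices, test_indices) pairs for k-fold cross-validation."""
    indices = list(range(n_samples))
    random.Random(seed).shuffle(indices)       # shuffle once before splitting
    folds = [indices[i::k] for i in range(k)]  # k roughly equal folds
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

# Each of the 10 folds serves once as the held-out test set;
# the model is retrained on the remaining 90% each time.
splits = list(kfold_splits(100, k=10))
```

In the cited setup, each of the 10 training runs would additionally fix the learning rate (0.001) and epoch count (200).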
2. Confusion matrix for binary classification.
Published 2024: “…The model was trained and evaluated using a 10-fold cross-validation sampling approach with a learning rate of 0.001 and 200 training epochs at each instance. …”
4. hiPRS algorithm process flow.
Published 2023: “…From this dataset we can compute the MI between each interaction and the outcome and (D) obtain a ranked list (I_δ) based on this metric. (E) Starting from the interaction at the top of I_δ, hiPRS constructs I_K, selecting K (where K is user-specified) terms through the greedy optimization of the ratio between MI (relevance) and a suitable measure of similarity for interactions (redundancy) (cf. …”
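The hiPRS snippet describes greedily selecting K terms by maximizing the ratio of relevance (MI with the outcome) to redundancy (similarity to already-selected interactions). A minimal sketch of that greedy loop, assuming precomputed relevance scores and pairwise similarities (the names and toy values below are illustrative, not the paper's actual data or exact redundancy measure):

```python
def greedy_select(relevance, similarity, k, eps=1e-9):
    """Greedily pick k items maximizing relevance / redundancy.

    relevance:  dict item -> MI-like score with the outcome
    similarity: dict frozenset({a, b}) -> similarity in [0, 1]
    """
    ranked = sorted(relevance, key=relevance.get, reverse=True)
    selected = [ranked[0]]            # start from the top-ranked item
    candidates = set(ranked[1:])
    while len(selected) < k and candidates:
        def score(c):
            # redundancy = strongest similarity to anything already selected
            red = max(similarity.get(frozenset((c, s)), 0.0) for s in selected)
            return relevance[c] / (red + eps)
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected

# Toy example: "b" is nearly as relevant as "a" but highly redundant with it,
# so the less relevant but novel "c" is picked second.
relevance = {"a": 0.9, "b": 0.8, "c": 0.5}
similarity = {frozenset(("a", "b")): 0.9,
              frozenset(("a", "c")): 0.1,
              frozenset(("b", "c")): 0.1}
```

With these toy values, selecting k=2 yields ["a", "c"]: "c" scores 0.5/0.1 against "b"'s 0.8/0.9, so novelty wins over raw relevance.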
8. MSE for ILSTM algorithm in binary classification.
Published 2023: “…In this paper, a novel and improved version of the Long Short-Term Memory (ILSTM) algorithm was proposed. The ILSTM is based on the novel integration of the chaotic butterfly optimization algorithm (CBOA) and particle swarm optimization (PSO) to improve the accuracy of the LSTM algorithm. …”
9. Python-Based Algorithm for Estimating NRTL Model Parameters with UNIFAC Model Simulation Results
Published 2025
10. DE algorithm flow.
Published 2025: “…To solve the problems of insufficient global optimization ability and easy loss of population diversity in building interior layout design, this study proposes a novel layout optimization model integrating an interactive genetic algorithm and an improved differential evolution algorithm to improve the global optimization ability and maintain population diversity in building layout design. …”
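The snippet above builds on differential evolution (DE). As a generic sketch of the classic DE/rand/1/bin step only (not the paper's improved variant or its interactive-GA component; the function name, bounds, and sphere objective are illustrative assumptions):

```python
import random

def differential_evolution(obj, bounds, pop_size=20, F=0.5, CR=0.9,
                           generations=50, seed=0):
    """Minimize obj over box bounds with the DE/rand/1/bin scheme."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [obj(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # mutation: pick three distinct vectors, form a + F * (b - c)
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)        # ensure at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:   # binomial crossover
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    trial.append(min(max(v, lo), hi))  # clamp to bounds
                else:
                    trial.append(pop[i][j])
            s = obj(trial)
            if s <= scores[i]:                 # greedy selection keeps the better vector
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=scores.__getitem__)
    return pop[best], scores[best]
```

The cited model would replace this plain mutation/selection loop with its diversity-preserving improvements and couple it to user feedback via the interactive genetic algorithm.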
11. Test results of different algorithms.
Published 2025: “…To solve the problems of insufficient global optimization ability and easy loss of population diversity in building interior layout design, this study proposes a novel layout optimization model integrating an interactive genetic algorithm and an improved differential evolution algorithm to improve the global optimization ability and maintain population diversity in building layout design. …”
12. Summary of existing CNN models.
Published 2024: “…The model was trained and evaluated using a 10-fold cross-validation sampling approach with a learning rate of 0.001 and 200 training epochs at each instance. …”
14. Algorithm for generating hyperparameters.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using a binary grey wolf optimization algorithm. …”
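The snippet above mentions binary grey wolf optimization (BGWO) for feature selection. A minimal sketch of one common binarization scheme, where the continuous grey wolf position update is squashed through a sigmoid transfer function and sampled into a 0/1 feature mask (the wolf count, iteration count, transfer slope, and toy fitness are assumptions, not the cited paper's settings):

```python
import math
import random

def binary_gwo(fitness, dim, n_wolves=8, iters=30, seed=0):
    """Maximize fitness over binary masks with a sigmoid-transfer grey wolf update."""
    rng = random.Random(seed)
    wolves = [[rng.randint(0, 1) for _ in range(dim)] for _ in range(n_wolves)]
    best = max(wolves, key=fitness)
    best_f = fitness(best)
    for t in range(iters):
        ranked = sorted(wolves, key=fitness, reverse=True)
        alpha, beta, delta = ranked[0], ranked[1], ranked[2]  # three leaders
        if fitness(alpha) > best_f:
            best, best_f = alpha[:], fitness(alpha)
        a = 2 - 2 * t / iters                 # coefficient decreases linearly to 0
        new_wolves = []
        for w in wolves:
            pos = []
            for d in range(dim):
                x = 0.0
                for leader in (alpha, beta, delta):
                    A = 2 * a * rng.random() - a
                    C = 2 * rng.random()
                    D = abs(C * leader[d] - w[d])
                    x += leader[d] - A * D    # continuous GWO step toward leader
                x /= 3.0
                s = 1.0 / (1.0 + math.exp(-10.0 * (x - 0.5)))  # sigmoid transfer
                pos.append(1 if rng.random() < s else 0)       # sample binary gene
            new_wolves.append(pos)
        wolves = new_wolves
    final = max(wolves, key=fitness)
    if fitness(final) > best_f:
        best, best_f = final[:], fitness(final)
    return best, best_f
```

A toy fitness rewarding the first three features and penalizing the rest, e.g. `lambda m: sum(m[:3]) - 0.2 * sum(m[3:])`, illustrates the selection pressure; in the cited pipeline the fitness would instead be a classifier's cross-validated score on the masked feature set.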
15. Flowchart scheme of the ML-based model.
Published 2024: “…I) Testing data consisting of 20% of the entire dataset. J) Optimization of hyperparameter tuning. K) Algorithm selection from all models. …”
16. Results of the machine learning algorithms.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using a binary grey wolf optimization algorithm. …”
17. ROC comparison of the machine learning algorithms.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using a binary grey wolf optimization algorithm. …”
20. Best optimizer results of LightGBM.
Published 2024: “…To optimize the parameters of the machine learning algorithms, hyperparameter optimization with a genetic algorithm is proposed, and to reduce the size of the feature set, feature selection is performed using a binary grey wolf optimization algorithm. …”