4. An optimal solution for the HFS instance.
   Published 2025. “…In recent years, due to the advantages of nonlinear access and fully parallel processing, the probe machine has shown powerful computing capabilities and promising applications in solving various combinatorial optimization problems. This work first proposes an Improved Probe Machine with Multi-Level Probe Operations (IPMMPO) and designs general data libraries and probe libraries tailored for multi-scenario HFS problems, including HFS with identical parallel machines, HFS with unrelated parallel machines, the no-wait scenario, and the standard scenario. …”
5. Comparison based on hard instances from [79].
   Published 2025; same article abstract as result 4. An illustrative sketch of an HFS instance follows below.
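Both entries above concern the hybrid flow shop (HFS) scheduling problem: jobs pass through a fixed sequence of stages, and each stage offers one or more parallel machines. As context for the problem the IPMMPO targets, here is a minimal Python sketch of an HFS instance and a greedy list-scheduling makespan evaluation; the data layout, dispatch rule, and function names are illustrative assumptions, not the probe-machine encoding from the cited work.

from typing import List


def hfs_makespan(proc_times: List[List[float]],
                 machines_per_stage: List[int],
                 job_order: List[int]) -> float:
    """Makespan of a job permutation for an HFS instance (illustrative).

    proc_times[j][s]      -- processing time of job j at stage s
    machines_per_stage[s] -- number of identical parallel machines at stage s
    job_order             -- order in which jobs are dispatched at stage 0
    """
    n_stages = len(machines_per_stage)
    # completion[j] is the time at which job j leaves the previous stage
    completion = {j: 0.0 for j in job_order}

    order = list(job_order)
    for s in range(n_stages):
        # release times of the identical parallel machines at this stage
        machine_free = [0.0] * machines_per_stage[s]
        # dispatch jobs in the order they left the previous stage (FIFO rule)
        order.sort(key=lambda j: completion[j])
        for j in order:
            m = min(range(len(machine_free)), key=machine_free.__getitem__)
            start = max(machine_free[m], completion[j])
            machine_free[m] = start + proc_times[j][s]
            completion[j] = machine_free[m]
    return max(completion.values())


if __name__ == "__main__":
    # 3 jobs, 2 stages, with 2 and 1 identical parallel machines respectively
    proc = [[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]]
    print(hfs_makespan(proc, machines_per_stage=[2, 1], job_order=[0, 1, 2]))  # 8.0

Metaheuristics for HFS, including the probe-machine approach above, differ mainly in how they search the space of job orders and machine assignments; the bookkeeping here only defines what a feasible schedule and its makespan look like.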
6. Fine-Tuning a Genetic Algorithm for CAMD: A Screening-Guided Warm Start
   Published 2025. “…The proposed method builds on the COSMO-CAMD framework that utilizes a genetic algorithm for solving optimization-based molecular design problems and COSMO-RS for predicting physical properties of molecules. …” A generic warm-start sketch follows below.
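The entry above describes a screening-guided warm start for a genetic algorithm in computer-aided molecular design (CAMD). The sketch below shows only the generic warm-start idea: part of the initial GA population is seeded from the best candidates of a prior database screening, and the rest is filled randomly to keep diversity. The candidate encoding, scoring function, and helper names are placeholders, not the COSMO-CAMD or COSMO-RS interfaces.

import random
from typing import Callable, List, Sequence


def warm_start_population(screened: Sequence[str],
                          score: Callable[[str], float],
                          random_candidate: Callable[[], str],
                          pop_size: int,
                          seed_fraction: float = 0.5) -> List[str]:
    """Initial GA population: top screened candidates plus random fill."""
    n_seed = int(pop_size * seed_fraction)
    # seed with the best-scoring candidates from the prior screening
    seeds = sorted(screened, key=score, reverse=True)[:n_seed]
    # fill the remainder randomly to preserve diversity in the first generation
    fill = [random_candidate() for _ in range(pop_size - len(seeds))]
    return list(seeds) + fill


if __name__ == "__main__":
    database = ["CCO", "CCCO", "CCN", "c1ccccc1O"]          # toy SMILES strings
    toy_score = lambda smiles: len(smiles)                   # placeholder objective
    random_mol = lambda: random.choice(["CC", "CCC", "CO", "CN"])
    print(warm_start_population(database, toy_score, random_mol, pop_size=6))

Seeding the entire population from the screening would risk premature convergence, which is why a fraction is typically left random.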
8. hiPRS algorithm process flow.
   Published 2023. “…(B) Focusing on the positive class only, the algorithm exploits FIM (apriori algorithm) to build a list of candidate interactions of any desired order, retaining those that have an empirical frequency above a given threshold δ. …” An apriori-style sketch follows below.
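The hiPRS entry describes a frequency-filtering step: candidate feature interactions of increasing order are enumerated over the positive class only and kept when their empirical frequency exceeds the threshold δ. The following is a small apriori-style Python sketch of that idea for illustration, not the hiPRS implementation; the function name, data layout, and max_order parameter are assumptions.

from typing import Dict, FrozenSet, List, Set


def frequent_interactions(positive_samples: List[Set[str]],
                          delta: float,
                          max_order: int = 3) -> Dict[FrozenSet[str], float]:
    """Itemsets whose empirical frequency in the positive class is >= delta."""
    n = len(positive_samples)
    items = sorted({f for sample in positive_samples for f in sample})

    def frequency(itemset: FrozenSet[str]) -> float:
        return sum(itemset <= sample for sample in positive_samples) / n

    frequent: Dict[FrozenSet[str], float] = {}
    candidates = [frozenset([i]) for i in items]             # order-1 candidates
    for order in range(1, max_order + 1):
        kept = {c: frequency(c) for c in candidates if frequency(c) >= delta}
        frequent.update(kept)
        # apriori pruning: only extend itemsets that were themselves frequent
        candidates = [c for c in {a | frozenset([i]) for a in kept for i in items if i not in a}
                      if len(c) == order + 1]
    return frequent


if __name__ == "__main__":
    positives = [{"A", "B"}, {"A", "B", "C"}, {"A", "C"}, {"B", "C"}]
    for interaction, freq in frequent_interactions(positives, delta=0.5, max_order=2).items():
        print(sorted(interaction), freq)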
16. Classification baseline performance.
    Published 2025. “…The contributions include developing a baseline Convolutional Neural Network (CNN) that achieves an initial accuracy of 86.29%, surpassing existing state-of-the-art deep learning models. The work further integrates the binary variant of OcOA (bOcOA) for effective feature selection, which reduces the average classification error to 0.4237 and increases CNN accuracy to 93.48%. …”
17. Feature selection results.
    Published 2025; same article abstract as result 16.
18. ANOVA test result.
    Published 2025; same article abstract as result 16.
19. Summary of literature review.
    Published 2025; same article abstract as result 16. A generic binary feature-selection sketch follows below.
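Results 16 to 19 summarize a wrapper-style feature-selection setup: a binary metaheuristic (bOcOA) searches over 0/1 feature masks so as to reduce classification error. The sketch below illustrates that generic binary-wrapper loop under common assumptions: a fitness combining model error with a small penalty on the number of selected features, and a simple bit-flip search standing in for the bOcOA update rules, which the snippet does not describe. All names, weights, and the toy objective are illustrative.

import random
from typing import Callable, List


def wrapper_feature_selection(n_features: int,
                              classification_error: Callable[[List[int]], float],
                              iterations: int = 200,
                              size_weight: float = 0.01,
                              seed: int = 0) -> List[int]:
    """Search for a 0/1 feature mask minimizing error plus a size penalty."""
    rng = random.Random(seed)

    def fitness(mask: List[int]) -> float:
        # wrapper objective: model error plus a small penalty per selected feature
        return classification_error(mask) + size_weight * sum(mask) / n_features

    best = [rng.randint(0, 1) for _ in range(n_features)]
    best_fit = fitness(best)
    for _ in range(iterations):
        candidate = best[:]
        candidate[rng.randrange(n_features)] ^= 1   # flip one bit (stand-in for
        cand_fit = fitness(candidate)               # a metaheuristic position update)
        if cand_fit < best_fit:
            best, best_fit = candidate, cand_fit
    return best


if __name__ == "__main__":
    # toy objective: features 0 and 3 each reduce the error, the rest do nothing
    def toy_error(mask: List[int]) -> float:
        return 0.5 - 0.2 * mask[0] - 0.2 * mask[3]

    print(wrapper_feature_selection(n_features=6, classification_error=toy_error))

In the cited work the mask would be scored by the CNN's classification error; the toy objective here only keeps the sketch self-contained.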