Search alternatives:
process optimization » robust optimization, model optimization, policy optimization
driven optimization » design optimization
phase process » whole process
binary data » primary data, dietary data
2. Proposed architecture testing phase.
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
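The snippet above describes PSO-driven hyperparameter selection. As a rough illustration of that step (not the article's code: the objective, bounds, and swarm coefficients below are placeholders), a minimal PSO loop over a two-dimensional hyperparameter space:

```python
import numpy as np

# Minimal particle swarm optimization (PSO) over a 2-D hyperparameter
# space (learning rate, dropout). The fitness function stands in for a
# validation loss; bounds and coefficients are illustrative only.
def pso(fitness, bounds, n_particles=20, n_iters=50, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pos = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([fitness(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Toy objective standing in for the validation loss of a trained model.
bounds = np.array([[1e-4, 1e-1],   # learning rate
                   [0.0, 0.5]])    # dropout rate
best, loss = pso(lambda p: (p[0] - 0.01) ** 2 + (p[1] - 0.2) ** 2, bounds)
print("best hyperparameters:", best)
```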
3. Event-driven data flow processing.
Published 2025: “…Subsequently, we implement an optimal binary tree decision-making algorithm, grounded in dynamic programming, to achieve precise allocation of elastic resources within data streams, significantly bolstering resource utilization. …”
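Result 3's snippet mentions an optimal binary tree built with dynamic programming. Below is the textbook O(n³) optimal binary search tree DP; reading the access frequencies as how often each resource tier is requested is an illustrative interpretation of the abstract, not the paper's algorithm:

```python
# Optimal binary search tree via dynamic programming: given access
# frequencies for n keys, compute the minimal expected search cost.
def optimal_bst_cost(freq):
    n = len(freq)
    # cost[i][j]: minimal weighted search cost over keys i..j
    cost = [[0.0] * n for _ in range(n)]
    for i in range(n):
        cost[i][i] = freq[i]
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length - 1
            total = sum(freq[i:j + 1])  # every key in i..j adds one level
            cost[i][j] = total + min(
                (cost[i][r - 1] if r > i else 0.0) +
                (cost[r + 1][j] if r < j else 0.0)
                for r in range(i, j + 1)  # try each key as the root
            )
    return cost[0][n - 1]

# Frequencies could stand for how often each resource tier is requested.
print(optimal_bst_cost([0.34, 0.08, 0.50, 0.08]))
```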
6. Flow diagram of the proposed model.
Published 2025: “…Machine learning models are increasingly applied to assisted reproductive technologies (ART), yet most studies rely on conventional algorithms with limited optimization. This proof-of-concept study investigates whether a hybrid Logistic Regression–Artificial Bee Colony (LR–ABC) framework can enhance predictive performance in in vitro fertilization (IVF) outcomes while producing interpretable, hypothesis-driven associations with nutritional and pharmaceutical supplement use. …”
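Result 6 describes a hybrid Logistic Regression–Artificial Bee Colony (LR–ABC) framework. A compact sketch of how ABC can search logistic-regression weights against a plain log-loss objective; the colony size, abandonment limit, and perturbation scheme are generic ABC choices, not the study's settings:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def log_loss(w, X, y):
    p = np.clip(sigmoid(X @ w), 1e-9, 1 - 1e-9)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# Compact artificial bee colony (ABC): employed and onlooker bees
# perturb food sources (weight vectors); stagnant sources are
# abandoned and re-seeded by scouts.
def abc_fit(X, y, n_sources=15, n_iters=200, limit=20, rng=None):
    rng = rng or np.random.default_rng(0)
    d = X.shape[1]
    src = rng.normal(0, 1, (n_sources, d))
    fit = np.array([log_loss(s, X, y) for s in src])
    trials = np.zeros(n_sources, dtype=int)
    for _ in range(n_iters):
        for phase in ("employed", "onlooker"):
            if phase == "onlooker":
                # Onlookers favor better (lower-loss) sources.
                probs = 1.0 / (1.0 + fit)
                probs /= probs.sum()
                idx = rng.choice(n_sources, n_sources, p=probs)
            else:
                idx = np.arange(n_sources)
            for i in idx:
                k = rng.integers(n_sources)       # random partner source
                j = rng.integers(d)               # random dimension
                cand = src[i].copy()
                cand[j] += rng.uniform(-1, 1) * (src[i][j] - src[k][j])
                c_fit = log_loss(cand, X, y)
                if c_fit < fit[i]:
                    src[i], fit[i], trials[i] = cand, c_fit, 0
                else:
                    trials[i] += 1
        # Scout phase: reset exhausted sources.
        worn = trials >= limit
        src[worn] = rng.normal(0, 1, (worn.sum(), d))
        fit[worn] = [log_loss(s, X, y) for s in src[worn]]
        trials[worn] = 0
    return src[fit.argmin()]
```

The returned weight vector is used as an ordinary logistic model, i.e. predictions are `sigmoid(X @ w) > 0.5`.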
7. Comparison with existing SOTA techniques.
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
8. Proposed inverted residual parallel block.
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
9. Inverted residual bottleneck block.
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
10. Sample classes from the HMDB51 dataset.
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
11. Sample classes from UCF101 dataset [40].
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
12. Self-attention module for the features learning.
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
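For the self-attention caption in result 12, a minimal single-head scaled dot-product self-attention block; the dimensions and random weights are placeholders, and pooling the attended features for a downstream classifier is left out:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a sequence
    of feature vectors X with shape (seq_len, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])            # (seq, seq)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)     # row-wise softmax
    return weights @ V                                # attended features

rng = np.random.default_rng(0)
d_model, d_head, seq = 64, 32, 10
X = rng.normal(size=(seq, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_head)) * 0.1 for _ in range(3))
feats = self_attention(X, Wq, Wk, Wv)  # e.g., pooled, then classified
print(feats.shape)  # (10, 32)
```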
13. Residual behavior.
Published 2025: “…The proposed architecture is trained on the selected datasets, whereas the hyperparameters are chosen using the particle swarm optimization (PSO) algorithm. The trained model is employed in the testing phase for the feature extraction from the self-attention layer and passed to the shallow wide neural network classifier for the final classification. …”
15. Overall framework diagram.
Published 2025: “…Secondly, addressing the issue of weight and threshold initialization in BPNN, the Coati Optimization Algorithm (COA) was employed to optimize the network (COA-BPNN). …”
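Result 15 pairs a back-propagation neural network (BPNN) with the Coati Optimization Algorithm (COA) for weight and threshold initialization. The sketch below keeps the overall pattern (a population search seeds the initial weights, then plain backprop fine-tunes) but substitutes simple random sampling for the coati-specific update rules, which are not reproduced here:

```python
import numpy as np

def mse(net, X, y):
    W1, b1, W2, b2 = net
    h = np.tanh(X @ W1 + b1)
    return np.mean((h @ W2 + b2 - y) ** 2)

# Population search over initial weights, standing in for COA.
def seed_weights(X, y, d_in, d_hid, pop=30, rng=None):
    rng = rng or np.random.default_rng(0)
    def sample():
        return (rng.normal(0, 0.5, (d_in, d_hid)), rng.normal(0, 0.5, d_hid),
                rng.normal(0, 0.5, (d_hid, 1)), rng.normal(0, 0.5, 1))
    cands = [sample() for _ in range(pop)]
    return min(cands, key=lambda n: mse(n, X, y))

# Plain backprop fine-tunes from the seeded initialization.
# X: inputs (n, d_in); y: targets (n, 1).
def train(X, y, d_hid=8, lr=0.05, epochs=500):
    W1, b1, W2, b2 = seed_weights(X, y, X.shape[1], d_hid)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)
        out = h @ W2 + b2
        g_out = 2 * (out - y) / len(y)          # dL/d(out) for mean-squared error
        g_h = (g_out @ W2.T) * (1 - h ** 2)     # back through tanh
        W2 -= lr * h.T @ g_out
        b2 -= lr * g_out.sum(0)
        W1 -= lr * X.T @ g_h
        b1 -= lr * g_h.sum(0)
    return (W1, b1, W2, b2)
```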
16. Confusion matrix.
Published 2025: “…Subsequently, we implement an optimal binary tree decision-making algorithm, grounded in dynamic programming, to achieve precise allocation of elastic resources within data streams, significantly bolstering resource utilization. …”
17. Parameter settings.
Published 2025: “…Subsequently, we implement an optimal binary tree decision-making algorithm, grounded in dynamic programming, to achieve precise allocation of elastic resources within data streams, significantly bolstering resource utilization. …”
18. Dynamic resource allocation process.
Published 2025: “…Subsequently, we implement an optimal binary tree decision-making algorithm, grounded in dynamic programming, to achieve precise allocation of elastic resources within data streams, significantly bolstering resource utilization. …”
19. Image 1 from “A multimodal AI-driven framework for cardiovascular screening and risk assessment in diverse athletic populations: innovations in sports cardiology”.
Published 2025: “…RSEE projects heterogeneous input data into an exertion-conditioned latent space, aligning model predictions with observed physiological variance and mitigating false positives by explicitly modeling the overlap between athletic remodeling and subclinical pathology.…”
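One minimal reading of result 19's “exertion-conditioned latent space” is to append an exertion covariate to the inputs before encoding, so identical physiology maps to different latents at different workloads. Everything below (dimensions, the linear encoder, the random data) is a placeholder, not the RSEE model:

```python
import numpy as np

# Condition the latent representation on exertion by concatenating a
# normalized workload covariate with the input features before encoding.
rng = np.random.default_rng(0)
n, d_feat, d_latent = 100, 12, 4
features = rng.normal(size=(n, d_feat))      # e.g., ECG/echo-derived features
exertion = rng.uniform(0, 1, size=(n, 1))    # normalized workload covariate

W = rng.normal(scale=0.1, size=(d_feat + 1, d_latent))
latent = np.tanh(np.hstack([features, exertion]) @ W)
print(latent.shape)  # (100, 4)
```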