285. Table 1_SRC is a potential target of Arctigenin in treating triple-negative breast cancer: based on machine learning algorithms, molecular modeling and in Vitro test.xlsx
Published 2025: “…Results: Our study identified 183 AG-related targets, 5,193 differentially expressed genes, and 6,173 co-expression module genes associated with TNBC. Machine learning algorithms pinpointed 4 hub genes from 28 intersecting targets. …”
286. Table 2_SRC is a potential target of Arctigenin in treating triple-negative breast cancer: based on machine learning algorithms, molecular modeling and in Vitro test.xlsx
Published 2025: “…Results: Our study identified 183 AG-related targets, 5,193 differentially expressed genes, and 6,173 co-expression module genes associated with TNBC. Machine learning algorithms pinpointed 4 hub genes from 28 intersecting targets. …”
287. Table 3_SRC is a potential target of Arctigenin in treating triple-negative breast cancer: based on machine learning algorithms, molecular modeling and in Vitro test.docx
Published 2025: “…Results: Our study identified 183 AG-related targets, 5,193 differentially expressed genes, and 6,173 co-expression module genes associated with TNBC. Machine learning algorithms pinpointed 4 hub genes from 28 intersecting targets. …”
289. Python code for a rule-based NLP model for mapping circular economy indicators to SDGs
Published 2025: “…The package includes:
- The complete Python codebase implementing the classification algorithm
- A detailed manual outlining model features, requirements, and usage instructions
- Sample input CSV files and corresponding processed output files to demonstrate functionality
- Keyword dictionaries for all 17 SDGs, distinguishing strong and weak matches
These materials enable full reproducibility of the study, facilitate adaptation for related research, and offer transparency in the methodological framework. …”
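This record describes a rule-based keyword classifier with strong and weak matches per SDG. As a rough illustration only (the dictionary entries, weights, and function below are invented placeholders, not the published package's code), such a matcher might look like:

import re

# Placeholder dictionaries: the published package ships one per SDG (17 in total),
# each distinguishing strong and weak keyword matches.
SDG_KEYWORDS = {
    "SDG 7":  {"strong": ["renewable energy", "solar power"],    "weak": ["energy"]},
    "SDG 12": {"strong": ["circular economy", "recycling rate"], "weak": ["waste"]},
}

def classify(text, strong_weight=2, weak_weight=1):
    """Score each SDG by weighted counts of strong and weak keyword hits."""
    text = text.lower()
    scores = {}
    for sdg, kws in SDG_KEYWORDS.items():
        strong = sum(bool(re.search(r"\b" + re.escape(k) + r"\b", text)) for k in kws["strong"])
        weak = sum(bool(re.search(r"\b" + re.escape(k) + r"\b", text)) for k in kws["weak"])
        score = strong_weight * strong + weak_weight * weak
        if score:
            scores[sdg] = score
    return scores

print(classify("Indicators on recycling rate and renewable energy use"))
# {'SDG 7': 3, 'SDG 12': 2}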
290. Quantum Simulation of Molecular Dynamics Processes: A Benchmark Study Using a Classical Simulator and Present-Day Quantum Hardware
Published 2025: “…Although Qiskit provides a general method for initializing wave functions, in most cases it generates deep quantum circuits. …”
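For context on the Qiskit remark: Qiskit's general state-initialization routine (QuantumCircuit.initialize) synthesizes a circuit for an arbitrary amplitude vector, and decomposing it shows how deep the result becomes. A minimal sketch using a random 4-qubit state, not a wave function from the cited study:

import numpy as np
from qiskit import QuantumCircuit

# Hypothetical target state: a random, normalized 4-qubit amplitude vector.
rng = np.random.default_rng(0)
amps = rng.normal(size=16) + 1j * rng.normal(size=16)
amps /= np.linalg.norm(amps)

qc = QuantumCircuit(4)
qc.initialize(amps, range(4))  # Qiskit's general state-initialization routine

# Repeatedly decompose the initialize instruction into elementary gates
# to see how deep the synthesized circuit is.
flat = qc.decompose(reps=4)
print("gate counts:", dict(flat.count_ops()))
print("circuit depth:", flat.depth())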
291. Multimodal reference functions.
Published 2025: “…Utilizing the diabetes dataset from 130 U.S. hospitals, the LGWO-BP algorithm achieved a precision rate of 0.97, a sensitivity of 1.00, a correct classification rate of 0.99, a harmonic mean of precision and recall (F1-score) of 0.98, and an area under the ROC curve (AUC) of 1.00. …”
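The figures quoted here (precision, sensitivity, accuracy, F1, AUC) are standard binary-classification metrics. A small sketch of how they are usually computed with scikit-learn, on made-up labels rather than the hospital dataset:

import numpy as np
from sklearn.metrics import (accuracy_score, f1_score, precision_score,
                             recall_score, roc_auc_score)

# Made-up labels and scores for illustration; the values reported in the
# abstract (0.97, 1.00, 0.99, 0.98, 1.00) come from the LGWO-BP model itself.
y_true  = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_pred  = np.array([1, 0, 1, 0, 0, 0, 1, 0])
y_score = np.array([0.9, 0.2, 0.8, 0.4, 0.1, 0.3, 0.95, 0.05])

print("precision:", precision_score(y_true, y_pred))   # TP / (TP + FP)
print("sensitivity:", recall_score(y_true, y_pred))    # TP / (TP + FN)
print("accuracy:", accuracy_score(y_true, y_pred))
print("F1-score:", f1_score(y_true, y_pred))
print("AUC:", roc_auc_score(y_true, y_score))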
296. Rosenbrock function losses for .
Published 2025: “…The approach leverages gradient information from neural networks to guide SLSQP optimization while maintaining XGBoost’s prediction precision. …”
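This record, and the similar ones below, describe SLSQP optimization guided by neural-network gradients over benchmark losses. As a point of reference only (the neural-network surrogate gradients are not reproduced here), a minimal SciPy sketch of SLSQP on the Rosenbrock function with its analytic gradient:

import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

# Plain SLSQP on the Rosenbrock benchmark; the cited approach would supply the
# gradient from a neural-network surrogate instead of the analytic rosen_der.
x0 = np.array([-1.2, 1.0, 0.5, -0.8])
result = minimize(rosen, x0, jac=rosen_der, method="SLSQP",
                  options={"maxiter": 500, "ftol": 1e-12})

print("minimizer:", result.x)   # global optimum of Rosenbrock is at [1, 1, ..., 1]
print("loss:", result.fun)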
297. Rosenbrock function losses for .
Published 2025: “…The approach leverages gradient information from neural networks to guide SLSQP optimization while maintaining XGBoost’s prediction precision. …”
298. Levy function losses for .
Published 2025: “…The approach leverages gradient information from neural networks to guide SLSQP optimization while maintaining XGBoost’s prediction precision. …”
299. Rastrigin function losses for .
Published 2025: “…The approach leverages gradient information from neural networks to guide SLSQP optimization while maintaining XGBoost’s prediction precision. …”
300. Levy function losses for .
Published 2025: “…The approach leverages gradient information from neural networks to guide SLSQP optimization while maintaining XGBoost’s prediction precision. …”
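For reference, the Levy and Rastrigin losses named in the records above follow standard benchmark definitions; the sketch below uses the common textbook formulas, not code from the cited study:

import numpy as np

def rastrigin(x):
    """Rastrigin: 10*n + sum(x_i^2 - 10*cos(2*pi*x_i)); global minimum 0 at x = 0."""
    x = np.asarray(x, dtype=float)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def levy(x):
    """Levy function; global minimum 0 at x = (1, ..., 1)."""
    x = np.asarray(x, dtype=float)
    w = 1 + (x - 1) / 4
    term1 = np.sin(np.pi * w[0]) ** 2
    term2 = np.sum((w[:-1] - 1) ** 2 * (1 + 10 * np.sin(np.pi * w[:-1] + 1) ** 2))
    term3 = (w[-1] - 1) ** 2 * (1 + np.sin(2 * np.pi * w[-1]) ** 2)
    return term1 + term2 + term3

print(rastrigin(np.zeros(5)))  # 0.0
print(levy(np.ones(5)))        # ~0.0 (up to floating-point error)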