182. Python code for a rule-based NLP model for mapping circular economy indicators to SDGs
Published 2025. “…The package includes:
- The complete Python codebase implementing the classification algorithm
- A detailed manual outlining model features, requirements, and usage instructions
- Sample input CSV files and corresponding processed output files to demonstrate functionality
- Keyword dictionaries for all 17 SDGs, distinguishing strong and weak matches
These materials enable full reproducibility of the study, facilitate adaptation for related research, and offer transparency in the methodological framework.…”
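The published codebase is not reproduced here, but the general shape of such a rule-based mapper can be sketched in a few lines. The keyword entries and the strong/weak weights below are hypothetical illustrations, not the study's actual dictionaries:

```python
# Hypothetical sketch of a rule-based keyword classifier: each SDG has a
# dictionary of "strong" and "weak" keywords; strong hits outweigh weak ones.
SDG_KEYWORDS = {
    "SDG 12": {"strong": ["circular economy", "recycling rate"],
               "weak": ["waste", "reuse"]},
    "SDG 13": {"strong": ["greenhouse gas", "carbon footprint"],
               "weak": ["climate", "emission"]},
}

def map_indicator(text, strong_weight=2, weak_weight=1):
    """Score an indicator description against every SDG keyword dictionary."""
    text = text.lower()
    scores = {}
    for sdg, kw in SDG_KEYWORDS.items():
        score = (strong_weight * sum(k in text for k in kw["strong"])
                 + weak_weight * sum(k in text for k in kw["weak"]))
        if score:
            scores[sdg] = score
    # Return matching SDGs ranked by score, best match first.
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(map_indicator("Municipal recycling rate and waste reuse programmes"))
# → [('SDG 12', 4)]
```

A CSV-in/CSV-out wrapper around a scoring function like this would mirror the sample input and output files the package describes.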

183. Multimodal reference functions.
Published 2025. “…Utilizing the diabetes dataset from 130 U.S. hospitals, the LGWO-BP algorithm achieved a precision rate of 0.97, a sensitivity of 1.00, a correct classification rate of 0.99, a harmonic mean of precision and recall (F1-score) of 0.98, and an area under the ROC curve (AUC) of 1.00. …”
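As a quick sanity check, the reported F1-score is consistent with the reported precision and sensitivity (recall), since F1 is their harmonic mean:

```python
# Reported LGWO-BP metrics from the abstract above.
precision, recall = 0.97, 1.00

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 2))  # → 0.98, matching the reported F1-score
```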

185. Flow chart diagram of blind quantum algorithm.
Published 2024. “…Our study addresses five major components of the quantum method to overcome these challenges: lattice-based cryptography, fully homomorphic algorithms, quantum key distribution, quantum hash functions, and blind quantum algorithms. …”

191. Rosenbrock function losses for .
Published 2025. “…The approach leverages gradient information from neural networks to guide SLSQP optimization while maintaining XGBoost’s prediction precision. …”
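The cited neural-network/XGBoost pipeline is not reproduced here, but the core mechanism, supplying gradient information to an SLSQP optimizer, can be illustrated on the Rosenbrock benchmark with SciPy's built-in `rosen` and `rosen_der`:

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

# Minimize the Rosenbrock function with SLSQP. Passing the analytic
# gradient via `jac` plays the role of the externally supplied gradient
# information described in the abstract (there, from a neural network).
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen, x0, jac=rosen_der, method="SLSQP",
               options={"maxiter": 500, "ftol": 1e-10})
print(res.x)  # ≈ [1, 1, 1, 1, 1], the global minimum
```

Without `jac`, SciPy would fall back to finite-difference gradients, which is exactly what a learned gradient model is meant to avoid.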

192. Rosenbrock function losses for .
Published 2025. “…The approach leverages gradient information from neural networks to guide SLSQP optimization while maintaining XGBoost’s prediction precision. …”

193. Levy function losses for .
Published 2025. “…The approach leverages gradient information from neural networks to guide SLSQP optimization while maintaining XGBoost’s prediction precision. …”

194. Rastrigin function losses for .
Published 2025. “…The approach leverages gradient information from neural networks to guide SLSQP optimization while maintaining XGBoost’s prediction precision. …”

195. Levy function losses for .
Published 2025. “…The approach leverages gradient information from neural networks to guide SLSQP optimization while maintaining XGBoost’s prediction precision. …”

196. Rastrigin function losses for .
Published 2025. “…The approach leverages gradient information from neural networks to guide SLSQP optimization while maintaining XGBoost’s prediction precision. …”

197. Levy function losses for .
Published 2025. “…The approach leverages gradient information from neural networks to guide SLSQP optimization while maintaining XGBoost’s prediction precision. …”

198. Levy function losses for .
Published 2025. “…The approach leverages gradient information from neural networks to guide SLSQP optimization while maintaining XGBoost’s prediction precision. …”

199. Rastrigin function losses for .
Published 2025. “…The approach leverages gradient information from neural networks to guide SLSQP optimization while maintaining XGBoost’s prediction precision. …”

200. Rastrigin function losses for .
Published 2025. “…The approach leverages gradient information from neural networks to guide SLSQP optimization while maintaining XGBoost’s prediction precision. …”
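For reference, the three losses named in these figure titles are standard optimization benchmarks. A minimal sketch of their textbook definitions (each has a known global minimum of zero):

```python
import math

def rosenbrock(x):
    """Σ 100·(x[i+1] − x[i]²)² + (1 − x[i])²; minimum 0 at (1, …, 1)."""
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (1 - x[i]) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):
    """10·n + Σ (x[i]² − 10·cos(2π·x[i])); minimum 0 at the origin."""
    return 10 * len(x) + sum(xi ** 2 - 10 * math.cos(2 * math.pi * xi)
                             for xi in x)

def levy(x):
    """Levy function with w[i] = 1 + (x[i] − 1)/4; minimum 0 at (1, …, 1)."""
    w = [1 + (xi - 1) / 4 for xi in x]
    value = math.sin(math.pi * w[0]) ** 2
    value += sum((wi - 1) ** 2 * (1 + 10 * math.sin(math.pi * wi + 1) ** 2)
                 for wi in w[:-1])
    value += (w[-1] - 1) ** 2 * (1 + math.sin(2 * math.pi * w[-1]) ** 2)
    return value

# Each is zero (up to floating-point error) at its known global minimum.
print(rosenbrock([1.0, 1.0]), rastrigin([0.0, 0.0]), levy([1.0, 1.0]))
```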