Displaying 121 - 140 of 238 results for '(( python code implementing ) OR ( python ((tool implementing) OR (from implementing)) ))', query time: 0.54s
  1. 121

    The codes and data for "A Graph Convolutional Neural Network-based Method for Predicting Computational Intensity of Geocomputation" by FirstName LastName (20554465)

    Published in 2025
    "…The model results are saved in 1point2dem/SampleGeneration/result, and the results for Table 3 in the paper are derived from this output.…"
  2. 122

    The codes and data for "A Graph Convolutional Neural Network-based Method for Predicting Computational Intensity of Geocomputation" by FirstName LastName (20554465)

    Published in 2025
    "…The model results are saved in 1point2dem/SampleGeneration/result, and the results for Table 3 in the paper are derived from this output.…"
  3. 123

    Evaluation and Statistical Analysis Code for "Multi-Task Learning for Joint Fisheye Compression and Perception for Autonomous Driving" by Basem Ahmed (18127861)

    Published in 2025
    "…These scripts are implemented in Python using the PyTorch framework and are provided to ensure the reproducibility of the experimental results presented in the manuscript.…"
  4. 124

    Monte Carlo Simulation Code for Evaluating Cognitive Biases in Penalty Shootouts Using ABAB and ABBA Formats by Raul MATSUSHITA (10276562)

    Published in 2024
    "…This Python code implements a Monte Carlo simulation to evaluate the impact of cognitive biases on penalty shootouts under two formats: ABAB (alternating shots) and ABBA (similar to the tennis tiebreak format).…"
  5. 125

    <b>Code and derived data for</b><b>Training Sample Location Matters: Accuracy Impacts in LULC Classification</b> حسب Pajtim Zariqi (22155799)

    منشور في 2025
    "…</li><li>Python/Kaggle notebooks (<code>.ipynb</code>): reproducibility pipeline for accuracy metrics and statistical analysis.…"
  6. 126

    <b>Use case codes of the DDS3 and DDS4 datasets for bacillus segmentation and tuberculosis diagnosis, respectively</b> حسب Marly G F Costa (19812192)

    منشور في 2025
    "…<p dir="ltr"><b>Use case codes of the DDS3 and DDS4 datasets for bacillus segmentation and tuberculosis diagnosis, respectively</b></p><p dir="ltr">The code was developed in the Google Collaboratory environment, using Python version 3.7.13, with TensorFlow 2.8.2. …"
  7. 127

    Workflow of a typical Epydemix run. by Nicolò Gozzi (8837522)

    Published in 2025
    "…By lowering the barrier for the implementation of computational and inference approaches, Epydemix makes epidemic modeling more accessible to a wider range of users, from academic researchers to public health professionals.…"
  8. 128

    Data and some code used in the paper: "Expansion quantization network: A micro-emotion detection and annotation framework" by Zhou (20184816)

    Published in 2025
    "…GPU: NVIDIA GeForce RTX 3090; Bert-base-cased pre-trained model: https://huggingface.co/google-bert/bert-base-cased; python=3.7, pytorch=1.9.0, cudatoolkit=11.3.1, cudnn=8.9.7.29.…"
  9. 129
  10. 130

    Number of tweets collected over time. by Sylvia Iasulaitis (8301189)

    Published in 2025
    "…The process of collecting and creating the database for this study went through three major stages, subdivided into several processes: (1) A preliminary analysis of the platform and its operation; (2) Contextual analysis, creation of the conceptual model, and definition of Keywords and (3) Implementation of the Data Collection Strategy. Python algorithms were developed to model each primary collection type. …"
  11. 131

    Descriptive measures of the dataset. by Sylvia Iasulaitis (8301189)

    Published in 2025
    "…The process of collecting and creating the database for this study went through three major stages, subdivided into several processes: (1) A preliminary analysis of the platform and its operation; (2) Contextual analysis, creation of the conceptual model, and definition of Keywords and (3) Implementation of the Data Collection Strategy. Python algorithms were developed to model each primary collection type. …"
  12. 132

    Media information. by Sylvia Iasulaitis (8301189)

    Published in 2025
    "…The process of collecting and creating the database for this study went through three major stages, subdivided into several processes: (1) A preliminary analysis of the platform and its operation; (2) Contextual analysis, creation of the conceptual model, and definition of Keywords and (3) Implementation of the Data Collection Strategy. Python algorithms were developed to model each primary collection type. …"
  13. 133

    Table of the database statistical measures. by Sylvia Iasulaitis (8301189)

    Published in 2025
    "…The process of collecting and creating the database for this study went through three major stages, subdivided into several processes: (1) A preliminary analysis of the platform and its operation; (2) Contextual analysis, creation of the conceptual model, and definition of Keywords and (3) Implementation of the Data Collection Strategy. Python algorithms were developed to model each primary collection type. …"
  14. 134

    Tweets information. by Sylvia Iasulaitis (8301189)

    Published in 2025
    "…The process of collecting and creating the database for this study went through three major stages, subdivided into several processes: (1) A preliminary analysis of the platform and its operation; (2) Contextual analysis, creation of the conceptual model, and definition of Keywords and (3) Implementation of the Data Collection Strategy. Python algorithms were developed to model each primary collection type. …"
  15. 135

    Examples of tweets texts (Portuguese). by Sylvia Iasulaitis (8301189)

    Published in 2025
    "…The process of collecting and creating the database for this study went through three major stages, subdivided into several processes: (1) A preliminary analysis of the platform and its operation; (2) Contextual analysis, creation of the conceptual model, and definition of Keywords and (3) Implementation of the Data Collection Strategy. Python algorithms were developed to model each primary collection type. …"
  16. 136

    Methodological flowchart. by Sylvia Iasulaitis (8301189)

    Published in 2025
    "…The process of collecting and creating the database for this study went through three major stages, subdivided into several processes: (1) A preliminary analysis of the platform and its operation; (2) Contextual analysis, creation of the conceptual model, and definition of Keywords and (3) Implementation of the Data Collection Strategy. Python algorithms were developed to model each primary collection type. …"
  17. 137

    Number of tweets collected per query and type. by Sylvia Iasulaitis (8301189)

    Published in 2025
    "…The process of collecting and creating the database for this study went through three major stages, subdivided into several processes: (1) A preliminary analysis of the platform and its operation; (2) Contextual analysis, creation of the conceptual model, and definition of Keywords and (3) Implementation of the Data Collection Strategy. Python algorithms were developed to model each primary collection type. …"
  18. 138

    Examples of tweets texts (English). by Sylvia Iasulaitis (8301189)

    Published in 2025
    "…The process of collecting and creating the database for this study went through three major stages, subdivided into several processes: (1) A preliminary analysis of the platform and its operation; (2) Contextual analysis, creation of the conceptual model, and definition of Keywords and (3) Implementation of the Data Collection Strategy. Python algorithms were developed to model each primary collection type. …"
  19. 139

    Users information. by Sylvia Iasulaitis (8301189)

    Published in 2025
    "…The process of collecting and creating the database for this study went through three major stages, subdivided into several processes: (1) A preliminary analysis of the platform and its operation; (2) Contextual analysis, creation of the conceptual model, and definition of Keywords and (3) Implementation of the Data Collection Strategy. Python algorithms were developed to model each primary collection type. …"
  20. 140

    Data Sheet 1_COCαDA - a fast and scalable algorithm for interatomic contact detection in proteins using Cα distance matrices.pdf by Rafael Pereira Lemos (9104911)

    Published in 2025
    "…COCαDA demonstrated superior performance compared to the other methods, achieving on average 6x faster computation times than advanced data structures like k-d trees from NS, in addition to being simpler to implement and fully customizable. …"