Search alternatives:
python source » pathogen source (expand search), proton source (expand search), photon source (expand search)
source codes » source code (expand search), source tools (expand search)
-
161
Ambient Air Pollutant Dynamics (2010–2025) and the Exceptional Winter 2016–17 Pollution Episode: Implications for a Uranium/Arsenic Exposure Event
Published in 2025: "…Includes imputation statistics, a data dictionary, and the Python imputation code (Imputation_Air_Pollutants_NABEL.py). …"
-
162
<b>GFAP Degradome Foundation Atlas</b>
Published in 2025: "…To extract, you can use the bash terminal command: <br><b><i>tar -xvJf GFAP_Degradome_Foundation_Atlas_v3.tar.gz</i></b></p><p dir="ltr"><br></p><h3>Codes</h3><p dir="ltr">Dataset generation is reproducible using three open-source tools:<br><b>Python</b>, <b>BLAST</b>, and <b>SAS</b>.…"
-
163
<b>InterHub: A Naturalistic Trajectory Dataset with Dense Interaction for Autonomous Driving</b>
Published in 2025: "…</b></li></ul><p dir="ltr">The Python code used to process and analyze the dataset can be found at <a href="https://github.com/zxc-tju/InterHub" rel="noreferrer" target="_blank">https://github.com/zxc-tju/InterHub</a>. …"
-
164
Genomic Surveillance of Pemivibart (VYD2311) Escape-Associated Mutations in SARS-CoV-2: December 2025 BioSamples (n=2)
Published in 2025: "…Full source code and version details are available upon request.…"
-
165
SpaMask
Published in 2024: "…<br>The domain identification results of the Tutorial Train_151674.ipynb can be found in the folder named data.<br>Python source code and tutorials can also be accessed at https://github.com/LYxiaotai/SpaMask.…"
-
166
CpG Signature Profiling and Heatmap Visualization of SARS-CoV Genomes: Tracing the Genomic Divergence From SARS-CoV (2003) to SARS-CoV-2 (2019)
Published in 2025: "…</p><p dir="ltr">Heatmap Images:</p><p dir="ltr">Heatmaps for CpG counts and O/E ratios comparing Wuhan-Hu-1 with its closest and most distant relatives.</p><p dir="ltr">Python Script:</p><p dir="ltr">Full Python code used for data processing, distance calculation, and heatmap generation.…"
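The CpG O/E (observed/expected) ratio mentioned in this snippet has a standard textbook definition: the observed count of CpG dinucleotides scaled against the product of the C and G counts. A minimal sketch of that calculation, assuming plain uppercase DNA strings (this is not the dataset's actual script, which is only available in the archive):

```python
def cpg_oe_ratio(seq: str) -> float:
    """Observed/expected CpG ratio: (CpG count * length) / (C count * G count)."""
    seq = seq.upper()
    n_cpg = seq.count("CG")                    # observed CpG dinucleotides
    n_c, n_g = seq.count("C"), seq.count("G")  # mononucleotide counts
    if n_c == 0 or n_g == 0:
        return 0.0                             # expected count is zero; treat ratio as 0
    return (n_cpg * len(seq)) / (n_c * n_g)

print(cpg_oe_ratio("CGCG"))  # 2.0
```

A genome-scale script would apply this per sliding window or per whole genome before feeding the values into a heatmap.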
-
167
<b>The abundance of ground-level atmospheric ice-nucleating particles and aerosol properties </b><b>in the North Slope of Alaska</b>
Published in 2024: "…An individual method-oriented data abstract is available in metadata for each output sub-folder. All Python code used for processing input data and generating output data is paired with the readme files and archived in folders. …"
-
168
Ontology Element Knowledge Graph(OEKG) and a Case Study of Planting-OEKG
Published in 2025: "…</li><li>Some downloaded open-source ontology files</li><li>The project-related code includes three Python files: OWL file parsing, ontology element storage, and ontology basic information storage, which can support the basic construction process of OEKG.…"
-
169
<b>Myelin Basic Protein (MBP) Degradome Foundation Atlas</b>
Published in 2025: "…</li></ul><p dir="ltr">Dataset Contents</p><p dir="ltr">The compressed archive includes:</p><ul><li>MBP_WT.csv — full degradome of wild-type MBP</li><li>MBP_R159K.csv — full degradome of the R159K MBP variant</li><li>MBP_Degradome_All.csv — merged and unified dataset combining all included MBP variants</li><li>Python source code used to generate all peptide fragments and compute peptide features</li><li>README.txt — structured technical documentation</li><li>requirements.txt — software dependency list for reproducibility</li></ul><h3>Data Format</h3><p dir="ltr">All files are provided in CSV (comma-separated values) format and include the following annotated fields:</p><ul><li><code>id</code> — structured peptide identifier (e.g., MBP_WT_10_42)</li><li><code>peptide</code> — amino acid sequence</li><li><code>start</code>, <code>stop</code> — cleavage positions</li><li><code>mz</code> — mass-to-charge ratio</li><li><code>Da</code> — molecular weight</li><li><code>Boman</code> — Boman index</li><li><code>charge</code> — net charge</li><li><code>pI</code> — isoelectric point</li><li><code>hydrophobicity</code></li><li><code>instability_index</code></li><li><code>aliphatic_index</code></li></ul><p dir="ltr">These properties enable integration into R, Python, SAS, Matlab, and machine learning workflows.…"
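Given the column schema documented above, loading and querying one of the CSV files is a one-liner in pandas. A sketch using a two-row in-memory stand-in with the documented columns (the peptide sequences and numeric values here are illustrative, not taken from the atlas):

```python
import io

import pandas as pd

# Stand-in for e.g. MBP_WT.csv; columns follow the documented schema,
# values are invented for illustration only.
csv_text = """id,peptide,start,stop,mz,Da,Boman,charge,pI,hydrophobicity,instability_index,aliphatic_index
MBP_WT_10_42,ASQKRPSQR,10,42,528.3,1055.6,3.1,3.0,11.2,-1.4,45.0,21.7
MBP_WT_50_61,FFGGDRGAPKRG,50,61,626.8,1251.6,2.2,2.0,10.9,-0.6,12.3,16.3
"""
df = pd.read_csv(io.StringIO(csv_text))

# Example query: basic (positively charged) peptides with a high isoelectric point.
basic = df[(df["charge"] >= 2.0) & (df["pI"] > 10)]
print(basic["id"].tolist())
```

For the real file, replace the `io.StringIO` stand-in with the path to the extracted CSV.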
-
170
Auxiliary and validation data for SAGEA-fluid
Published in 2025: "…<p dir="ltr">SAGEA-fluid is an open-source Python-based solver designed to evaluate the self-attraction and loading effect (SAL), geocenter motion (GCM), and Earth orientation parameters (EOPs) including polar motion (PM) and length-of-day (LOD) variations induced by surface fluid redistribution. …"
-
171
Spatial information for Spain
Published in 2024: "…</b></p><p dir="ltr"><b><u>Source:</u></b><b> </b><a href="https://geodata.ucdavis.edu/gadm/gadm4.1/json/gadm41_ESP_4.json.zip" target="_blank">https://geodata.ucdavis.edu/gadm/gadm4.1/json/gadm41_ESP_4.json.zip</a></p><p dir="ltr">We need to filter by municipalities using the "CODIMUNI" column (the code associated with each municipality) in "divisions-administratives-v2r1-municipis-1000000-20240705.json".…"
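Filtering a GeoJSON FeatureCollection by the "CODIMUNI" property can be done with the standard library alone. A minimal sketch on a toy FeatureCollection (the `NOMMUNI` property and the codes shown are illustrative assumptions; only "CODIMUNI" is named in the description):

```python
import json

# Toy GeoJSON-like structure; in practice, load the linked
# "divisions-administratives-...json" file with json.load().
geojson = {
    "type": "FeatureCollection",
    "features": [
        {"type": "Feature", "properties": {"CODIMUNI": "080193", "NOMMUNI": "Barcelona"}},
        {"type": "Feature", "properties": {"CODIMUNI": "171521", "NOMMUNI": "Girona"}},
    ],
}

# Keep only the municipalities whose code is in the wanted set.
wanted = {"080193"}
selected = [f for f in geojson["features"] if f["properties"]["CODIMUNI"] in wanted]
print([f["properties"]["CODIMUNI"] for f in selected])
```

For spatial work downstream, the same filter can be expressed in geopandas as a boolean mask on the `CODIMUNI` column.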
-
172
Supplementary Material for review (<b>Revealing the co-occurrence patterns of public emotions from social media data</b>)
Published in 2025: "…</p><p dir="ltr">This document provides a detailed explanation of how to reproduce all experimental results, figures, and tables presented in the paper, as well as the key indicators in the abstract, using the shared datasets and source code. The aim is to ensure full reproducibility of the study.…"
-
173
MCCN Case Study 4 - Validating gridded data products
Published in 2025: "…</p><p dir="ltr">The dataset contains input files for the case study (source_data), RO-Crate metadata (ro-crate-metadata.json), results from the case study (results), and a Jupyter Notebook (MCCN-CASE 4.ipynb).</p><h4><b>Research Activity Identifier (RAiD)</b></h4><p dir="ltr">RAiD: https://doi.org/10.26292/8679d473</p><h4><b>Case Studies</b></h4><p dir="ltr">This repository contains code and sample data for the following case studies. …"
-
174
The artifacts and data for the paper "DD4AV: Detecting Atomicity Violations in Interrupt-Driven Programs with Guided Concolic Execution and Filtering" (OOPSLA 2025)
Published in 2025: "…</li><li><code><strong>wllvm</strong></code>: The third-party library project WLLVM provides tools for building whole-program LLVM bitcode files from unmodified C or C++ source packages.…"
-
175
Processed .h5ad Anndata file from Downsampled data (to use for "Visualization")
Published in 2024: "…To address this gap, we developed CAFE as an open-source Python-based web application with a graphical user interface. …"
-
176
Albumin Degradome Foundation Atlas
Published in 2025: "…</p><h2><b>Data Format and Access</b></h2><ul><li><b>Primary file:</b> <code>Albumin_Degradome_Foundation_Atlas_v1.tar.xz</code><br>(contains all peptide tables in standard CSV format)</li><li><b>File Type:</b> ASCII comma-separated values (CSV)</li><li><b>Compression:</b> <code>xz -9 -T0</code> for maximal CPU-parallelised compression</li><li><b>Compatibility:</b></li><li><ul><li>R, Python, MATLAB, SAS</li><li>Excel, LibreOffice</li><li>Any proteomics workflow (e.g., Skyline, MaxQuant preprocessing, MS/MS spectral libraries)</li></ul></li></ul><h2><b>FAIR Principles</b></h2><p dir="ltr">This dataset is fully aligned with FAIR data standards:</p><ul><li><b>Findable:</b> Rich metadata, stable DOI, search-optimised description</li><li><b>Accessible:</b> Open-access Figshare repository</li><li><b>Interoperable:</b> Standard numeric and CSV formats</li><li><b>Reusable:</b> Transparent, reproducible Python source code included</li></ul><h2><b>Applications</b></h2><p dir="ltr">The Albumin Degradome Foundation Atlas supports research across multiple biomedical domains:</p><ul><li>Biomarker development in liver disease, kidney dysfunction, inflammation, and systemic disorders</li><li>Mass-spectrometry method development</li><li>Computational proteomics and peptide modelling</li><li>Autoimmunity and neo-epitope analysis</li><li>Protein–peptide interaction studies</li><li>Proteolytic pathway mapping and degradomics</li></ul><h2><b>Versioning and Future Work</b></h2><p dir="ltr">This is <b>Version 1</b> of the Albumin Degradome Foundation Atlas. …"
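Because the atlas ships as a single <code>.tar.xz</code>, Python can read its CSV members without shelling out to <code>tar</code>: the standard-library <code>tarfile</code> module opens xz archives with mode <code>"r:xz"</code>. A self-contained round-trip sketch (the member name <code>demo.csv</code> and its contents are stand-ins; the real member names inside the atlas are not listed in this snippet):

```python
import io
import os
import tarfile
import tempfile

# Tiny stand-in CSV for demonstration.
csv_bytes = b"id,peptide\nALB_1_10,DAHKSEVAHR\n"

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "atlas_demo.tar.xz")

    # Write an xz-compressed tar (what `xz -9 -T0` over a tar stream produces).
    with tarfile.open(path, "w:xz") as tar:
        info = tarfile.TarInfo(name="demo.csv")
        info.size = len(csv_bytes)
        tar.addfile(info, io.BytesIO(csv_bytes))

    # Read a member back directly, without extracting to disk.
    with tarfile.open(path, "r:xz") as tar:
        member = tar.extractfile("demo.csv")
        text = member.read().decode()

print(text.splitlines()[0])  # id,peptide
```

For the real archive, replace <code>path</code> with the downloaded file and iterate over <code>tar.getnames()</code> to discover the CSV members.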
-
177
entity-poster.pdf
Published in 2025: "…Entity includes an open-source companion package, nt2py, for interactive simulation analysis, visualization, and post-processing in Python. …"
-
178
Downsampled data from FlowRepository: FR-FCM-Z3WR
Published in 2024: "…To address this gap, we developed CAFE as an open-source Python-based web application with a graphical user interface. …"
-
179
Pteredactyl: Patient Clinical Free-Text Redaction Software
Published in 2025: "…</li><li>We have submitted the code to <a href="https://www.ohdsi.org/" rel="noopener noreferrer" target="_blank">OHDSI</a> as an abstract and aim strongly to incorporate this into a wider open-source effort to solve intractable clinical informatics problems.…"
-
180
Trustworthy and Ethical AI for Intrusion Detection in Healthcare IoT (IoMT) Systems: An Agentic Decision Loop Framework
Published in 2025: "…</p><h2>Repository Structure</h2><pre>agentic-ethical-ids-healthcare/<br>│<br>├── src/ # Source code for model, rule engine, and agent<br>│ ├── train_agent.py<br>│ ├── ethical_engine.py<br>│ ├── detector_model.py<br>│ └── utils/<br>│<br>├── data/ # Links or sample data subsets<br>│ ├── CIC-IoMT-2024/ <br>│ └── CSE-CIC-IDS2018/<br>│<br>├── notebooks/ # Jupyter notebooks for training and analysis<br>│<br>├── models/ # Pretrained model checkpoints (.pth, .pkl)<br>│<br>├── results/ # Evaluation outputs and figures<br>│<br>├── requirements.txt # Python dependencies<br>├── LICENSE # MIT License for open research use<br>└── README.md # Project documentation<br></pre><h2>⚙️ Setup and Installation</h2><p dir="ltr">Clone the repository and set up your environment:</p><pre>git clone https://github.com/ibrahimadabara01/agentic-ethical-ids-healthcare.git<br>cd agentic-ethical-ids-healthcare<br>python -m venv venv<br>source venv/bin/activate # On Windows: venv\Scripts\activate<br>pip install -r requirements.txt<br></pre><h2>Datasets</h2><p dir="ltr">This project uses three datasets:</p><table><tr><th><p dir="ltr">Dataset</p></th><th><p dir="ltr">Purpose</p></th><th><p dir="ltr">Source</p></th></tr><tr><td><b>CIC-IoMT 2024</b></td><td><p dir="ltr">Primary IoMT intrusion detection dataset</p></td><td><a href="https://www.unb.ca/cic/datasets/index.html" rel="noopener" target="_new">Canadian Institute for Cybersecurity</a></td></tr><tr><td><b>CSE-CIC-IDS2018</b></td><td><p dir="ltr">Domain-shift evaluation</p></td><td><a href="https://www.unb.ca/cic/datasets/ids-2018.html" rel="noopener" target="_new">CIC Dataset Portal</a></td></tr><tr><td><b>MIMIC-IV (Demo)</b></td><td><p dir="ltr">Clinical context signals</p></td><td><a href="https://physionet.org/content/mimic-iv-demo/2.2/" rel="noopener" target="_new">PhysioNet</a></td></tr></table><blockquote><p dir="ltr">⚠️ Note: All datasets are publicly available. …"