DNABERT-H-2M

DNABERT-H-2M is a DNA language model based on DNABERT-S [1] and trained using Hierarchical Multi-label Contrastive Learning from "Use All The Labels" [2].

[1] https://github.com/MAGICS-LAB/DNABERT_S
[2] https://arxiv.org/abs/2204.13207
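The hierarchical multi-label contrastive objective described in [2] can be sketched as a sum of per-level supervised contrastive losses, where samples sharing a label at a given hierarchy level count as positives for that level and each level's loss is weighted. The following NumPy sketch is illustrative only; the function name, the flat level-weight scheme, and the temperature are assumptions, not the record's actual training code.

```python
import numpy as np

def hier_supcon_loss(embeddings, labels, level_weights, temperature=0.1):
    """Sketch of a hierarchical multi-label supervised contrastive loss.

    embeddings:    (N, D) L2-normalized sequence embeddings
    labels:        (N, L) one integer label per hierarchy level
                   (level 0 = coarsest, e.g. phylum; level L-1 = finest)
    level_weights: length-L weights for combining the per-level losses
                   (an assumption; [2] discusses several weighting schemes)
    """
    n = embeddings.shape[0]
    # Pairwise cosine similarities scaled by temperature.
    logits = embeddings @ embeddings.T / temperature
    # Exclude self-similarity from the softmax denominator.
    logits = logits - np.eye(n) * 1e9
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))

    total = 0.0
    for lvl, w in enumerate(level_weights):
        # Positives at this level: pairs sharing the level-lvl label.
        same = labels[:, lvl][:, None] == labels[:, lvl][None, :]
        pos = same & ~np.eye(n, dtype=bool)
        counts = pos.sum(axis=1)
        valid = counts > 0  # anchors with at least one positive
        # Mean negative log-probability over each anchor's positives.
        loss_i = -(log_prob * pos).sum(axis=1)[valid] / counts[valid]
        total += w * loss_i.mean()
    return total
```

A usage example: with a two-level hierarchy (e.g. genus above species), positives at the coarse level include all same-genus pairs, while the fine level restricts positives to same-species pairs, so the combined loss pulls same-species embeddings closest while keeping same-genus embeddings nearer than unrelated ones.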
Saved in:

| Main Author: | Anders Hjulmand (20361705) (author) |
|---|---|
| Published: | 2025 |
| Subjects: | |
| Tags: | No Tags |
Similar Items

- Genosophus: A Dynamical-Systems Diagnostic Engine for Neural Representation Analysis
  by: Alan Glanz (22109698)
  Published: (2025)
- The architecture of DNABERT2-enhancer includes two models: (A) DNABERT-2 model (B) CNN model.
  by: Tong Wang (87696)
  Published: (2025)
- Data Sheet 1_AI vs. teacher feedback on EFL argumentative writing: a quantitative study.docx
  by: Areen Alnemrat (21803618)
  Published: (2025)
- ArticleSet2
  by: Zsolt Tibor Dr. habil. Kosztyán (4952893)
  Published: (2025)
- Project Summary: Developmental testing of the professional learning badge assessment strategy and tools of 'Learning Languages with Senior Learners' - producing an assessment model for a suite of 6 badged CPD courses for social care staff
  by: Sylvia Warnecke (18624320)
  Published: (2025)