---
tags:
- DNA
license: mit
---

## MiniDNA mini model

This is a distilled version of [DNABERT](https://github.com/jerryji1993/DNABERT), obtained with the MiniLM distillation technique. It has a BERT architecture with 3 layers and a hidden size of 384, and was pre-trained on 6-mer DNA sequences. For more details on the pre-training scheme and methods, please check the original thesis report _[link to be added]_.
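
Since the model works on 6-mer sequences, a raw DNA sequence is typically converted into space-separated overlapping 6-mers before tokenization, as in DNABERT. A minimal sketch (the helper name is illustrative, not part of this repository):

```python
# Convert a raw DNA sequence into the overlapping 6-mer format used by
# DNABERT-style models (hypothetical helper, shown for illustration only).
def seq_to_kmers(seq: str, k: int = 6) -> str:
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

print(seq_to_kmers("ATCGGATTCA"))
# ATCGGA TCGGAT CGGATT GGATTC GATTCA
```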
## How to Use
The model can be fine-tuned on a downstream genomic task, e.g. promoter identification.

```python
from transformers import BertForSequenceClassification

# Load the pre-trained MiniDNA encoder with a sequence-classification head;
# the head is newly initialised and needs to be fine-tuned before use.
model = BertForSequenceClassification.from_pretrained('Peltarion/dnabert-minilm-mini')
```
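
As a rough sketch of a forward pass (this assumes the checkpoint ships a compatible 6-mer tokenizer; the input sequence is made up, and the untuned head's outputs are meaningless):

```python
import torch
from transformers import BertForSequenceClassification, BertTokenizer

# Assumption: the checkpoint includes a 6-mer vocabulary; if it does not,
# use the 6-mer tokenizer from the original DNABERT repository instead.
tokenizer = BertTokenizer.from_pretrained('Peltarion/dnabert-minilm-mini')
model = BertForSequenceClassification.from_pretrained('Peltarion/dnabert-minilm-mini')

# Space-separated overlapping 6-mers, as produced by the helper above
inputs = tokenizer("ATCGGA TCGGAT CGGATT GGATTC GATTCA", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.shape)  # torch.Size([1, 2]) with the default two-label head
```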

More details on how to fine-tune the model, the dataset, and additional source code are available at [github.com/joanaapa/Distillation-DNABERT-Promoter](https://github.com/joanaapa/Distillation-DNABERT-Promoter).