shanearora committed
Commit 98eae16 • Parent(s): a77e93a
Update README.md
README.md
CHANGED
@@ -14,6 +14,8 @@ language:
 
 <!-- Provide a quick summary of what the model is/does. -->
 
+**For transformers versions v4.40.0 or newer, please use [OLMo 7B Twin 2T HF](https://huggingface.co/allenai/OLMo-7B-Twin-2T-hf) instead.**
+
 OLMo is a series of **O**pen **L**anguage **Mo**dels designed to enable the science of language models.
 The OLMo models are trained on the [Dolma](https://huggingface.co/datasets/allenai/dolma) dataset.
 We release all code, checkpoints, logs (coming soon), and details involved in training these models.
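
The note added in this commit tells users on transformers v4.40.0 or newer to load the `-hf` repo instead. A minimal sketch of acting on that advice when choosing a repo id; the `needs_hf_variant` helper is hypothetical (not part of the README), and the actual model loading is left commented out because it downloads multi-GB weights:

```python
import importlib.metadata


def needs_hf_variant(version: str) -> bool:
    """Return True for transformers >= 4.40.0, per the README note.

    Compares only the major.minor components of the version string.
    """
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) >= (4, 40)


try:
    tf_version = importlib.metadata.version("transformers")
except importlib.metadata.PackageNotFoundError:
    tf_version = "0.0"  # transformers not installed; fall back gracefully

# Pick the repo id the README recommends for the installed version.
repo = (
    "allenai/OLMo-7B-Twin-2T-hf"
    if needs_hf_variant(tf_version)
    else "allenai/OLMo-7B-Twin-2T"
)

# Loading would then look like (not executed here; downloads weights):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained(repo)
# model = AutoModelForCausalLM.from_pretrained(repo)
```

The split exists because the original `allenai/OLMo-7B-Twin-2T` repo relies on remote modeling code that newer transformers releases no longer load the same way, while the `-hf` repo uses the natively supported OLMo architecture.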