---
license: apache-2.0
language:
  - en
pipeline_tag: text-generation
inference: false
datasets:
  - the_pile_books3
---

# mpt-7b-storywriter: sharded


This is a version of the mpt-7b-storywriter model with its weights sharded into 2 GB chunks for low-RAM loading (e.g., on Colab). The weights are stored in bfloat16, so in theory you can run the model on CPU, though inference may be extremely slow.

Please refer to the original model repo for details on usage, implementation, etc. This model was downloaded from the original repo under the Apache-2.0 license and is redistributed under the same license.
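As a minimal sketch, loading the sharded checkpoint with `transformers` might look like the following. The repo id below is an assumption based on this card's title; adjust it to the actual model page. `trust_remote_code=True` is needed because MPT models ship custom modeling code, and `low_cpu_mem_usage=True` loads the shards incrementally instead of materializing the full state dict, which is the point of sharding to 2 GB chunks.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id (hypothetical; check the actual model page).
model_id = "pszemraj/mpt-7b-storywriter-sharded"


def load_model(repo_id: str):
    """Load the sharded bfloat16 checkpoint with a low peak-RAM footprint."""
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        torch_dtype=torch.bfloat16,  # weights are stored in bfloat16
        trust_remote_code=True,      # MPT uses custom modeling code
        low_cpu_mem_usage=True,      # stream shards one at a time
    )
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_model(model_id)
    inputs = tokenizer("Once upon a time", return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

On a machine with enough RAM you can also pass `device_map="auto"` (with `accelerate` installed) to spread the model across available devices.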


More details and usage examples will be added later.