---
license: mit
datasets:
- chloeliu/reddit_nosleep_posts
language:
- en
tags:
- fun
- horror
- writing
widget:
- text: '[WP] We don''t go to ravenholm anymore [RESPONSE] '
  example_title: Ravenholm
- text: '[WP] The man in the corner of my window [RESPONSE] '
  example_title: The man in the corner
co2_eq_emissions:
  emissions: 70
  source: https://mlco2.github.io/impact/#compute
  training_type: fine-tuning
  geographical_location: Oregon, USA
  hardware_used: 1x T4, Google Colab
---
# GPT-NoSleep-1.5b
This is the largest release of GPT-NoSleep, a version of GPT2-XL fine-tuned on the 'reddit_nosleep_posts' dataset. Smaller releases include:
And the accompanying prompt generator can be found here:
## Training Procedure
This model was trained on the 'reddit_nosleep_posts' dataset on Google Colab, for 2 epochs with a learning rate of 1e-2. Special thanks to Skyler for helping to train a model this large!
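
For reference, a rough sketch of a comparable fine-tuning setup with the `transformers` Trainer API is shown below. Only the base model, dataset name, epoch count, and learning rate come from the description above; the text column name, sequence length, batch size, and other settings are illustrative assumptions, not the exact configuration used.

```python
# Rough fine-tuning sketch. Assumptions: the dataset exposes a "text" column,
# and the batch size / sequence length below are placeholders for a single T4.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2-xl")
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained("gpt2-xl")

dataset = load_dataset("chloeliu/reddit_nosleep_posts", split="train")
tokenized = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=dataset.column_names,
)

args = TrainingArguments(
    output_dir="gpt-nosleep-1.5b",
    num_train_epochs=2,            # from the description above
    learning_rate=1e-2,            # from the description above
    per_device_train_batch_size=1, # assumption for a single T4
    gradient_accumulation_steps=8, # assumption
    fp16=True,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```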
## Biases & Limitations
This model likely carries the same biases and limitations as the original GPT2 it is based on, plus strong biases from the fine-tuning dataset. It can generate output that is not suitable for all audiences, since its purpose is to generate horror stories.
## Intended Use
This model is meant for fun, nothing else.
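
If you want to try it outside the hosted widget, a minimal generation sketch with the `transformers` pipeline might look like the following; the repository id is a placeholder to be replaced with this model's actual path, and the prompt follows the `[WP] ... [RESPONSE]` format used in the widget examples above.

```python
# Minimal generation sketch. The model id is a placeholder, and the sampling
# settings are illustrative defaults, not recommended values.
from transformers import pipeline

generator = pipeline("text-generation", model="your-username/GPT-NoSleep-1.5b")

prompt = "[WP] We don't go to ravenholm anymore [RESPONSE] "
story = generator(prompt, max_new_tokens=200, do_sample=True, top_p=0.95)
print(story[0]["generated_text"])
```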