---
language:
- en
tags:
- pytorch
- causal-lm
license: mit

---

# Lit-6B - A Large Fine-tuned Model For Fictional Storytelling

Lit-6B is a GPT-J 6B model fine-tuned on 2GB of a diverse range of light novels, erotica, and annotated literature for the purpose of generating novel-like fictional text.

## Model Description

The model used for fine-tuning is [GPT-J](https://github.com/kingoflolz/mesh-transformer-jax), a 6 billion parameter auto-regressive language model trained on [The Pile](https://pile.eleuther.ai/).

## Training Data & Annotative Prompting

The data used for fine-tuning was gathered from various sources such as the [Gutenberg Project](https://www.gutenberg.org/). The annotated fiction dataset has tags prepended to each work to help steer generation toward a particular style. Here is an example prompt that shows how to use the annotations.

```
[ Title: The Dunwich Horror; Author: H. P. Lovecraft; Genre: Horror; Tags: 3rdperson, scary; Style: Dark ]
***
When a traveler in north central Massachusetts takes the wrong fork...
```

The annotations can be mixed and matched to steer generation toward a specific style.

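Because the annotation header is plain text, it can also be assembled programmatically. Below is a minimal sketch; the `build_prompt` helper is hypothetical (it is not part of this repository or the `transformers` API) and simply joins whichever fields you pass into the bracketed format shown above.

```
def build_prompt(opening, **fields):
    # Hypothetical helper: join annotation fields into the bracketed
    # header used by the annotated fiction dataset, e.g.
    # "[ Title: ...; Author: ...; Genre: ... ]".
    header = '; '.join(f'{key}: {value}' for key, value in fields.items())
    # The '***' line separates the annotations from the story text.
    return f'[ {header} ]\n***\n{opening}'

prompt = build_prompt(
    'When a traveler in north central Massachusetts takes the wrong fork...',
    Title='The Dunwich Horror',
    Author='H. P. Lovecraft',
    Genre='Horror',
    Tags='3rdperson, scary',
    Style='Dark',
)
print(prompt)
```
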
## Downstream Uses

This model can be used for entertainment purposes and as a creative writing assistant for fiction writers.

## Example Code

```
from transformers import AutoTokenizer, AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained('hakurei/lit-6B')
tokenizer = AutoTokenizer.from_pretrained('hakurei/lit-6B')

# An annotation header, the '***' separator, and the start of the story.
prompt = '''[ Title: The Dunwich Horror; Author: H. P. Lovecraft; Genre: Horror ]
***
When a traveler'''

input_ids = tokenizer.encode(prompt, return_tensors='pt')

# Sample up to 100 tokens beyond the prompt length.
output = model.generate(input_ids, do_sample=True, temperature=1.0, top_p=0.9, repetition_penalty=1.2, max_length=len(input_ids[0])+100, pad_token_id=tokenizer.eos_token_id)

generated_text = tokenizer.decode(output[0])
print(generated_text)
```

Example output from this code will look similar to:

```
[ Title: The Dunwich Horror; Author: H. P. Lovecraft; Genre: Horror ]
***
When a traveler comes to an unknown region, his thoughts turn inevitably towards the old gods and legends which cluster around its appearance. It is not that he believes in them or suspects their reality—but merely because they are present somewhere else in creation just as truly as himself, and so belong of necessity in any landscape whose features cannot be altogether strange to him. Moreover, man has been prone from ancient times to brood over those things most connected with the places where he dwells. Thus the Olympian deities who ruled Hyper
```

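At full precision the 6 billion parameter checkpoint needs roughly 24 GB of memory for the weights alone, so loading in half precision on a GPU is a common option. The sketch below is a minimal variant of the example above, assuming a CUDA device with sufficient VRAM is available; the `torch_dtype` argument is standard `transformers` API, but half-precision use of this particular checkpoint is an assumption, not something this card guarantees.

```
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the weights in float16 to roughly halve memory use, then move to GPU
# (assumption: a CUDA device with enough VRAM for the 6B model).
model = AutoModelForCausalLM.from_pretrained(
    'hakurei/lit-6B', torch_dtype=torch.float16
).to('cuda')
tokenizer = AutoTokenizer.from_pretrained('hakurei/lit-6B')

prompt = '[ Title: The Dunwich Horror; Author: H. P. Lovecraft; Genre: Horror ]\n***\nWhen a traveler'
input_ids = tokenizer.encode(prompt, return_tensors='pt').to('cuda')

# max_new_tokens=100 is equivalent to max_length=len(input_ids[0])+100 above.
output = model.generate(input_ids, do_sample=True, temperature=1.0,
                        top_p=0.9, repetition_penalty=1.2,
                        max_new_tokens=100,
                        pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(output[0]))
```
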
## Team members and Acknowledgements

This project would not have been possible without the computational resources graciously provided by the [TPU Research Cloud](https://sites.research.google/trc/).

- [Anthony Mercurio](https://github.com/harubaru)
- Imperishable_NEET