gpt2-medium-german-finetune-oscar / tokenizer_config.json
Commit 280774c ("Add secondary files") by vsc33723
{"pad_token": "<|endoftext|>", "special_tokens_map_file": null, "full_tokenizer_file": null}
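A minimal sketch of what this config expresses: GPT-2 tokenizers ship without a dedicated padding token, so this file reuses the end-of-text token `<|endoftext|>` as `pad_token`, while the tokenizer and special-tokens files are resolved elsewhere (`null` here). The snippet below just parses the JSON above with the standard library to illustrate its structure; it does not load the actual tokenizer.

```python
import json

# The tokenizer config from this repo, reproduced inline for illustration.
config_text = (
    '{"pad_token": "<|endoftext|>", '
    '"special_tokens_map_file": null, '
    '"full_tokenizer_file": null}'
)
config = json.loads(config_text)

# GPT-2 defines no separate padding token, so end-of-text is reused for padding.
print(config["pad_token"])            # <|endoftext|>
print(config["special_tokens_map_file"] is None)  # True
```

When this config is loaded alongside the model (for example via `transformers`' `AutoTokenizer.from_pretrained`, assuming network access to the Hub), the `pad_token` entry is what lets batched inputs be padded without adding a new token to the vocabulary.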