DailyChat-350M / tokenizer_config.json
{"unk_token": "<|endoftext|>", "bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", "add_prefix_space": false, "model_max_length": 2048, "special_tokens_map_file": null, "name_or_path": "gpt2", "tokenizer_class": "CodeGenTokenizer"}