Fix wrong model_max_length
#3
by andstor · opened
The model has a context window of 2048 tokens (`n_positions`), so the tokenizer's `model_max_length` should be set to the same length.
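For reference, a minimal sketch of how the mismatch can be checked and corrected with the `transformers` API (the model id below is a placeholder for this repo; `n_positions` is the context-length attribute used by GPT-style configs):

```python
from transformers import AutoConfig, AutoTokenizer

model_id = "org/model-name"  # placeholder: substitute the actual repo id

config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

print(config.n_positions)          # expected: 2048
print(tokenizer.model_max_length)  # should also be 2048

# If the values disagree, align the tokenizer with the model's context
# window and persist the corrected value to tokenizer_config.json.
if tokenizer.model_max_length != config.n_positions:
    tokenizer.model_max_length = config.n_positions
    tokenizer.save_pretrained("fixed-tokenizer")
```

Updating `model_max_length` in `tokenizer_config.json` directly would achieve the same result; the important point is that the tokenizer no longer truncates (or warns) at a length different from what the model actually supports.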