Output is being truncated
#6
opened by ahmedrizwan239
Can you try adding the `max_length` inference config to the README.md file, like this? It might help.
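A minimal sketch of what that could look like in the README.md YAML front matter, using the Hub's `inference: parameters:` metadata convention (the `max_length` value of 512 here is only an illustrative assumption — pick a limit that suits your model):

```yaml
---
# Model card front matter: widget/Inference API generation settings
inference:
  parameters:
    max_length: 512  # assumed value; raise or lower to control truncation
---
```

These parameters are passed to the hosted inference widget, so longer outputs should no longer be cut off at the default length.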
Thank you so much!
It works — your quick reply means a lot 🤗
ahmedrizwan239 changed discussion status to closed