2x7B AWQ Collection
Mixture of experts 2 x 7B. 20 items.
What is it? A 2x7B MoE model for roleplay.
You may occasionally get GPT-like responses; just skip them and reroll (gacha time). Overall, I think it is good enough for roleplaying.
You may want to see this: https://huggingface.co/Alsebay/My_LLMs_Leaderboard
This model is a Mixture of Experts (MoE) made with the following models:
If you use it, please let me know whether it is good or not. Thank you :)
Base model
Alsebay/RainyMotip-2x7B