70B LLMs that fit in 24GB VRAM with over 16k context (with exl2 Q4 cache)
- DeusImperator/Midnight-Miqu-70B-v1.5_exl2_2.4bpw_rpcal
- DeusImperator/Midnight-Miqu-70B-v1.5_exl2_2.4bpw_rpcal_mk2
- DeusImperator/Midnight-Miqu-70B-v1.5_exl2_2.4bpw
- DeusImperator/Dark-Miqu-70B_exl2_2.4bpw