---
language:
  - en
pipeline_tag: text-generation
tags:
  - quantized
  - 2-bit
  - 4-bit
  - 5-bit
  - 6-bit
  - 8-bit
  - GGUF
  - text2text-generation
  - mistral
  - roleplay
  - merge
base_model:
  - KatyTheCutie/LemonadeRP-4.5.3
  - LakoMoor/Silicon-Alice-7B
  - Endevor/InfinityRP-v1-7B
  - HuggingFaceH4/zephyr-7b-beta
model_name: GIGABATEMAN-7B
model_creator: DZgas
quantized_by: DZgas
---

This is a GGUF quantization of the GIGABATEMAN-7B model. Use it with koboldcpp (do not use GPT4ALL).
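
If you prefer scripting over the koboldcpp UI, GGUF files can also be loaded with llama-cpp-python. Below is a minimal sketch; the repo id and quant filename are assumptions — substitute the .gguf quant you actually downloaded, and adjust the prompt to your preferred format.

```python
# Minimal sketch: load a GGUF quant with llama-cpp-python instead of koboldcpp.
# The repo id and filename below are assumptions for illustration only.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="DZgas/GIGABATEMAN-7B-GGUF",    # assumed repo id
    filename="GIGABATEMAN-7B.Q4_K_M.gguf",  # assumed quant filename
)

llm = Llama(
    model_path=model_path,
    n_ctx=4096,       # context window size
    n_gpu_layers=-1,  # offload all layers to GPU if available
)

out = llm(
    "Write a short in-character greeting for a roleplay scene.",
    max_tokens=128,
    temperature=0.8,
)
print(out["choices"][0]["text"])
```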

The most UNcensored model that I know of.