---
license: llama3
datasets:
- NobodyExistsOnTheInternet/ToxicQAFinal
---

# Llama-3-Alpha-Centauri-v0.1-GGUF

<img src="alpha_centauri_banner.png" alt="" style="width:500px;height:400px;"/>

**Image generated with [https://huggingface.co/PixArt-alpha/PixArt-Sigma-XL-2-1024-MS](https://huggingface.co/PixArt-alpha/PixArt-Sigma-XL-2-1024-MS).**

---

## Disclaimer

**Note:** All models and LoRAs from the **Centaurus** series were created solely for research purposes. Use of this model and/or its related LoRA implies agreement with the following terms:

- The user is responsible for what they do with it, including how the model's output is interpreted and used;
- The user must not use the model or its outputs for any illegal purpose;
- The user is solely responsible for any misuse or negative consequences arising from use of this model and/or its related LoRA.

I do not endorse any particular perspectives presented in the training data.

---

## Centaurus Series

This series aims to develop highly uncensored Large Language Models (LLMs) with the following focuses:

- Science, Technology, Engineering, and Mathematics (STEM)
- Computer Science (including programming)
- Social Sciences

And several key cognitive skills, including but not limited to:

- Reasoning and logical deduction
- Critical thinking
- Analysis

While maintaining strong overall knowledge and expertise, the models will undergo refinement through:

- Fine-tuning processes
- Model merging techniques including Mixture of Experts (MoE)

Please note that these models are experimental and may vary in effectiveness. Feedback, critiques, and questions are welcome and help improve future versions.

## Base

This model and its related LoRA were fine-tuned on [https://huggingface.co/failspy/Meta-Llama-3-8B-Instruct-abliterated-v3](https://huggingface.co/failspy/Meta-Llama-3-8B-Instruct-abliterated-v3).

## LoRA

The LoRA merged with the base model is available at [https://huggingface.co/fearlessdots/Llama-3-Alpha-Centauri-v0.1-LoRA](https://huggingface.co/fearlessdots/Llama-3-Alpha-Centauri-v0.1-LoRA).

## Datasets

- [https://huggingface.co/datasets/NobodyExistsOnTheInternet/ToxicQAFinal](https://huggingface.co/datasets/NobodyExistsOnTheInternet/ToxicQAFinal)

## Fine Tuning

### Quantization Configuration

- load_in_4bit=True
- bnb_4bit_quant_type="fp4"
- bnb_4bit_compute_dtype=compute_dtype
- bnb_4bit_use_double_quant=False
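
For reference, these options map onto a `BitsAndBytesConfig` roughly as sketched below. The card does not state the value of `compute_dtype`, so `torch.float16` here is an assumption:

```python
import torch
from transformers import BitsAndBytesConfig

# Assumption: the card leaves `compute_dtype` unspecified; float16 is a common choice.
compute_dtype = torch.float16

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                   # quantize base weights to 4-bit on load
    bnb_4bit_quant_type="fp4",           # FP4 quantization (as opposed to NF4)
    bnb_4bit_compute_dtype=compute_dtype,
    bnb_4bit_use_double_quant=False,     # no nested quantization of the constants
)
```

This config would be passed as `quantization_config=bnb_config` to `AutoModelForCausalLM.from_pretrained` when loading the base model for QLoRA-style training.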

### PEFT Parameters

- lora_alpha=64
- lora_dropout=0.05
- r=128
- bias="none"

### Training Arguments

- num_train_epochs=1
- per_device_train_batch_size=1
- gradient_accumulation_steps=4
- optim="adamw_bnb_8bit"
- save_steps=25
- logging_steps=25
- learning_rate=2e-4
- weight_decay=0.001
- fp16=False
- bf16=False
- max_grad_norm=0.3
- max_steps=-1
- warmup_ratio=0.03
- group_by_length=True
- lr_scheduler_type="constant"

## Credits

- Meta ([https://huggingface.co/meta-llama](https://huggingface.co/meta-llama)): for the original Llama-3;
- HuggingFace: for hosting this model and for creating the fine-tuning tools used;
- failspy ([https://huggingface.co/failspy](https://huggingface.co/failspy)): for the base model and the orthogonalization implementation;
- NobodyExistsOnTheInternet ([https://huggingface.co/NobodyExistsOnTheInternet](https://huggingface.co/NobodyExistsOnTheInternet)): for the incredible dataset;
- Undi95 ([https://huggingface.co/Undi95](https://huggingface.co/Undi95)) and Sao10k ([https://huggingface.co/Sao10K](https://huggingface.co/Sao10K)): my main inspirations for doing these models =]

A huge thank you to all of them ☺️

## About Alpha Centauri

**Alpha Centauri** is a triple star system in the constellation **Centaurus**. It comprises three stars: Rigil Kentaurus (also known as **α Centauri A**), Toliman (**α Centauri B**), and Proxima Centauri (**α Centauri C**). Proxima Centauri is the nearest star to the Sun, approximately 4.25 light-years (1.3 parsecs) away.

The primary pair, **α Centauri A** and **B**, are both similar to our Sun: **α Centauri A** is a class G star with 1.1 solar masses and 1.5 times the Sun's luminosity, while **α Centauri B** has 0.9 solar masses and under half the Sun's luminosity. They orbit their shared center of mass every 79 years on an elliptical path, ranging from 35.6 astronomical units apart (nearly Pluto's distance from the Sun) to 11.2 astronomical units apart (around Saturn's distance from the Sun).

Proxima Centauri, or **α Centauri C**, is a small, dim red dwarf (a class M star) invisible to the naked eye. At roughly 4.24 light-years (1.3 parsecs) from us, it lies nearer than the **α Centauri AB** binary system. At present, the separation between **Proxima Centauri** and **α Centauri AB** is around 13,000 astronomical units (0.21 light-years), more than 430 times Neptune's orbital radius.

Two confirmed exoplanets accompany Proxima Centauri: **Proxima b**, discovered in 2016, is Earth-sized and within the habitable zone; **Proxima d**, announced in 2022, is a potential sub-Earth close to its host star. Meanwhile, the status of **Proxima c**, a mini-Neptune candidate detected in 2019, remains disputed. Intriguingly, hints suggest that **α Centauri A** might host a Neptune-sized object in its habitable zone, but further investigation is needed to confirm whether it exists and qualifies as a planet. **α Centauri B** was once thought to harbor a planet (named **α Cen Bb**), but subsequent research invalidated this claim, leaving it with no identified planets.

**Source:** retrieved from [https://en.wikipedia.org/wiki/Alpha_Centauri](https://en.wikipedia.org/wiki/Alpha_Centauri) and processed with [https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1).