Quantization made by Richard Erkhov.
Llama-3.1-8B-Pruned-4-Layers - GGUF
- Model creator: https://huggingface.co/Na0s/
- Original model: https://huggingface.co/Na0s/Llama-3.1-8B-Pruned-4-Layers/
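As a minimal sketch of using the GGUF quantizations in this repo with llama-cpp-python: the model file name below is a placeholder, so substitute whichever quantization file (for example a Q4_K_M variant) you actually download from this repository.

```python
# Hedged sketch: run one of the GGUF quantizations with llama-cpp-python.
# The model_path is a placeholder, not a file name guaranteed to exist in this repo.
from llama_cpp import Llama

llm = Llama(
    model_path="./Llama-3.1-8B-Pruned-4-Layers.Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,       # context window size
    n_gpu_layers=-1,  # offload all layers to GPU if one is available
)

output = llm("The capital of France is", max_tokens=32)
print(output["choices"][0]["text"])
```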
Original model description:
library_name: transformers
tags:
- mergekit
- merge
pipeline_tag: text-generation
Na0s/Llama-3.1-8b-Pruned-4-Layers
This is a layer-pruned version of meta-llama/Meta-Llama-3.1-8B created using mergekit, following the approach of the paper "The Unreasonable Ineffectiveness of the Deeper Layers".
Merge Details
Merge Method
This model was merged using the passthrough merge method, which copies the selected layer slices through unchanged rather than mixing weights; the layers between the two ranges in the configuration below are the ones that get pruned.
Models Merged
The following models were included in the merge:
- meta-llama/Meta-Llama-3.1-8B
Configuration
The following YAML configuration was used to produce this model:
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 23]
    model: meta-llama/Meta-Llama-3.1-8B
- sources:
  - layer_range: [28, 32]
    model: meta-llama/Meta-Llama-3.1-8B
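As a rough sketch (not the author's exact command), a merge like this can be reproduced by saving the configuration above to a file and invoking the mergekit-yaml CLI that ships with the mergekit package; both paths below are placeholders.

```python
# Rough sketch: reproduce the passthrough slice merge by calling the
# mergekit-yaml CLI from Python. Both paths are placeholders.
import subprocess

CONFIG_PATH = "pruned-llama-3.1-8b.yml"        # the YAML configuration above, saved to disk
OUTPUT_DIR = "./Llama-3.1-8B-Pruned-4-Layers"  # directory for the merged (pruned) weights

# Append "--cuda" to the argument list to run the merge on GPU if available.
subprocess.run(["mergekit-yaml", CONFIG_PATH, OUTPUT_DIR], check=True)
```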
Evaluation
MMLU Pro 0-shot: 0.2642
Evaluation Data
[TIGER-AI-Lab/MMLU-Pro]
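For a quick qualitative check of the pruned checkpoint (this is only a smoke test, not the MMLU-Pro harness that produced the score above), a minimal transformers loading sketch:

```python
# Minimal sketch: load the original (unquantized) pruned checkpoint and generate.
# This is a smoke test only, not the MMLU-Pro evaluation itself.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Na0s/Llama-3.1-8B-Pruned-4-Layers"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge configuration
    device_map="auto",
)

prompt = "Question: What is 2 + 2?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```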
Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).