Which models are actually merged?
#3 by kurnevsky - opened
The README says it's cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser and teknium/OpenHermes-2.5-Mistral-7B, but mergekit_moe_config.yml contains cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser and cognitivecomputations/dolphin-2.1-mistral-7b.
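
For reference, which experts a merge actually used can be checked by reading the uploaded mergekit_moe_config.yml directly from the Hub. A minimal sketch, assuming the placeholder repo id below is replaced with this model's actual id:

```python
# Fetch the merge config from the Hub and list the expert models it names.
# The repo id is a placeholder; substitute this model's actual repo id.
import yaml
from huggingface_hub import hf_hub_download

repo_id = "user/merged-moe-model"  # placeholder

config_path = hf_hub_download(repo_id=repo_id, filename="mergekit_moe_config.yml")
with open(config_path) as f:
    config = yaml.safe_load(f)

print("base model:", config.get("base_model"))
for expert in config.get("experts", []):
    print("expert:", expert.get("source_model"))
```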
I am running a new merge because there seems to be some pollution of the repository. Here is the new config:
```yaml
base_model: cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser
    positive_prompts:
  - source_model: mlabonne/NeuralBeagle14-7B
    positive_prompts:
```
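
The merge itself is typically run with mergekit's MoE script. Below is a minimal sketch, assuming mergekit is installed and exposes the `mergekit-moe` entry point taking a config path and an output path; `new_config.yml` and the output directory are placeholder names, and the positive_prompts entries (left empty above) would need to be filled in first, since gate_mode: hidden derives the gate vectors from them.

```python
# Invoke mergekit's MoE merge on the config above.
# Assumes mergekit is installed (pip install mergekit) and provides the
# mergekit-moe console script taking <config path> <output path>;
# new_config.yml and the output directory are placeholder names.
import subprocess

subprocess.run(
    ["mergekit-moe", "new_config.yml", "./merged-moe"],
    check=True,
)
```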
kurnevsky changed discussion status to closed