DynMoE model checkpoints and paper on Hugging Face:

- LINs-lab/DynMoE-StableLM-1.6B (Image-Text-to-Text)
- LINs-lab/DynMoE-Qwen-1.8B (Image-Text-to-Text)
- LINs-lab/DynMoE-Phi-2-2.7B (Image-Text-to-Text)
- Paper: Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models (arXiv:2405.14297)