Efficiently Democratizing Medical LLMs for 50 Languages via a Mixture of Language Family Experts • Paper • arXiv:2410.10626 • Published Oct 14, 2024
LongLLaVA: Scaling Multi-modal LLMs to 1000 Images Efficiently via Hybrid Architecture • Paper • arXiv:2409.02889 • Published Sep 4, 2024
MiniMA Family • Collection • The model family derived from MiniMA • 10 items • Updated Sep 2, 2025
Towards the Law of Capacity Gap in Distilling Language Models • Paper • arXiv:2311.07052 • Published Nov 13, 2023