Models · Hugging Face · May 8, 2026

EMO: Pretraining mixture of experts for emergent modularity

Hugging Face has introduced EMO, a mixture-of-experts model pretrained for emergent modularity. The approach aims to improve efficiency and adaptability by letting experts specialize during pretraining, so the model can handle more specialized tasks without extensive retraining.
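The announcement does not include implementation details, but the general idea behind sparse mixture-of-experts layers can be sketched as follows. This is a toy illustration of top-k gated routing in general, not EMO's actual architecture; all class and parameter names here are hypothetical.

```python
import math
import random

random.seed(0)


def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]


class ToyMoELayer:
    """Toy mixture-of-experts layer: a gating network scores each expert
    for a given input, the top-k experts are run, and their outputs are
    mixed by the (renormalized) gate weights."""

    def __init__(self, n_experts, dim, k=2):
        self.k = k
        # Each expert is a random linear map (dim -> dim), stored row-wise.
        self.experts = [
            [[random.gauss(0.0, 0.1) for _ in range(dim)] for _ in range(dim)]
            for _ in range(n_experts)
        ]
        # Gate: one scoring vector per expert.
        self.gate = [
            [random.gauss(0.0, 0.1) for _ in range(dim)] for _ in range(n_experts)
        ]

    def forward(self, x):
        # Score every expert, keep only the k best, mix their outputs.
        scores = [sum(w * v for w, v in zip(g, x)) for g in self.gate]
        topk = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[: self.k]
        weights = softmax([scores[i] for i in topk])
        out = [0.0] * len(x)
        for w, i in zip(weights, topk):
            expert_out = [sum(a * v for a, v in zip(row, x)) for row in self.experts[i]]
            out = [o + w * e for o, e in zip(out, expert_out)]
        return out, topk


layer = ToyMoELayer(n_experts=8, dim=4, k=2)
y, chosen = layer.forward([1.0, 0.5, -0.3, 0.2])
```

Only `k` of the 8 experts run per input, which is the source of the efficiency claim: compute scales with `k`, not with the total number of experts.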
