
OpenSparseLLMs

From Shanghai AI Lab

Popular repositories

  1. Linear-MoE

    Python · 96 stars · 7 forks

  2. LLaMA-MoE-v2

    🚀 LLaMA-MoE v2: Exploring Sparsity of LLaMA from Perspective of Mixture-of-Experts with Post-Training

    Python · 86 stars · 13 forks

  3. MoM

    Python · 83 stars · 1 fork

  4. Skip-DiT

    ✈️ Towards Stabilized and Efficient Diffusion Transformers through Long-Skip-Connections with Spectral Constraints

    Python · 67 stars · 1 fork

  5. Linearization

    Python · 47 stars · 4 forks

  6. CLIP-MoE

    CLIP-MoE: Mixture of Experts for CLIP

    Python · 38 stars


