🧀 BigCheese.ai

DeepSeek open-sources DeepEP – a library for MoE training and inference

DeepEP is a communication library tailored for Mixture-of-Experts (MoE) and expert parallelism (EP), providing high-throughput and low-latency GPU kernels for efficient AI parallel processing tasks like MoE dispatch and combine.
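DeepEP's actual kernels are GPU-side communication primitives, but the dispatch/combine pattern they accelerate can be sketched in plain NumPy. The sketch below is illustrative only (all names and shapes are assumptions, not DeepEP's API): a router picks top-k experts per token, *dispatch* groups token copies by expert, and *combine* sums each expert's outputs back into token order with the router weights.

```python
import numpy as np

rng = np.random.default_rng(0)
num_tokens, hidden, num_experts, top_k = 8, 4, 3, 2

tokens = rng.normal(size=(num_tokens, hidden))

# Router: pick top-k experts per token and softmax-normalize their scores.
scores = rng.normal(size=(num_tokens, num_experts))
topk_idx = np.argsort(scores, axis=1)[:, -top_k:]        # (num_tokens, top_k)
topk_w = np.take_along_axis(scores, topk_idx, axis=1)
topk_w = np.exp(topk_w) / np.exp(topk_w).sum(axis=1, keepdims=True)

# Dispatch: route each token copy to its selected experts.
expert_inputs = [[] for _ in range(num_experts)]
origins = [[] for _ in range(num_experts)]  # (token_id, slot) for combine
for t in range(num_tokens):
    for s in range(top_k):
        e = topk_idx[t, s]
        expert_inputs[e].append(tokens[t])
        origins[e].append((t, s))

# Expert compute: identity here as a stand-in for each expert's MLP.
expert_outputs = [np.array(x) if x else np.empty((0, hidden))
                  for x in expert_inputs]

# Combine: weighted sum of expert outputs back into original token order.
out = np.zeros_like(tokens)
for e in range(num_experts):
    for row, (t, s) in enumerate(origins[e]):
        out[t] += topk_w[t, s] * expert_outputs[e][row]

# With identity experts and weights summing to 1, combine restores the input.
assert np.allclose(out, tokens)
```

In a real expert-parallel setup the dispatch and combine steps are all-to-all exchanges across GPUs, which is the communication DeepEP is built to make fast.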

  • Efficient expert-parallel library
  • Provides high-throughput GPU kernels
  • Tailored for MoE and EP
  • Offers low-latency operations
  • Compatible with Hopper GPUs