DeepSeek-V4-Pro
DeepSeek · released 2026-04-22 · open source
DeepSeek's flagship open-weight MoE. 1.6T parameters with 49B activated, 1M-token context, and a hybrid attention scheme (CSA + HCA) that delivers long-context inference at ~27% of V3.2's FLOPs.
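The activated-parameter ratio implied by these figures can be checked with quick arithmetic; this sketch only restates the card's numbers (the ~27% FLOPs figure is the card's claim about the attention scheme, not something derivable from parameter counts alone):

```python
# Back-of-envelope check of the headline MoE numbers from the card.
total_params = 1.6e12   # 1.6T total parameters
active_params = 49e9    # 49B activated per token

# Fraction of weights exercised on each forward pass in the MoE
active_fraction = active_params / total_params
print(f"active fraction: {active_fraction:.2%}")  # ~3.06% of weights per token
```

So despite the 1.6T headline size, each token touches only about 3% of the weights, which is what keeps per-token inference cost closer to that of a ~49B dense model.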
Specifications
- Context window: 1,000,000 tokens
- Parameters: 1.6T (49B active)
- Modality: text
- License: MIT
- Family: DeepSeek
- Release date: 2026-04-22
Benchmarks
Benchmark scores will be ingested in Phase 2 from the Open LLM Leaderboard, LMSys Chatbot Arena, and vendor-published evaluations.
Related news
The aggregator will surface release announcements and analysis tied to DeepSeek-V4-Pro as they are published.