Open-source, high-performance AI model with advanced reasoning
Powerful AI language model (MoE) optimized for efficiency and performance
Strong, Economical, and Efficient Mixture-of-Experts Language Model
Contexts Optical Compression
Mixture-of-Experts Vision-Language Models for Advanced Multimodal Understanding
DeepSeek Coder: Let the Code Write Itself
Pushing the Limits of Mathematical Reasoning in Open Language Models
Towards Real-World Vision-Language Understanding
DeepSeek LLM: Let there be answers
An experimental version of the DeepSeek model
DeepSeek-Coder-V2: Breaking the Barrier of Closed-Source Models
Visual Causal Flow
Advancing Formal Mathematical Reasoning via Reinforcement Learning
Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models
From Vibe Coding to Agentic Engineering
Analyze computation-communication overlap in V3/R1
A bidirectional pipeline parallelism algorithm
A high-performance distributed file system
Agentic, Reasoning, and Coding (ARC) foundation models
Run local LLMs such as Llama, DeepSeek, and Kokoro inside your browser
Production-tested AI infrastructure tools
Towards self-verifiable mathematical reasoning
Unified Multimodal Understanding and Generation Models
A lightweight data processing framework built on DuckDB and 3FS
High-efficiency reasoning and agentic intelligence model