
DeepSeek V3.2

Open-weight MoE model with a 1M+ token context window and strong coding performance.

Released 2026-02-12 · 671B MoE · 1M tokens context · knowledge cutoff 2025-09

Overview

A major update that expands the context window roughly 10x, to over 1 million tokens, alongside sizable gains on coding and math benchmarks.

Specifications

Developer: DeepSeek
Release date: 2026-02-12
Model type: LLM
Parameters: 671B (MoE)
Architecture: Sparse MoE (37B active / 671B total)
Context window: 1M tokens
Max output: 8K tokens
Knowledge cutoff: 2025-09
License: DeepSeek License (open weights, commercial use permitted)
Input modalities: text
Output modalities: text
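As a back-of-the-envelope illustration of the sparse MoE figures above (37B active of 671B total), only a small fraction of the parameters is used for any given token:

```python
total_params_b = 671   # total parameters, in billions (from the spec table)
active_params_b = 37   # parameters activated per token, in billions

active_fraction = active_params_b / total_params_b
print(f"Active per token: {active_fraction:.1%}")  # roughly 5.5%
```

This sparsity is why per-token inference cost tracks the 37B active path rather than the full 671B parameter count.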

Benchmarks

Benchmark        DeepSeek V3.2   DeepSeek-V3   Δ
MMLU             90.1%           88.5%         +1.6
HumanEval        92.5%           82.6%         +9.9
MATH             85.6%           61.6%         +24.0
GPQA             68.4%           n/a           n/a
LiveCodeBench    72.1%           n/a           n/a
Context window   1M+ tokens      n/a           n/a

Pricing

Input: $0.27 per 1M tokens
Output: $1.10 per 1M tokens
Cached input: $0.07 per 1M tokens
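Using the rates above, a minimal sketch of estimating per-request cost; the split between cached and fresh input tokens in the example is an illustrative assumption:

```python
# Prices from the table above, in USD per 1M tokens.
PRICE_INPUT = 0.27
PRICE_OUTPUT = 1.10
PRICE_CACHED_INPUT = 0.07

def request_cost(input_tokens: int, output_tokens: int, cached_tokens: int = 0) -> float:
    """Estimate the USD cost of one request; cached_tokens is the cached share of input."""
    fresh = input_tokens - cached_tokens
    return (fresh * PRICE_INPUT
            + cached_tokens * PRICE_CACHED_INPUT
            + output_tokens * PRICE_OUTPUT) / 1_000_000

# Example: a 100K-token prompt with 20K served from cache, and a 2K-token reply.
print(f"${request_cost(100_000, 2_000, cached_tokens=20_000):.4f}")  # $0.0252
```

At long-context scale the cached-input discount dominates: a fully cached 1M-token prompt would cost $0.07 instead of $0.27.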

Availability

Open weights · DeepSeek API · Hugging Face · Together AI · Fireworks AI
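The DeepSeek API follows the OpenAI-compatible chat completions format; below is a minimal sketch of a request payload. The model identifier `deepseek-chat` is an assumption and may differ for V3.2 — check the provider's model list.

```python
import json

payload = {
    "model": "deepseek-chat",  # assumed identifier, not confirmed for V3.2
    "messages": [
        {"role": "user", "content": "Summarize this repository."},
    ],
    "max_tokens": 8000,  # stays within the 8K output cap listed above
}

# Serialized body you would POST to the chat completions endpoint.
body = json.dumps(payload)
```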

What's new vs DeepSeek-V3

The headline change is the context window, expanded roughly 10x to over 1 million tokens. Benchmark gains are largest on MATH (+24.0 points) and HumanEval (+9.9), with a smaller improvement on MMLU (+1.6).
