Models·OpenAI·Aug 2025

36. Introducing gpt-oss

OpenAI goes open-weight for the first time since GPT-2

Product Announcement
Summary

OpenAI released gpt-oss-120b and gpt-oss-20b, its first open-weight models since GPT-2 (2019), under the Apache 2.0 license. The 120B mixture-of-experts model matches o4-mini on reasoning benchmarks while fitting on a single H100; the 20B model runs on consumer hardware with 16GB of memory.

Key Concepts

First open-weight models since GPT-2 — Apache 2.0 licensed MoE reasoning models

gpt-oss-120b (117B parameters) and gpt-oss-20b (21B parameters) are mixture-of-experts models released under the Apache 2.0 license. Both support instruction following, tool use, and reasoning — capabilities previously locked behind OpenAI's API.
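As a toy illustration of the mixture-of-experts pattern both models use: a learned router scores experts per token, only the top-k experts run, and their outputs are mixed by the softmaxed router weights. The expert count, dimensions, and single-matrix "experts" here are made up for the sketch, not gpt-oss's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, n_experts, top_k = 8, 4, 2
# Each "expert" is reduced to a single weight matrix for brevity;
# in a real MoE layer each expert is a full feed-forward block.
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
router = rng.normal(size=(d_model, n_experts))

def moe_forward(x):
    """Route a token vector to its top-k experts and mix their outputs."""
    logits = x @ router                        # one router score per expert
    top = np.argsort(logits)[-top_k:]          # indices of the top-k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                   # softmax over selected experts only
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

y = moe_forward(rng.normal(size=d_model))
print(y.shape)  # (8,)
```

Because only top_k of the n_experts run per token, total parameter count (117B / 21B) can far exceed the compute actually spent on each forward pass.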

120B matches o4-mini on reasoning benchmarks while fitting on a single H100 GPU

The gpt-oss-120b achieves near-parity with o4-mini on core reasoning benchmarks using 4-bit quantization (MXFP4), enabling deployment on a single 80GB H100 GPU. The 20B model runs on consumer hardware with just 16GB of memory.
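The memory arithmetic behind these deployment claims can be sketched. The ~4.25 effective bits per parameter for MXFP4 (4-bit values plus a shared per-block scale) is an assumption for illustration, not a figure from the announcement:

```python
def weight_gb(n_params: float, bits_per_param: float) -> float:
    """Approximate weight memory in GB at a given precision."""
    return n_params * bits_per_param / 8 / 1e9

# MXFP4 stores 4-bit values with a shared scale per small block,
# so the effective cost is roughly 4.25 bits per parameter (assumed).
MXFP4_BITS = 4.25

print(f"gpt-oss-120b @ bf16:  {weight_gb(117e9, 16):.0f} GB")          # 234 GB, far beyond one H100
print(f"gpt-oss-120b @ MXFP4: {weight_gb(117e9, MXFP4_BITS):.0f} GB")  # ~62 GB, fits in 80 GB
print(f"gpt-oss-20b  @ MXFP4: {weight_gb(21e9, MXFP4_BITS):.0f} GB")   # ~11 GB, fits in 16 GB
```

This counts weights only; KV cache and activations add to the budget, which is why the fit is "near" rather than comfortable.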

Trained with RL and distilled from frontier models including o3

The models were trained using reinforcement learning and techniques informed by OpenAI's most advanced internal models, including o3 and other frontier systems. This marks a new approach to open releases: distilling frontier capabilities into deployable open-weight models.

Connections

Influenced by
34. GPT-5 / Codex CLI / Research Agent
Aug 2025
Influences
37. GPT-5.2 / Codex
Dec 2025