
ZDNET’s key takeaways
- OpenAI releases its first open-weight LLMs in six years.
- OpenAI’s smallest AI model can run on a laptop.
- Early reports indicate these new models may have trouble with hallucinations.
We all know AI relies on open-source software, but most of the big artificial intelligence (AI) companies avoid opening their code or large language model (LLM) weights. Now, things have changed. OpenAI, the AI titan behind ChatGPT, has announced a landmark return to its open-source origins.
The company has unveiled two new open-weight language models, gpt-oss-120b and gpt-oss-20b, marking the firm’s first public release of freely available AI model weights since GPT-2 in 2019, long before the AI hype took over the tech world.
Also: OpenAI could launch GPT-5 any minute now – what to expect
Open-weight models enable anyone to download, examine, run, or fine-tune the LLM, eliminating the need to rely on remote cloud APIs or to expose sensitive in-house data to external services.
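In practice, that means the smaller model can be loaded and queried entirely on your own machine. The sketch below is a minimal, illustrative example, not an official OpenAI recipe: it assumes the 20B model is published on Hugging Face under the id openai/gpt-oss-20b (check the model card for the exact id) and that your machine has enough memory to hold the weights.

```python
# Minimal local-inference sketch -- not an official OpenAI example.
# Assumes the 20B model lives on Hugging Face as "openai/gpt-oss-20b"
# and that transformers, accelerate, and enough memory are available.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",
    device_map="auto",  # spread the weights across available GPU/CPU memory
)

# No remote API call happens here: the prompt never leaves your machine.
result = generator("Summarize why open-weight models matter.", max_new_tokens=120)
print(result[0]["generated_text"])
```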
OpenAI has not, however, released the training data used for these models because of legal and safety concerns. That situation will not please open-source AI purists, but developers worldwide are already testing the two models.
Also: Google embeds AI agents deep into its data stack – here’s what they can do for you
This change contrasts with OpenAI’s approach over the past five years, during which the company prioritized proprietary releases fueled by massive Microsoft investments and lucrative API deals.
After all, you can’t hope to become a trillion-dollar AI company without maximizing your profits. On the other hand, open source has consistently demonstrated that when code is developed openly, everyone, including the company that releases the code, benefits.
The gpt-oss-120b model targets high-performance servers and desktops with beefed-up specifications, including 60 GB of VRAM and multiple GPUs, while the gpt-oss-20b version is compact enough for laptops with 16 GB of memory or more.
You can download the models from Hugging Face or GitHub. In both cases, you’ll need macOS 11 Big Sur or later, or Linux (Ubuntu 18.04 or later), to run them. The models can also run under Windows Subsystem for Linux (WSL) 2.0 on high-powered Windows systems.
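If you just want the raw weights on disk for offline auditing or fine-tuning, the huggingface_hub client can mirror the whole repository. Again, this is a sketch, and the repo id is an assumption; verify it against the official listing before running.

```python
# Sketch of mirroring the model weights for offline use; the repo id
# "openai/gpt-oss-20b" is an assumption -- confirm it on Hugging Face first.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="openai/gpt-oss-20b",
    local_dir="./gpt-oss-20b",  # local copy you can inspect or fine-tune offline
)
print(f"Model files saved to {local_dir}")
```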
Also: People are using ChatGPT to write their text messages – here’s how you can tell
OpenAI said: “The gpt-oss-120b model achieves near-parity with OpenAI o4-mini on core reasoning benchmarks, while running efficiently on a single 80 GB GPU. The gpt-oss-20b model delivers similar results to OpenAI o3‑mini on common benchmarks and can run on edge devices with just 16 GB of memory.”
So, how good are the models? AI expert Nate Jones has kicked the tires and reports, “This one is specifically aimed at retaking American dominance in open-source models now that Llama has dropped the ball. Early tests indicate a higher than usual risk of hallucination, but the power of the model is real and continues to underline how quickly AI is progressing. I’ll be watching for how quickly these models get picked up on Hugging Face by developers (who are hard to spin).”
The models are licensed under Apache 2.0, one of the most permissive open licenses. This enables enterprises and developers to use, modify, and monetize the technology without restrictive terms, unlike Meta’s not-quite-open-source Llama LLMs.
Also: Anthropic’s powerful Opus 4.1 model is here – how to access it (and why you’ll want to)
Both models employ a mixture-of-experts (MoE) architecture, which activates only a handful of expert sub-networks per token, pairing robust reasoning capabilities with efficient inference and tool use.
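As a rough illustration of the idea, and not OpenAI’s actual implementation, an MoE layer routes each token to its best-scoring experts and mixes their outputs, so most of the network sits idle on any given token. The expert count, top-k value, and sizes below are made-up toy values.

```python
# Toy mixture-of-experts (MoE) routing sketch -- illustrative only.
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # assumed toy value; the real models' expert count may differ
TOP_K = 2         # only a few experts run per token, which saves compute
D_MODEL = 16      # toy hidden size

# Toy "experts": each is just a small weight matrix here.
experts = [rng.normal(size=(D_MODEL, D_MODEL)) for _ in range(NUM_EXPERTS)]
router = rng.normal(size=(D_MODEL, NUM_EXPERTS))

def moe_layer(token: np.ndarray) -> np.ndarray:
    """Route one token through its top-k experts and mix their outputs."""
    logits = token @ router                  # router score for each expert
    top = np.argsort(logits)[-TOP_K:]        # indices of the k best experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                 # softmax over the chosen experts
    # Only the selected experts do any work; the rest are skipped entirely.
    return sum(w * (token @ experts[i]) for w, i in zip(weights, top))

out = moe_layer(rng.normal(size=D_MODEL))
print(out.shape)  # (16,)
```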
Programmers will be interested in the models’ code execution capabilities, while writers and researchers will appreciate that web search can be folded into the models’ reasoning process. On the other hand, early reports show high rates of hallucination. Additionally, both models are limited to processing text.
Also: My go-to LLM tool just dropped a super simple Mac and PC app for local AI
Why has OpenAI made this move? The company explicitly stated that these open releases aim to lower barriers in emerging markets and among smaller organizations.
The company has also noticed that DeepSeek, the Chinese model released in January, made waves thanks to its speed, power, and open-source release. As OpenAI CEO Sam Altman said in a Reddit “Ask Me Anything” shortly after DeepSeek caught everyone’s attention, he believed OpenAI had been “on the wrong side of history” by not open-sourcing its software.
Now, on the eve of the GPT-5 release, OpenAI is on the right side again.