- The new OpenAI models run effectively on modest hardware, but independent benchmarks across workloads are not yet available
- The models are designed for on-device use cases where large-scale infrastructure is not always available
- The Apache 2.0 license could encourage broader experimentation in regions with strict data requirements
OpenAI has published two open-weight models, GPT-OSS-120B and GPT-OSS-20B, positioning them as direct challengers to offerings like DeepSeek-R1 and other large language models (LLMs) currently shaping the AI ecosystem.
These models are now available on AWS via the Amazon Bedrock and Amazon SageMaker AI platforms.
This marks OpenAI's entry into the open-weight model segment, a space that has so far been dominated by competitors such as Mistral AI and Meta.
OpenAI and AWS
The GPT-OSS-120B model runs on a single 80 GB GPU, while the 20B version targets edge environments, requiring only 16 GB of memory.
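As a rough sanity check on those figures, a back-of-envelope sketch of weight memory helps. The arithmetic below assumes roughly 4-bit weights, a common choice for open-weight releases targeting a single GPU; the article does not state OpenAI's exact quantization scheme, so treat the bits-per-parameter value as an illustrative assumption.

```python
def weight_memory_gb(n_params: float, bits_per_param: float) -> float:
    """Approximate GPU memory needed for model weights alone, in GB.

    Ignores activations and KV cache, which add real-world overhead.
    """
    return n_params * bits_per_param / 8 / 1e9

# Illustrative assumption: ~4 bits per parameter.
print(round(weight_memory_gb(120e9, 4), 1))  # 120B params -> 60.0 GB
print(round(weight_memory_gb(20e9, 4), 1))   # 20B params  -> 10.0 GB
```

Under that assumption, 120B parameters land around 60 GB, comfortably inside an 80 GB GPU, and 20B parameters around 10 GB, consistent with a 16 GB edge device once runtime overhead is added.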
OpenAI says the two models offer solid reasoning performance, matching or exceeding its o4-mini model on key benchmarks.
However, external assessments are not yet available, leaving real-world performance across various workloads open to scrutiny.
What distinguishes these models is not only their size but also their license.
Released under Apache 2.0, they are intended to lower barriers to access and support wider AI development, especially in high-security or resource-constrained environments.
According to OpenAI, this decision aligns with its broader mission to make artificial intelligence tools more widely usable across industries and geographies.
On AWS, the models integrate with enterprise infrastructure via Amazon Bedrock AgentCore, enabling the creation of AI agents capable of executing complex workflows.
OpenAI suggests these models are suited to tasks such as code generation, scientific reasoning, and multi-step problem solving, especially where adjustable reasoning effort and chain-of-thought outputs are needed.
Their 128K context window also supports longer interactions, such as document analysis or technical support tasks.
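To put a 128K-token window in concrete terms, a quick conversion is useful. The sketch below uses the common heuristic of roughly 0.75 English words per token; that ratio is a general approximation, not a measured property of these specific models or their tokenizer.

```python
def approx_words(context_tokens: int, words_per_token: float = 0.75) -> int:
    """Rough word capacity of a context window, using a heuristic
    words-per-token ratio (an assumption, not a tokenizer measurement)."""
    return int(context_tokens * words_per_token)

# A 128K-token window holds on the order of ~96,000 English words.
print(approx_words(128_000))  # 96000
```

That is roughly the length of a full-length novel, which is why long-document analysis is a plausible fit for this window size.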
The models are also integrated into developer tooling, with support for platforms such as vLLM, llama.cpp, and Hugging Face.
With features such as guardrails and planned support for custom model import and knowledge bases, OpenAI and AWS pitch this as a ready foundation for developers building scalable AI applications.
The release is, however, partly strategic: it positions OpenAI as a key player in open-model infrastructure while tying its technology more closely to Amazon Web Services, a dominant force in cloud computing.