
AWS doubles up on enterprise AI with Anthropic, OpenAI deals

AWS has expanded its AI offerings with Anthropic’s Claude 4 on Bedrock and OpenAI’s open weight models on Bedrock and SageMaker; they join a stable of closed and open LLMs on its cloud environment.

In sum – what to know:

Model variety – AWS is now offering Anthropic’s proprietary Claude 4 models and OpenAI’s new open models.
Model placement – they are all available in Bedrock; the open models are also in its SageMaker workshop.
Model advancement – the new additions support generative and agentic AI workflows for enterprises.

AWS has expanded its AI model offerings with twin deals – announced a day apart (August 5 and 6) – with Anthropic and OpenAI, offering the former’s latest Claude 4 models on Amazon Bedrock and the latter’s new “open weight” models via both Bedrock and SageMaker AI. AWS is doing its bit to position itself as the go-to cloud AI platform for enterprises to mix and match large language models (LLMs), open or proprietary, to build secure and scalable AI. 

The arrangement with Anthropic covers its latest proprietary Claude 4 models (Claude Opus 4.1 and Claude Sonnet 4) which are now available on the cloud firm’s Bedrock plug-and-play AI service, accessed as managed services via Bedrock’s API. The parallel deal covers OpenAI’s new open weight models (gpt-oss-120b and gpt-oss-20b) through both Bedrock and its SageMaker AI workshop platform. 
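For a sense of what “accessed as managed services via Bedrock’s API” looks like in practice, here is a minimal sketch that calls a Claude model through the Bedrock runtime Converse API using boto3; the model ID, region, prompt, and inference settings are illustrative assumptions, so check the Bedrock console for the exact identifiers available in your account.

```python
# Minimal sketch: invoking a Claude model via the Bedrock runtime Converse API.
import boto3

# Region and model ID are illustrative; some regions require an inference-profile
# prefix (e.g. "us.") on the model ID -- verify in the Bedrock console.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="anthropic.claude-sonnet-4-20250514-v1:0",  # assumed identifier
    messages=[
        {"role": "user", "content": [{"text": "Summarize this quarter's support tickets."}]}
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.3},
)

# The Converse API returns the assistant message under output.message.content.
print(response["output"]["message"]["content"][0]["text"])
```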

Both deals mean customers can build and run generative AI apps directly on AWS. The company already offers open models on Bedrock from DeepSeek, Meta, and Mistral AI as well. By comparison, Microsoft Azure has an exclusive with OpenAI’s closed models (GPT-4o, o1, etc.), plus in-house Microsoft 365 integration, and Google Cloud is pushing its own family of models (Gemini) and its strength in search and data, plus its Vertex AI for multi-model access. 

Take your pick (from the big three), but AWS claims broad variety, mixing proprietary, open-weight, and hybrid models from leading AI providers. Opus 4.1 is billed as Anthropic’s most powerful model; Claude Sonnet 4 balances performance and cost for everyday tasks. Both support 200K-token context, hybrid reasoning, and advanced agentic capabilities. OpenAI’s gpt-oss-120b and gpt-oss-20b, says their maker, offer advanced reasoning for agentic workflows. 

They support 128K-token context, tool use, and real-time referencing. Closed models are more powerful and optimized for specific tasks. Open models give more flexibility on the grounds that they can be deployed and modified on private infrastructure. Enterprises tend to opt for the former to protect IP and maintain control; they often go for open models for customization and research. SageMaker is where open models are pulled apart and played with.
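As an illustration of that “pulled apart and played with” route, the sketch below deploys an open-weight checkpoint to a SageMaker endpoint using the Hugging Face LLM serving container; the Hub model ID, GPU count, container version, and instance type are assumptions and should be verified against current SageMaker documentation and the model’s hardware requirements.

```python
# Minimal sketch: hosting an open-weight model on a SageMaker real-time endpoint.
import sagemaker
from sagemaker.huggingface import HuggingFaceModel, get_huggingface_llm_image_uri

role = sagemaker.get_execution_role()  # or pass an explicit IAM role ARN

model = HuggingFaceModel(
    role=role,
    image_uri=get_huggingface_llm_image_uri("huggingface"),  # TGI serving image
    env={
        "HF_MODEL_ID": "openai/gpt-oss-20b",  # assumed Hugging Face Hub identifier
        "SM_NUM_GPUS": "4",                   # shard across the instance's GPUs
    },
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.12xlarge",  # illustrative; size to the model's memory needs
)

print(predictor.predict({"inputs": "Draft a short outline for an incident postmortem."}))
```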

Bedrock hosts a portfolio of more than 100 models from leading AI companies, including fully managed cloud exclusives from the likes of Luma AI, TwelveLabs, and Writer. Bedrock customers include DoorDash, GoDaddy, Lonely Planet, LexisNexis Legal & Professional, Pfizer, the PGA TOUR, Rocket Companies, Siemens, and TUI – “to modernize business processes, develop next-generation applications, and transform their enterprises”.

AWS said of the Anthropic tie-up: “Claude Opus 4.1 operates like a brilliant detail-oriented collaborator that aces in agentic search and research, content creation, and memory and context management, allowing for comprehensive insight synthesis, high-quality content production, and effective summarization. Claude Sonnet 4 is efficient, creating a perfect blend of quick thinking and practical intelligence.”

Kate Jensen, head of growth and revenue at Anthropic, commented: “[These models] transform AI from a tool into a collaborator… Customers will see project timelines shrink – in many cases from weeks to hours. [They] set new standards in coding, advanced reasoning, and multi-step workflows… The real breakthrough is freeing your talent for strategic work while Claude handles the heavy lifting.”

Atul Deo, director of product at AWS, said of the OpenAI arrangement: “Open weight models are an important area of innovation in the future development of generative AI technology, which is why we have invested in making AWS the best place to run them… The addition of OpenAI… marks a natural progression… The unmatched size of our customer base marks a transformative shift in access to OpenAI’s advanced technology.”

Dmitry Pimenov, product lead at OpenAI, said: “Our open weight models help developers – from solo builders to large enterprise teams – unlock new possibilities across industries and use cases. Together with AWS, we’re providing powerful, flexible tools that make it easier than ever for customers to build, innovate, and scale.”

ABOUT AUTHOR

James Blackman
James Blackman has been writing about the technology and telecoms sectors for over a decade. He has edited and contributed to a number of European news outlets and trade titles. He has also worked at telecoms company Huawei, leading media activity for its devices business in Western Europe. He is based in London.