On the heels of announcing Nova Forge, a service for training custom Nova AI models, Amazon Web Services (AWS) has unveiled more tools for enterprise customers to build their own frontier models.
AWS announced new features for Amazon Bedrock and Amazon SageMaker AI at the AWS re:Invent conference on Wednesday. These new features are designed to make it easier for developers to build and fine-tune custom large language models (LLMs).
Ankur Mehrotra, general manager of AI platforms at AWS, told TechCrunch in an interview that the cloud provider is introducing serverless model customization to SageMaker, allowing developers to start building models without having to think about compute resources or infrastructure.
To access these serverless model-building capabilities, developers can follow a self-guided point-and-click path or an agent-driven experience that lets them prompt SageMaker using natural language. The agent-driven experience is launching in preview.
“If you are a healthcare customer and you want a model that can better understand certain medical terminology, all you have to do is point it to SageMaker AI. If your data is labeled, you select a method and SageMaker launches and fine-tunes the model,” Mehrotra said.
The feature can be used to customize Amazon’s own Nova models as well as certain open models, such as DeepSeek and Meta’s Llama, whose model weights are publicly available.
AWS is also launching Reinforcement Fine-Tuning in Bedrock, which lets developers choose between reward functions and preconfigured workflows; Bedrock then runs the model customization process from start to finish automatically.
Frontier LLMs (a term for state-of-the-art AI models) and model customization appear to be areas of focus for AWS at this year’s conference.
During CEO Matt Garman’s keynote on Tuesday, AWS announced Nova Forge, a service through which AWS builds custom Nova models for enterprise customers for $100,000 per year.
“Many of our customers are asking, ‘How can we differentiate when the same model is available to our competitors?'” Mehrotra said. “‘How can we build a unique solution that is optimized for our brand, our data, our use case? And how can we differentiate ourselves?’ What we’ve discovered is that the key to solving that problem is being able to create customized models.”
AWS has not yet gained a substantial user base for its own AI models. Menlo Ventures’ July survey found that businesses significantly prefer models from Anthropic, OpenAI, and Google’s Gemini over alternatives. However, the ability to customize and fine-tune LLMs could start to give AWS a competitive advantage.
