All Major Announcements

AWS re:Invent 2025 has arrived, and it's shaping up to be one of the most transformative cloud conferences in Amazon Web Services history. Running from December 1-5 at various venues across the Las Vegas Strip, including the Venetian and Caesars Forum, this year's conference brings together tens of thousands of developers, cloud architects, business leaders, and tech enthusiasts from around the globe.
The theme dominating this year's event is unmistakably Agentic AI – autonomous AI systems that can reason, make decisions, and take action independently. With over 2,300 specialized learning sessions, five major keynotes, and groundbreaking product announcements, AWS is making a bold statement about the future of cloud computing and artificial intelligence.
This comprehensive guide covers all the major announcements and innovations unveiled at re:Invent 2025, organized by category to help you understand how these developments can transform your business.
Agentic AI: The Revolution is Here
The central theme of re:Invent 2025 is Agentic AI, representing a fundamental shift from AI that merely responds to prompts to AI that can autonomously plan, execute, and complete complex multi-step tasks. AWS CEO Matt Garman emphasized during his keynote that this technology could unlock billions in productivity gains across industries ranging from healthcare to finance.
Amazon Bedrock AgentCore
Amazon Bedrock AgentCore has emerged as the cornerstone of AWS's agentic AI strategy. This comprehensive platform includes seven core services designed to help enterprises deploy and operate secure AI agents at scale:
- AgentCore Runtime: Managed compute environment for running AI agents with consumption-based pricing (see the invocation sketch after this list)
- AgentCore Gateway: Integration layer that transforms existing APIs, Lambda functions, and services into agent-compatible tools
- AgentCore Browser: Enables AI agents to interact with web interfaces programmatically
- AgentCore Code Interpreter: Allows agents to write and execute code for complex problem-solving
- AgentCore Memory: Short-term and long-term memory management for context-aware interactions
- AgentCore Observability: Real-time monitoring and debugging with OpenTelemetry compatibility
- Model Context Protocol (MCP) Support: Integration with services like Amazon EKS for context-aware Kubernetes workflows and secure agent-to-agent communication
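To make the Runtime piece concrete, here is a minimal sketch of invoking an agent hosted on AgentCore Runtime from Python. The boto3 client name, the invoke_agent_runtime operation, the agent ARN, and the payload/response shapes are assumptions based on the AgentCore data-plane API as currently exposed in the SDK and may differ in your version:

```python
# Minimal sketch: invoking an agent hosted on Bedrock AgentCore Runtime with boto3.
# Client/operation names reflect the AgentCore data-plane API as exposed by recent
# boto3 releases and may evolve; the ARN and payload shape are placeholders.
import json
import uuid
import boto3

client = boto3.client("bedrock-agentcore", region_name="us-east-1")

response = client.invoke_agent_runtime(
    agentRuntimeArn="arn:aws:bedrock-agentcore:us-east-1:111122223333:runtime/my-agent",  # placeholder
    runtimeSessionId=str(uuid.uuid4()),  # ties multi-turn calls to one session
    payload=json.dumps({"prompt": "Summarize yesterday's support tickets."}).encode(),
)

# The runtime streams back whatever your agent returns (assumed JSON here);
# the "response" field name follows current SDK documentation.
print(response["response"].read().decode())
```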
AWS Transform with Agentic AI
AWS Transform has received significant agentic AI enhancements that help companies modernize virtually any code and application, including code written in custom, organization-specific programming languages. Key capabilities include:
- Full-stack Windows modernization across .NET apps, SQL Server, UI frameworks, and deployment layers
- Up to 70% reduction in maintenance and licensing costs
- Early proof point: Air Canada has already used the service to modernize thousands of Lambda functions in just days, achieving an 80% reduction in time and cost
Amazon Connect Agentic Self-Service
Amazon Connect, AWS's cloud contact center service that recently crossed $1 billion in annual revenue, is receiving a major agentic AI upgrade. The new capabilities enable AI agents to understand, reason, and act across both voice and messaging channels.
Using advanced speech models, these agents can now speak with natural pacing and tone, and they collaborate with human agents rather than replace them. The system listens to calls in real time and actively helps human representatives by preparing documents or suggesting next steps.
Real-world impact: Lyft has achieved an 87% reduction in average resolution time for customer and driver support requests, with more than half resolved in less than three minutes.
AI Models and Services
Amazon Bedrock Enhancements
Amazon Bedrock continues to evolve as AWS's premier managed service for building and scaling generative AI applications. Key announcements include:
- New Service Tiers: Priority, Standard, and Flex tiers allow organizations to optimize AI workload costs by matching performance requirements with pricing
- Marengo 3.0 on Bedrock: TwelveLabs' video foundation model that understands full scenes, turning previously unusable video archives into searchable, structured insight. AWS is the first cloud provider to offer this model
- Amazon Nova Multimodal Embeddings: Industry's first embedding model supporting text, documents, images, video, and audio through a single unified model
- Claude Sonnet 4.5: Anthropic's latest model, with advanced coding capabilities and agentic AI features, now available in Bedrock (see the invocation sketch after this list)
- Amazon Nova Web Grounding: Built-in tool for Nova models that automatically retrieves and grounds responses with web content
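As a quick illustration of how these models are consumed, the sketch below calls Claude Sonnet 4.5 through the Bedrock Converse API with boto3. The model ID is a placeholder, since the exact identifier (or cross-Region inference profile) depends on your account and Region:

```python
# Minimal sketch: calling Claude Sonnet 4.5 through the Amazon Bedrock Converse API.
# The model ID below is a placeholder -- check the Bedrock model catalog for the
# exact identifier or inference profile available in your Region.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="anthropic.claude-sonnet-4-5-...",  # placeholder model ID
    messages=[
        {"role": "user", "content": [{"text": "Draft a migration plan for a .NET 4.8 service."}]}
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```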
Amazon SageMaker AI Updates
Amazon SageMaker AI continues to receive significant enhancements:
- One-click onboarding with notebooks featuring built-in AI agents in Amazon SageMaker Unified Studio
- New business metadata features in Amazon SageMaker Catalog to improve discoverability
- Deepgram integration for streaming speech-to-text, text-to-speech, and voice agent capabilities with sub-second latency
- Simplified developer access with the new 'aws login' command
Amazon CloudWatch AI Observability
Amazon CloudWatch now offers comprehensive observability for generative AI applications and agents, providing built-in insights into latency, token usage, and errors across your AI stack. This capability works seamlessly with Amazon Bedrock AgentCore and is compatible with open-source agentic frameworks like LangChain, LangGraph, and CrewAI.
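Because the observability surface is OpenTelemetry-compatible, instrumented agent code can emit standard traces and let a collector forward them to CloudWatch. The sketch below shows generic OpenTelemetry instrumentation in Python; the local collector endpoint, the pipeline into CloudWatch (for example via the AWS Distro for OpenTelemetry), and the specific span attributes are assumptions for illustration:

```python
# Minimal sketch: emitting OpenTelemetry traces from an agent app so a collector
# (e.g., the AWS Distro for OpenTelemetry) can forward them to CloudWatch.
# The collector endpoint and span attribute values are illustrative assumptions.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces"))  # local collector
)
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("my-agent-app")

with tracer.start_as_current_span("agent.invocation") as span:
    # Record the token and model signals that GenAI observability views key on.
    span.set_attribute("gen_ai.request.model", "claude-sonnet-4-5")
    span.set_attribute("gen_ai.usage.input_tokens", 812)
    span.set_attribute("gen_ai.usage.output_tokens", 154)
```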
Compute and Infrastructure
Custom Silicon: Trainium and Project Rainier
AWS continues to invest heavily in custom AI silicon. Trainium2 is now fully subscribed and has become a multi-billion dollar business growing 150% quarter-over-quarter. Key developments include:
- Project Rainier: One of the world's largest AI compute clusters, now operational with nearly half a million Trainium2 chips, with plans to scale to over one million chips by the end of 2025
- Trainium3 Preview: Next-generation AI chip expected in late 2025, featuring 40% better performance and energy efficiency, built on a cutting-edge 3nm process
- Trn2 UltraServers: Capable of scaling up to 83.2 peak petaflops, designed for training AI models with over a trillion parameters
- Anthropic Partnership: Deep collaboration with Anthropic, which has chosen AWS as its primary cloud provider for training its Claude models
AWS Lambda Managed Instances
A significant serverless innovation, AWS Lambda Managed Instances allows customers to run Lambda functions on EC2 compute while maintaining serverless simplicity. This enables access to specialized hardware and cost optimizations through EC2 pricing models, with AWS handling all infrastructure management.
Amazon EKS Capabilities
New Amazon EKS capabilities for workload orchestration and cloud resource management streamline Kubernetes development on a fully managed platform. Key features include:
- New Provisioned Control Plane for enhanced performance
- Fully managed MCP servers (preview)
- Enhanced AI-powered troubleshooting in the console
- Support for ultra-scale clusters of up to 100,000 nodes
New EC2 Instance Types
AWS announced the EC2 P6-B300 instances for accelerating large-scale AI applications, along with new instance types featuring Intel Xeon Scalable (Granite Rapids), AMD EPYC (Turin), and AWS Graviton processors.
Networking and Multicloud
AWS Interconnect - Multicloud with Google Cloud
In a groundbreaking move, AWS and Google Cloud have jointly engineered a multicloud networking solution that enables customers to establish private, high-bandwidth connectivity between the two cloud providers. This represents a significant evolution in cloud competition and collaboration.
Key features of AWS Interconnect - multicloud:
- Fully managed, cloud-to-cloud experience provisioned quickly through the AWS Management Console or API
- Pre-built capacity pools allowing organizations to create connections and adjust bandwidth as needed
- Built-in resiliency and streamlined support
- Open API package published on GitHub for other service providers to adopt
Amazon Route 53 Global Resolver
Now in preview, Amazon Route 53 Global Resolver provides secure anycast DNS resolution. It simplifies hybrid DNS management with a unified service that resolves public and private domains globally, reducing operational overhead while maintaining consistent security controls.
Security and Developer Tools
IAM Policy Autopilot
AWS has released IAM Policy Autopilot, a new open-source MCP server that analyzes code to generate valid IAM policies. This tool provides AI coding assistants with up-to-date AWS service knowledge and reliable permission recommendations, significantly speeding up AWS development.
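For a sense of what such a tool produces, the sketch below pairs a snippet of application code with the kind of narrowly scoped policy a code-analysis tool like IAM Policy Autopilot aims to generate. The bucket name and the exact policy output are illustrative assumptions, not captured tool output:

```python
# Illustration only: the kind of least-privilege policy a code-analysis tool aims to
# derive. The bucket name and the exact policy a real run would produce are assumptions.
import json
import boto3

def fetch_report(bucket: str, key: str) -> bytes:
    s3 = boto3.client("s3")
    return s3.get_object(Bucket=bucket, Key=key)["Body"].read()

# For the single s3:GetObject call above, a generated policy would look roughly like:
suggested_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": ["arn:aws:s3:::example-reports-bucket/*"],  # placeholder bucket
        }
    ],
}
print(json.dumps(suggested_policy, indent=2))
```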
AWS Clean Rooms Privacy Enhancement
AWS Clean Rooms now supports privacy-enhancing synthetic dataset generation for ML model training. Organizations can train ML models on sensitive collaborative data by generating synthetic datasets that preserve statistical patterns while protecting individual privacy through configurable noise levels and protection against re-identification.
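The core idea behind configurable noise is easy to illustrate outside of Clean Rooms itself. The sketch below (plain NumPy, not the Clean Rooms API) adds calibrated noise to an aggregate so the released statistic stays useful while individual contributions are harder to pin down; the epsilon knob and the function are purely conceptual:

```python
# Conceptual sketch only (not the AWS Clean Rooms API): adding configurable noise to
# an aggregate so released statistics preserve overall patterns while blunting
# re-identification. Epsilon is an illustrative "noise level" knob.
import numpy as np

rng = np.random.default_rng(seed=7)

def noisy_count(true_count: int, epsilon: float) -> float:
    """Smaller epsilon -> more noise -> stronger privacy, less accuracy."""
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

print(noisy_count(1_204, epsilon=0.5))  # heavier noise
print(noisy_count(1_204, epsilon=5.0))  # lighter noise
```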
Kiro: Generally Available
Kiro, the first AI coding tool built around spec-driven development, is now generally available. Since its preview release, over 250,000 developers have embraced the tool. The GA launch introduces:
- Property-based testing for spec correctness (an illustrative example follows this list)
- New checkpointing capabilities
- Kiro CLI bringing agents to your terminal
- Enhanced agentic workflows for structured development
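Kiro's own spec-testing internals aren't documented here, but the general shape of property-based testing is worth seeing: rather than enumerating cases, you assert an invariant over generated inputs. The sketch below uses the open-source hypothesis library with a stand-in discount function as the "spec"; it is illustrative, not Kiro output:

```python
# Flavor of property-based testing in general (not Kiro's internal implementation):
# assert a property over generated inputs instead of hand-picked cases.
# Uses the `hypothesis` library; the discount function is a stand-in for spec'd behavior.
from hypothesis import given, strategies as st

def apply_discount(price_cents: int, percent: int) -> int:
    return price_cents - (price_cents * percent) // 100

@given(
    st.integers(min_value=0, max_value=10_000_000),
    st.integers(min_value=0, max_value=100),
)
def test_discount_never_increases_price_or_goes_negative(price, percent):
    discounted = apply_discount(price, percent)
    assert 0 <= discounted <= price
```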
Strategic Partnerships
Visa Intelligent Commerce
Visa and AWS announced a major collaboration to enable AI agents to securely complete multi-step transactions, from shopping to price tracking to payments. The companies will publish open blueprints on the Amazon Bedrock AgentCore repository for retail shopping, travel booking, and payment reconciliation.
Partners reviewing blueprint designs include Expedia Group, Intuit, lastminute.com, and Eurostars Hotel Company. The collaboration envisions use cases like instructing an AI agent to "Buy me basketball game tickets if the price drops below $150."
BlackRock Aladdin on AWS
BlackRock confirmed that Aladdin, its industry-recognized investment management technology platform, will run on AWS infrastructure for US enterprise clients starting in the second half of 2026. This gives financial institutions greater flexibility in deploying risk modeling, analytics, and investment decision-making tools.
Other Key Partnerships
- Nissan: Deploying its Nissan Scalable Open Software Platform on AWS, achieving 75% faster testing with over 5,000 developers collaborating globally
- Deepgram: Integrating enterprise speech AI into SageMaker, Amazon Connect, and Amazon Lex
- Trane Technologies: Using AI to achieve nearly 15% energy reductions at Amazon Grocery fulfillment sites
- S&P Global: Using MCP integrations to enable clients to query complex financial data using AI agents
- CrowdStrike: Enhanced Falcon Next-Gen SIEM tool offered via AWS Marketplace with simplified deployment
- OpenAI: Multi-year strategic partnership with a $38 billion, 7-year commitment to run and scale workloads on AWS
Infrastructure Investments
AWS is making massive infrastructure investments to support AI workloads:
- $15 Billion Indiana Investment: Building new data center campuses in Northern Indiana to advance AI innovation
- $50 Billion Government Infrastructure: Expanding AI and supercomputing infrastructure for US government agencies, providing access to Amazon SageMaker AI, Amazon Bedrock, and Amazon Nova
- Fastnet Transatlantic Cable: Dedicated high-capacity cable connecting the US and Ireland
- Power Expansion: Added over 3.8 gigawatts of power in the past 12 months, with plans to double capacity by 2027
- Custom Liquid Cooling: Designed a completely custom liquid cooling system in just 11 months to support denser, more powerful AI chips
Customer Success Stories
Real-world deployments demonstrate the transformative potential of AWS's new capabilities:
- Lyft: 87% reduction in support resolution time using Claude-powered intent agents
- Zepz: 30% contact deflection while processing $16 billion in transactions
- TUI Group: Migrated 10,000 agents across 12 European markets, cutting operating costs by 10%
- UC San Diego Health: Integrated Epic EHR for self-service patient authentication
- Air Canada: Modernized thousands of Lambda functions in days with 80% time and cost reduction
Additional Announcements
AWS Partner Central in Console
AWS Partner Central is now available directly in the AWS Management Console, allowing partners to manage solutions, opportunities, and marketplace listings in one unified interface with enterprise-grade security.
Amazon Quick Suite
Amazon Quick Suite is a new agentic AI application designed to cut through fragmented information, siloed applications, and repetitive tasks. S&P Global clients can now query complex financial and energy data using AI agents embedded inside Quick Suite.
AWS Marketplace Updates
AWS Marketplace is adding AI-powered search and flexible pricing models to help customers piece together AI solutions from multiple vendors, making it easier to build comprehensive AI stacks.
Agentic AI Competency Program
AWS has launched a new "Agentic AI" competency program for partners, designed to recognize firms building autonomous systems rather than simple chatbots.
Key Takeaways for 2025 and Beyond
- Agentic AI is Production-Ready: Companies are already deploying AI agents that reason, decide, and act autonomously, with measurable ROI across industries
- Custom Silicon is Strategic: Trainium is becoming central to AWS's AI strategy, with Trainium3 promising even better price-performance ratios
- Multicloud is Embraced: The AWS-Google Cloud partnership signals that interoperability is becoming a competitive advantage, not a weakness
- Enterprise AI is Mainstream: With companies like BlackRock, Visa, and Nissan making major commitments, enterprise AI adoption is accelerating
- Developer Experience Matters: Tools like Kiro, IAM Policy Autopilot, and enhanced observability show AWS's commitment to making AI development accessible
