AI · Industry

Amazon's Custom AI Chips Now Generate Over $20 Billion a Year — and Jassy Hints They Could Be Sold Externally, Challenging NVIDIA

Mubboo Editorial Team

April 12, 2026 · 4 min read

Buried inside Andy Jassy's annual shareholder letter — the one everyone noticed for the $15 billion AWS AI revenue disclosure — was a second number that may matter more in the long run: Amazon's custom chip business is now generating over $20 billion in annualized revenue, growing at triple-digit rates. And Jassy hinted that Amazon may start selling those chips to third parties.

The chip business nobody expected

Amazon designs three custom silicon lines in-house. Graviton handles general compute workloads. Trainium powers AI training. Nitro manages security and networking. Together, they crossed $20 billion in annualized revenue as of Jassy's April 9 letter — doubling from the $10 billion milestone Amazon reported earlier in 2026 and growing at triple-digit year-over-year rates.
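
An "annualized revenue" figure of this kind is typically a run rate: the most recent period's revenue extrapolated to a full year. A minimal sketch of that arithmetic, using a purely hypothetical quarterly figure (Amazon does not disclose the chip lines' period revenue):

```python
def annualized_run_rate(period_revenue: float, periods_per_year: int) -> float:
    """Extrapolate one period's revenue to a full-year run rate."""
    return period_revenue * periods_per_year

# Hypothetical example: $5B of quarterly revenue implies a $20B run rate.
print(annualized_run_rate(5e9, 4))  # 20000000000.0
```

A run rate assumes the current pace holds for a full year, which is why Jassy's triple-digit growth claim matters: the actual trailing-twelve-month figure could differ substantially from the extrapolation.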

Demand is outstripping supply. Jassy disclosed that two large, unnamed customers attempted to purchase all of Amazon's available Graviton capacity for 2026. Amazon refused, choosing to keep the chips available across its broader customer base rather than lock them into exclusive deals. But the demand signal prompted a notable comment: "There's so much demand for our chips that it's quite possible we'll sell racks of them to third parties in the future," Jassy wrote.

If Amazon follows through, it enters direct competition with NVIDIA in the AI chip market — not as a startup with unproven silicon, but as a company already running these chips at hyperscale inside the world's largest cloud provider.

Why this matters for the AI ecosystem

NVIDIA dominates AI chip supply today. Amazon, Google with its TPU chips, and Microsoft with its Maia accelerators are all developing alternatives, but NVIDIA still commands the majority of AI training and inference compute. Amazon's position is different from a typical challenger because it already operates its custom chips at massive scale inside AWS. Customers do not need to build new infrastructure or rearchitect their workloads — they access Trainium and Graviton through the same AWS services they already use.

The usage numbers back up the demand story. Amazon Bedrock, the managed AI service that runs foundation models on AWS infrastructure including Trainium chips, processed more tokens in Q1 2026 than in all prior periods combined. Inference volumes were nearly doubling month-over-month through March. That growth feeds directly into custom chip utilization — every token processed on Trainium is a token not processed on NVIDIA hardware.
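
For a sense of scale, "nearly doubling month-over-month" compounds dramatically if sustained. A small illustrative sketch (the 2x monthly multiplier is an assumption taken from the phrasing above, not a disclosed figure):

```python
def compound_growth(monthly_multiplier: float, months: int) -> float:
    """Total growth factor after compounding a per-month multiplier."""
    return monthly_multiplier ** months

# Doubling every month for a full year would mean 2**12 = 4096x the volume.
print(compound_growth(2.0, 12))  # 4096.0
```

No growth curve sustains that pace for long, but even a few months of it explains why chip utilization is outrunning supply.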

The AI chip market is shifting from single-supplier dependency toward a multi-vendor ecosystem. Amazon selling chips externally would accelerate that shift by giving companies outside AWS access to silicon that has already been validated at production scale.

The financial picture behind the bet

Amazon plans approximately $200 billion in capital expenditure for 2026, the majority directed at AI infrastructure. The strain is visible in the financials: 2025 revenue grew 12 percent to $717 billion and operating income rose 17 percent to $80 billion, but free cash flow fell from $38 billion to $11 billion as capex surged. The stock rose roughly 5 percent on the day the letter was released.

Jassy addressed the spending directly: "We're not investing on a hunch." The custom chip business generating $20 billion in measurable revenue — not projections, not addressable market estimates — is the strongest evidence that Amazon's infrastructure bets are converting into real returns. The question is no longer whether Amazon can build competitive AI chips. It is whether it will sell them to the companies currently buying from NVIDIA.

Mubboo's Take

Amazon's custom chip business generating $20 billion in revenue is a signal that the cost of AI inference is headed down. When multiple companies — Amazon, Google, and eventually their third-party customers — compete on AI chip supply, the price of running AI models falls for everyone. For consumer platforms like Mubboo that depend on AI for content production, product comparison, and travel planning, cheaper inference means doing more with the same budget. The AI chip wars are not a hardware story. They are a story about how fast AI-powered consumer services become affordable enough to operate at scale.

Mubboo Editorial Team

The Mubboo Editorial Team covers the latest in AI, consumer technology, e-commerce, and travel.

Related articles

AI · Industry

Anthropic Launches Claude for Word — AI That Edits Your Contracts, Tracks Every Change, and Works Across Word, Excel, and PowerPoint Simultaneously

Anthropic released Claude for Word in public beta on April 10-11, bringing AI-native drafting and editing directly into Microsoft Word as a sidebar add-in. Every AI edit appears as a tracked change. The tool shares context across Word, Excel, and PowerPoint simultaneously — and Anthropic is explicitly targeting lawyers as its first audience.

4 min read · Apr 12, 2026
AI · Local · Industry

AI Overtakes Cybersecurity as the Number One Priority for U.S. State Government CIOs — Ending a 12-Year Reign

For the first time in two decades of tracking, AI has displaced cybersecurity as the top priority for America's state chief information officers, according to NASCIO's 2026 survey of 51 state and territory CIOs. State legislators introduced over 1,000 AI-related bills in 2025. The shift means government services — from permit applications to healthcare access — are being reshaped by AI.

4 min read · Apr 11, 2026
AI · Shopping · Industry

Amazon CEO Jassy Discloses AWS AI Revenue for the First Time: $15 Billion Run Rate and Growing 260 Times Faster Than Early AWS

In his annual shareholder letter on April 9, Amazon CEO Andy Jassy revealed AWS's AI services are generating a $15 billion annualized revenue run rate — the first time Amazon has ever disclosed AI-specific revenue. He defended the company's planned $200 billion in capital expenditure for 2026: 'We're not investing on a hunch.'

4 min read · Apr 10, 2026
AI · Industry

Anthropic Launches Claude Managed Agents — Enterprise AI Agents Go from Prototype to Production in Days Instead of Months

Anthropic launched Claude Managed Agents in public beta on April 9. The composable API suite handles sandboxing, state management, and security so developers can deploy production-grade AI agents in days. Notion, Rakuten, Asana, and Sentry are among the first to ship.

4 min read · Apr 10, 2026