Latest with AI
Thursday, April 16, 2026
Breaking

Anthropic Begins Exploring Custom AI Chip Design as Revenue Surges Past $30 Billion

Anthropic is in the early stages of exploring custom AI chip design, joining Meta, OpenAI, and Apple in a broader industry movement toward proprietary silicon as the company's annualized revenue run rate surpasses $30 billion.


Anthropic is exploring the possibility of designing its own artificial intelligence chips, according to a Reuters report citing people familiar with the matter — joining a growing wave of AI companies moving toward custom silicon to reduce dependence on Nvidia and gain greater control over their hardware stack. The effort remains in its earliest stages, but it signals that Anthropic’s rapid growth is pushing the company toward vertically integrating its infrastructure in ways that were unthinkable just a year ago.

A Fast-Growing Appetite for Compute

The exploration comes as Anthropic’s business has scaled at a breakneck pace. The company’s annualized revenue run rate now exceeds $30 billion — up from roughly $9 billion at the end of 2025 — driven primarily by the popularity of its Claude Code coding agents and enterprise tools. That growth creates an obvious strategic rationale: at sufficient scale, the savings from owning your compute infrastructure can outweigh the flexibility of renting it from others.

The company has not committed to a specific chip design or assembled a dedicated team for the project, and sources indicate Anthropic may ultimately decide to continue purchasing chips rather than building its own. A company spokesperson declined to comment.

An Industry-Wide Trend

Anthropic’s internal discussions echo a broader push across the AI industry. Meta has been developing its own Training and Inference Accelerator chips. OpenAI is working with Broadcom on a custom processor targeting mass production this year. Apple has its Baltra AI server chip in advanced development. The pattern is clear: companies that reach sufficient scale in AI workloads eventually find it economical to optimize their hardware rather than rely on general-purpose solutions.

Building the Infrastructure Layer

Anthropic’s most significant recent compute commitment came earlier this month: a long-term deal with Google and Broadcom for approximately 3.5 gigawatts of next-generation TPU capacity expected to come online starting in 2027. Combined with its multiyear agreement with CoreWeave and partnerships across AWS and Microsoft Azure, Anthropic is assembling one of the most diversified AI compute portfolios in the industry — a foundation on which any eventual in-house silicon effort would build.