Dylan Patel — The Single Biggest Bottleneck to Scaling AI Compute
TL;DR
Dylan Patel explains that Big Tech's $600B in CapEx represents multi-year pre-purchases of power and data centers stretching through 2029, while AI labs face an immediate crunch: Anthropic's conservative compute strategy forces it to pay massive spot-market premiums, whereas OpenAI's aggressive long-term contracting has already locked in capacity.
⚡ CapEx Deployment Reality
Multi-year pre-purchasing dominates Big Tech spending
The $600 billion Big Tech CapEx includes turbine deposits for 2028-2029 and data center construction for 2027, meaning much of the spending secures future capacity rather than immediate deployment.
Only 20 gigawatts deploys this year despite massive investment
Approximately 20 gigawatts of AI compute capacity will come online in the US this year, far below the roughly 50 gigawatts implied by treating the full $600 billion spend as capacity deployed this year; the back-of-envelope sketch below shows where the gap comes from.
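A rough way to see the gap is to divide the announced spend by an assumed all-in cost per gigawatt and compare it to what actually lands this year. The $12 billion-per-gigawatt figure below is an illustrative assumption, not a number quoted in the episode.

```python
# Back-of-envelope sketch of the gap between announced CapEx and capacity
# actually deployed this year. The cost-per-gigawatt figure is an assumed
# illustrative number, not one quoted in the episode.
BIG_TECH_CAPEX_B = 600      # announced Big Tech CapEx, in $ billions
ASSUMED_COST_PER_GW_B = 12  # assumed all-in cost per gigawatt, in $ billions
ACTUAL_GW_THIS_YEAR = 20    # capacity actually coming online in the US this year

implied_gw = BIG_TECH_CAPEX_B / ASSUMED_COST_PER_GW_B
future_share = 1 - ACTUAL_GW_THIS_YEAR / implied_gw

print(f"Capacity implied if all CapEx deployed now: {implied_gw:.0f} GW")
print(f"Capacity actually coming online this year:  {ACTUAL_GW_THIS_YEAR} GW")
print(f"Share of spend securing future years:       {future_share:.0%}")
```

Under these assumptions roughly 60% of the spend is buying capacity for later years, which matches the claim that deposits and construction dominate the headline number.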
Power agreements lock in years ahead
Hyperscalers like Google are placing down payments on power purchase agreements and infrastructure years in advance to secure the fastest possible scaling trajectory.
🏃 The Compute Scramble
Anthropic faces a multi-gigawatt shortfall by year-end
Anthropic currently operates 2-2.5 gigawatts but requires 5-6 gigawatts by December to support projected revenue growth and research needs, while OpenAI has secured significantly more capacity through aggressive deals.
Conservative strategy forces premium spot market buying
Anthropic must now source capacity from lower-tier neoclouds and spot markets at rates up to $2.40 per hour against a roughly $1.40 cost basis, paying a premium of around 70% over cost (a gross margin of roughly 40% for the provider) for last-minute access; the arithmetic is sketched below.
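A minimal sketch of that premium, using only the per-GPU-hour figures cited above:

```python
# Spot-market premium arithmetic using the per-GPU-hour rates cited above.
spot_rate = 2.40    # $/GPU-hour on spot markets / lower-tier neoclouds
cost_basis = 1.40   # $/GPU-hour approximate provider cost basis

buyer_premium = (spot_rate - cost_basis) / cost_basis    # markup the buyer pays over cost
provider_margin = (spot_rate - cost_basis) / spot_rate   # gross margin the provider keeps

print(f"Buyer's premium over cost basis: {buyer_premium:.0%}")   # ~71%
print(f"Provider's gross margin:         {provider_margin:.0%}")  # ~42%
```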
OpenAI's funding validates aggressive pre-commitment
OpenAI's $110 billion raise validated its strategy of signing long-term contracts across Microsoft, Google, Amazon, CoreWeave, Oracle, and SoftBank, while Anthropic's conservative $30 billion approach leaves it capacity-constrained.
📈 GPU Value & Depreciation 3 insights
GPU value appreciates rather than depreciates
Contrary to bear predictions of rapid obsolescence, H100 GPUs may increase in value over time because newer models like GPT-5.4 extract significantly more useful work from the same chip than earlier models could.
Supply constraints override technology curves
Rental prices for H100s remain elevated at $2.00-$2.40 per hour despite newer Blackwell chips entering the market because deployment bottlenecks limit actual supply while demand grows exponentially.
AGI would collapse depreciation timelines
If artificial general intelligence is achieved, the value generated by individual GPUs could justify payback periods of just months rather than years, fundamentally altering standard technology depreciation models; the sketch below works through the payback arithmetic.
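To see why capability gains compress depreciation, a simple payback-period calculation helps. The GPU price, utilization, and hypothetical value-per-hour figures below are illustrative assumptions, not numbers from the episode; only the rental rate is taken from the range cited above.

```python
# Illustrative payback-period arithmetic for a single accelerator.
# All inputs are assumptions chosen for round numbers, not episode figures.
gpu_cost = 30_000        # assumed all-in cost of one H100-class GPU, $
rental_rate = 2.20       # $/GPU-hour, mid-range of the rates cited above
utilization = 0.85       # assumed fraction of hours the GPU is actually rented
HOURS_PER_MONTH = 24 * 30

def payback_months(value_per_hour: float) -> float:
    """Months until cumulative rented-hour value covers the GPU's cost."""
    return gpu_cost / (value_per_hour * utilization * HOURS_PER_MONTH)

print(f"At today's rental rates: ~{payback_months(rental_rate):.0f} months")

# If more capable models raise the economic value of each GPU-hour,
# the payback window shrinks toward months or weeks.
for value in (5, 10, 20):
    print(f"At ${value}/GPU-hour of realized value: ~{payback_months(value):.1f} months")
```

Under these assumptions the payback period falls from roughly two years at current rental rates to a few months once each GPU-hour generates several dollars of value, which is the sense in which AGI-level capability would collapse standard depreciation timelines.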
Bottom Line
AI labs must commit aggressively to long-term compute contracts immediately or risk paying prohibitive premiums for spot capacity, as hardware value appreciates with model capabilities rather than depreciating on standard technology curves.
More from Dwarkesh Patel
David Reich – Why the Bronze Age was an inflection point in human evolution
Geneticist David Reich reveals that contrary to decades of evolutionary theory, natural selection has been rampant in human populations over the last 10,000 years, with the Bronze Age triggering an unprecedented acceleration in genetic adaptation to immune and metabolic challenges.
The math behind how LLMs are trained and served – Reiner Pope
Reiner Pope explains the mathematical mechanics behind LLM inference costs, demonstrating how 'Fast Mode' APIs charge premiums for smaller batch sizes that reduce latency, and why physical memory bandwidth constraints create hard limits on how fast or cheap inference can get regardless of budget.
Jensen Huang – TPU competition, why we should sell chips to China, & Nvidia’s supply chain moat
Jensen Huang explains how Nvidia's 'electrons to tokens' full-stack ecosystem and massive supply chain commitments create a durable moat against commoditization and TPU competition, while arguing that AI agents will exponentially increase software tool usage rather than replace it.
Michael Nielsen – How science actually progresses
Michael Nielsen dismantles the pop-science narrative of linear scientific progress through crisp experiments, revealing instead a messy, decentralized process where mathematical formalism often precedes conceptual understanding, expertise can blind researchers to truth, and communities adopt paradigm shifts long before experimental closure.