Dario Amodei — The highest-stakes financial model in history

| Podcasts | February 13, 2026 | 899K views | 2:22:20

TL;DR

Dario Amodei argues that AI capabilities are still tracking the expected exponential curve and are nearing the end of that rapid-growth phase: models will likely reach expert-level coding within one to two years and 'country of geniuses' level capabilities within ten, even as public attention has drifted from this trajectory.

🧮 Scaling Laws & The 'Big Blob of Compute' (3 insights)

Seven factors determine AI progress

Amodei maintains his 2017 hypothesis that AI progress is determined by a short list of ingredients: raw compute, data quantity and quality, training duration, scalable objective functions, and numerical stability, with specific architectural cleverness contributing only minimally.

RL scaling mirrors pre-training patterns

Reinforcement learning is exhibiting the same log-linear scaling laws as pre-training, with performance on math contests and coding improving predictably with increased training time.
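A "log-linear" scaling law of this kind is a power law, loss = a · compute^(-b), which plots as a straight line in log-log space. The sketch below is purely illustrative, with synthetic numbers chosen by hand (a = 4.0, b = 0.1 are assumptions, not figures from the episode); it shows how such an exponent can be recovered by ordinary least squares on the logs.

```python
import math

# Synthetic (compute, loss) pairs following loss = 4.0 * compute**-0.1,
# a power law that is linear in log-log space ("log-linear" scaling).
data = [(10**k, 4.0 * (10**k) ** -0.1) for k in range(3, 9)]

# Ordinary least squares on (log C, log L) recovers the exponent and prefactor.
xs = [math.log(c) for c, _ in data]
ys = [math.log(l) for _, l in data]
n = len(data)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
    (x - mx) ** 2 for x in xs
)
intercept = my - slope * mx

exponent = -slope            # recovered scaling exponent b
scale = math.exp(intercept)  # recovered prefactor a
```

The same fit applies whether the x-axis is pre-training compute or RL training time; the claim in the episode is that both produce this straight-line behavior.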

Pre-training sits between evolution and learning

Unlike human brains, which start with evolutionary priors, language models begin as blank slates; this makes pre-training more analogous to evolution, while in-context learning resembles human on-the-spot adaptation.

⏱️ AGI Timelines & Capability Trajectories (3 insights)

90% confidence on 10-year 'country of geniuses'

Amodei places 90% probability on achieving a 'country of geniuses in a data center' by 2035, with only irreducible geopolitical risks, such as an invasion of Taiwan, standing in the way.

Expert coding arrives in 1-2 years

For verifiable tasks such as end-to-end software engineering, Amodei predicts near-complete automation within one to two years, setting aside only irreducible-uncertainty scenarios.

The verification gap remains the final hurdle

While models excel at verifiable tasks like math and coding, non-verifiable domains like scientific discovery or novel-writing represent the remaining frontier where generalization must still be proven.

🏭 Economic Impact & Labor Markets (3 insights)

Automation follows a spectrum, not a switch

Progress moves from 90% of code written by AI, to 100% of code, to 90% of end-to-end tasks, to 90% reduced demand for software engineers—a sequence that unfolds over time rather than instantly eliminating jobs.

Anthropic's revenue shows explosive demand

Anthropic's revenue has grown roughly 10x per year: from $0 to $100M in 2023, $100M to $1B in 2024, and $1B to $9-10B in 2025, with January 2026 alone adding several billion more.
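The year-over-year multiples implied by those figures can be checked directly. The numbers below are taken from the summary above ($9.5B is an assumed midpoint of the stated $9-10B range), and the 2023 $0-to-$100M step is treated as a launch year rather than a multiple:

```python
# Year-end revenue in dollars, per the summary above.
revenue = {2023: 100e6, 2024: 1e9, 2025: 9.5e9}  # 9.5e9 = midpoint of $9-10B

years = sorted(revenue)
# Growth multiple for each year relative to the previous year.
multiples = {y: revenue[y] / revenue[y - 1] for y in years[1:]}
# 2024 is 10.0x and 2025 is 9.5x: roughly 10x per year, as claimed.
```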

Economic diffusion debates miss the transition speed

While skeptics cite slow economic diffusion as a brake on AI impact, Amodei argues the transition will be fast, noting that recursive self-improvement and immediate productivity tools represent intermediate paths between 'no impact' and 'Dyson sphere' extremes.

Bottom Line

Organizations should prepare for expert-level AI coding capabilities within 24 months and transformative economic disruption within the decade: the exponential run-up in AI capabilities is nearly complete, and those capabilities will likely generalize beyond verifiable tasks.
