Dwarkesh Patel

1.2 M subscribers

Deeply researched interviews

9 summaries available
The math behind how LLMs are trained and served – Reiner Pope
2:13:41

Reiner Pope explains the mathematical mechanics behind LLM inference costs, demonstrating how 'Fast Mode' APIs charge premiums for smaller batch sizes that reduce latency, and why physical memory bandwidth constraints create hard limits on how fast or cheap inference can get regardless of budget.

10 days ago · 9 points
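The inference-cost claim in the summary above can be made concrete with a back-of-envelope calculation. The sketch below is not from the episode; it uses assumed numbers (a 70B-parameter model in fp16, 3.35 TB/s of accelerator memory bandwidth) to show why memory bandwidth puts a hard floor on per-step decode latency, and why larger batches cut per-token cost without making any single request faster.

```python
# Back-of-envelope: memory-bandwidth floor on LLM decode latency.
# All numbers are illustrative assumptions, not figures from the episode.

PARAMS = 70e9           # parameter count (assumed)
BYTES_PER_PARAM = 2     # fp16
BANDWIDTH = 3.35e12     # memory bandwidth in bytes/s (assumed)

weight_bytes = PARAMS * BYTES_PER_PARAM

# Each decode step must stream the full weights from memory at least
# once, so one step can take no less than weight_bytes / BANDWIDTH.
step_latency_s = weight_bytes / BANDWIDTH

for batch in (1, 8, 64):
    # The weight read is amortized across the batch: per-token cost
    # falls roughly as 1/batch, but per-request latency does not improve.
    tokens_per_s = batch / step_latency_s
    print(f"batch={batch:3d}: step >= {step_latency_s * 1e3:.1f} ms, "
          f"throughput ~= {tokens_per_s:,.0f} tok/s")
```

Under these assumptions the latency floor is about 42 ms per token regardless of budget, which is the physical limit the summary refers to; a "fast mode" that runs small batches is paying for wasted amortization, not beating the bandwidth bound.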
Michael Nielsen – How science actually progresses
2:03:04

Michael Nielsen dismantles the pop-science narrative of linear scientific progress through crisp experiments, revealing instead a messy, decentralized process where mathematical formalism often precedes conceptual understanding, expertise can blind researchers to truth, and communities adopt paradigm shifts long before experimental closure.

about 1 month ago · 10 points
Terence Tao – Kepler, Newton, and the true nature of mathematical discovery
1:23:44

Mathematician Terence Tao compares Kepler's twenty-year process of testing random hypotheses against Tycho Brahe's dataset to modern AI capabilities, arguing that while artificial intelligence has eliminated the bottleneck of idea generation in science, it has simultaneously created an unprecedented crisis in verification and validation that current peer review systems cannot handle.

about 2 months ago · 8 points
Dylan Patel — The Single Biggest Bottleneck to Scaling AI Compute
2:31:04

Dylan Patel explains that Big Tech's $600B CapEx represents multi-year pre-purchases of power and data centers through 2029, while AI labs face an immediate crunch: Anthropic's conservative compute strategy forces it to pay massive spot-market premiums, in contrast to OpenAI's aggressive long-term contracting.

about 2 months ago · 9 points
Dario Amodei — The highest-stakes financial model in history
2:22:20

Dario Amodei argues that AI capabilities are progressing along the expected exponential curve and are nearing the end of that rapid growth phase, with models likely to achieve expert-level coding within 1-2 years and 'country of geniuses' level capabilities within 10 years, despite public distraction from this reality.

3 months ago · 9 points
Elon Musk – "In 36 months, the cheapest place to put AI will be space"
2:49:46

Elon Musk argues that terrestrial power constraints will make Earth-based AI data centers economically unviable at scale within 36 months, predicting that orbital data centers powered by space-based solar will become the cheapest solution due to unlimited energy availability, higher solar efficiency, and regulatory arbitrage, requiring massive investments in Starship launches and domestic chip manufacturing.

3 months ago · 10 points