What comes after smartphones, with Evan Spiegel of Snap

Podcasts | April 27, 2026 | 8.26K views | 1:03:52

TL;DR

Evan Spiegel describes 2026 as Snap's 'crucible moment': the company is approaching one billion users and profitability while launching Specs, AR glasses twelve years in the making that aim to shift computing from isolating screens to shared, heads-up spatial experiences.

🚀 The Crucible Moment

Billion-user scale approaching profitability

Snap is nearing one billion monthly active users and net income profitability simultaneously while managing a complete business transformation driven by artificial intelligence.

Consumer Specs launch after 12-year R&D cycle

Having developed AR glasses since 2014, Snap will release Specs to consumers later this year, positioning the device between low-capability smart glasses and high-end VR headsets like Apple's Vision Pro.

AI-driven software development

More than two-thirds of new code at Snap is now written by AI, dramatically accelerating development across the organization while the company leverages network effects to insulate itself from AI disruption.

⚙️ Engineering & Technical Architecture

Custom Linux OS for thermal efficiency

Specs run on a ground-up Linux operating system rather than Android, which Snap found too bloated for glasses, enabling better performance and power management without external compute packs.
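For a sense of what on-device power management can look like at the OS level, here is a minimal sketch of a thermal-throttling loop on embedded Linux, reading the standard sysfs thermal-zone interface and applying hysteresis to a render frame-rate cap. The zone path, thresholds, and frame-rate values are illustrative assumptions, not details from the episode or Snap's actual system.

```python
# Illustrative only: a minimal thermal-throttling loop of the kind an embedded
# Linux power manager might run. The zone path, thresholds, and frame-rate caps
# below are hypothetical, not Snap's implementation.
import time

THERMAL_ZONE = "/sys/class/thermal/thermal_zone0/temp"  # standard Linux sysfs path
HOT_C, COOL_C = 45.0, 38.0          # hypothetical throttle/recover thresholds (°C)
FULL_FPS, THROTTLED_FPS = 60, 30    # hypothetical render caps

def read_temp_celsius(path: str = THERMAL_ZONE) -> float:
    """Thermal zones report millidegrees Celsius as plain text."""
    with open(path) as f:
        return int(f.read().strip()) / 1000.0

def run(set_fps_cap) -> None:
    """Poll the sensor and hand a frame-rate cap to the renderer via a callback."""
    throttled = False
    while True:
        temp = read_temp_celsius()
        if not throttled and temp >= HOT_C:
            throttled = True
            set_fps_cap(THROTTLED_FPS)
        elif throttled and temp <= COOL_C:   # hysteresis avoids rapid oscillation
            throttled = False
            set_fps_cap(FULL_FPS)
        time.sleep(1.0)

if __name__ == "__main__":
    run(lambda fps: print(f"fps cap -> {fps}"))
```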

Full vertical stack ownership

Snap controls every layer from Lens Studio developer tools and the Lens Core rendering engine to custom waveguide optics and micro-projectors, creating defensible IP and optimized integration.

Miniaturization breakthroughs

The device packs true spatial computing capabilities into a lightweight, wearable glasses form factor, solving miniaturization challenges that have historically required tethered hardware or backpack computers.

👓 Human-Centric Computing Philosophy

From isolation to shared presence

Unlike smartphones that pull users into private screens, see-through glasses enable communal experiences—allowing friends to play chess, watch media, or build things together while maintaining real-world eye contact.

AI agent monitoring interface

The glasses serve as an ideal platform for overseeing autonomous AI agents via heads-up displays while remaining physically present, effectively 'AGI-proofing' human experiences by keeping users in the world while AI works remotely.

Sensory alignment with human biology

Positioning cameras and microphones at eye and ear level provides computers with human-aligned sensory input to better understand spatial contexts and facilitate genuine social interaction.

🎯 Market Strategy & Positioning

Early adopter pricing strategy

The initial launch targets technology enthusiasts willing to pay a premium, analogous to the 1984 Macintosh, whose $2,495 launch price works out to roughly $8,000 in today's dollars; broader consumer diffusion is planned as manufacturing scales.
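A quick sanity check on that Macintosh comparison, using approximate CPI figures (these are rough annual averages, not numbers from the episode):

```python
# Rough inflation check for the Macintosh price comparison (illustrative;
# CPI values are approximate, not figures from the episode).
CPI_1984 = 103.9       # approximate U.S. CPI-U annual average, 1984
CPI_2025 = 320.0       # approximate U.S. CPI-U level, mid-2020s
mac_1984_price = 2495  # launch price of the original Macintosh, USD

today = mac_1984_price * CPI_2025 / CPI_1984
print(f"~${today:,.0f} in today's dollars")  # ≈ $7,700, close to the ~$8,000 cited
```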

Net-new experiences over screen replacement

Rather than merely replacing TVs or laptops, Specs aim to enable previously impossible activities, such as backyard laser tag or collaborative 3D building, that leverage the unique affordances of persistent spatial computing.

Deliberate rejection of camera-glasses niche

Snap deliberately avoids the 'GoPro competitor' market: point-of-view video isn't ten times better than what a smartphone already captures and offers no platform potential, whereas true AR glasses create a developer ecosystem through an operating system and world understanding.

Bottom Line

The next computing paradigm won't arrive by replacing smartphones outright; lightweight AR glasses will instead enable shared, heads-up spatial experiences that keep users socially present in the physical world while seamlessly integrating AI agents into daily life.

More from Stripe

The world of voice AI, with Mati Staniszewski of ElevenLabs | 1:00:33 | 15 days ago

ElevenLabs CEO Mati Staniszewski explains how modern voice AI models predict phonemes with contextual awareness rather than using hard-coded parameters, enabling emergent properties like accents and emotions, while discussing the company's platform strategy and the deployment gap between capable models and consumer applications.
The history and future of AI at Google, with Sundar Pichai | 1:09:33 | 22 days ago

Sundar Pichai argues that Google's invention of Transformers and early work on LaMDA positioned it for the AI era, emphasizing that vertical integration, from TPUs to strict latency budgets, enables the company to treat AI as an expansionary force driving search toward agentic workflows rather than a zero-sum threat.