Stanford's Code in Place Info Session with Mehran Sahami

| Podcasts | May 04, 2026 | 55:37

TL;DR

Stanford professors Mehran Sahami and Chris Piech present Code in Place, a free 6-week global Python program achieving 50-60% completion rates—over 10x higher than typical online courses—by pairing thousands of volunteer section leaders with small student cohorts for personalized, human-centric instruction.

🌍 Origins & Global Mission

Pandemic-born educational initiative

Created in 2020 during COVID-19 to combat loneliness and education inequality by transforming 'shelter in place' into 'code in place,' offering accessible programming education during global lockdowns.

Massive worldwide scale

To date, the program has taught over 60,000 students from 150+ countries with the help of more than 5,500 volunteer section leaders, serving diverse demographics across all age groups.

Completely free access

The program removes financial barriers to elite computer science education, offering half to two-thirds of Stanford's introductory CS106A curriculum at no cost to students globally.

👥 The Section Leader Model

Human connection drives completion

Small cohorts of approximately 10 students meet weekly with trained section leaders; this personalized accountability yields 50-60% completion rates, compared to the roughly 5% typical of traditional MOOCs.

Teaching as learning

The program operates on the philosophy that human teaching potential is underutilized and that teaching reinforces learning, creating a joyful, reciprocal educational experience for volunteers.

Structured volunteer training

Section leaders receive comprehensive support—a welcome meeting, two live practice sessions, and ongoing guidance—before leading their six weekly sections over the course of the program.

💻 Curriculum & Learning Experience

Industry-standard Python instruction

Students learn Python—the world's most popular programming language for AI and data science—through video lectures mirroring actual Stanford classes and a browser-based coding environment requiring no software installation.

Zero prerequisites required

The only requirements are basic computer literacy (using a mouse and web browser); the course assumes no prior programming knowledge and teaches coding from the ground up.

Multi-layered support system

Beyond weekly 50-minute live sections, students access online readings, worked programming examples, and vibrant discussion forums staffed by section leaders, head TAs, and professors.
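For context on the course's starting level: the program teaches coding from the ground up, and a first-week exercise in an introductory Python course often looks something like the sketch below. This is a hypothetical illustration of beginner-level material, not an actual Code in Place assignment.

```python
# Hypothetical beginner-level exercise: build a list with a loop,
# then add its values with a running total. (Illustrative only —
# not drawn from the actual Code in Place curriculum.)

def countdown(start):
    """Return the numbers from start down to 1 as a list."""
    result = []
    for n in range(start, 0, -1):  # step backwards: start, start-1, ..., 1
        result.append(n)
    return result

def total(numbers):
    """Add up a list of numbers using a running total."""
    running = 0
    for n in numbers:
        running += n
    return running

if __name__ == "__main__":
    nums = countdown(5)
    print(nums)         # [5, 4, 3, 2, 1]
    print(total(nums))  # 15
```

Exercises at this level focus on loops, lists, and functions—the building blocks students practice in their weekly sections before moving on to larger projects.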

Bottom Line

Join Code in Place as a complete beginner to learn Python through human connection rather than isolated video watching, or as a section leader to reinforce your own coding skills while helping others. Either way, you'll participate in a scalable educational model proving that personalized teaching transforms online learning outcomes.
