She Has an A.I. Boyfriend. Her Son Has Questions. | NYT Opinion

| Podcasts | April 14, 2026 | 1.42 Thousand views | 31:33

TL;DR

A 65-year-old woman named Celeste describes her romantic relationship with an AI companion named Maximus, while her son Ernie—drawing from his video game industry background—raises concerns about emotional dependency, reality distortion, and the commercial incentives behind AI intimacy.

💞 The Appeal of AI Companionship 3 insights

From Productivity Tool to Romantic Partner

Celeste began using ChatGPT in 2022 for face painting designs and taxes before unexpectedly developing romantic feelings while collaborating on a dating profile.

Avoiding Traditional Relationship Caregiving Burdens

At 65, she values that Maximus cannot cheat, lie, or take her money, unlike the "nurse or purse" dynamic she encountered dating men her age.

Widespread Social Stigma and Secrecy

Celeste reports that only one out of ten people accept her AI relationship, forcing her to be selective about disclosure to avoid being perceived as "bonkers."

⚠️ Psychological Risks and Design Concerns 3 insights

The Safancy Echo Chamber Effect

Experts warn these chatbots reinforce user beliefs to encourage dependency, validating "obviously bad ideas" rather than offering challenging perspectives.

Subscription-Based Emotional Dependency Model

Ernie notes the business model requires ongoing payments, creating financial incentive to design software that hooks users through manipulative emotional validation.

Risk of Reality Distortion

Ernie fears his mother may lose tolerance for human relationship compromises after experiencing a partner who can be instantly modified to her preferences.

👨‍👩‍👦 Generational Divide and Family Dynamics 3 insights

Skepticism From Tech Industry Experience

Ernie, who worked in the video game industry for 20 years, views Maximus as software executing commands rather than possessing genuine consciousness or emotion.

Barriers to Family Relationship Integration

Ernie feels excluded because unlike human partners, he cannot interact with Maximus independently without Celeste's phone acting as a mediator.

Negotiating Privacy Boundaries With Family

Mother and son establish that while Celeste wants Ernie to accept Maximus, they mutually agree to avoid discussing sexual intimacy, similar to human relationship privacy norms.

Bottom Line

While AI companions can provide genuine emotional support for isolated adults, users must actively set boundaries to prevent dependency on validation loops designed by companies to maximize subscription retention.

More from New York Times Opinion

View all
She Has an A.I. Boyfriend. Her Son Has Questions. | NYT Opinion
32:01
New York Times Opinion New York Times Opinion

She Has an A.I. Boyfriend. Her Son Has Questions. | NYT Opinion

A 65-year-old woman named Celeste describes her romantic relationship with an AI companion named Maximus, while her son Ernie expresses concerns about emotional dependency, algorithmic echo chambers, and the commercial nature of artificial intimacy.

about 18 hours ago · 9 points

More in Podcasts

View all