Big Tech's Big Tobacco Moment

Offline with Jon Favreau · April 4, 2026 · 58:56
tech-regulation · social-media · artificial-intelligence · child-safety · section-230
Golden Quote
If you're telling a platform what it can and can't recommend, does that start to feel like regulating speech? Is that like telling a newspaper or a TV news program which stories they can air and which they can't?

Dan Pfeiffer

0:42

Synopsis

Two landmark jury verdicts — one in Los Angeles finding Meta and YouTube negligent for harming a teenage girl's mental health, another in New Mexico finding Meta violated consumer protection law by exposing children to predators — have cracked the legal shield that protected Big Tech for 30 years. For the first time, plaintiffs successfully argued that social media platforms are *defective products*, not just neutral content hosts, bypassing Section 230 by targeting design features like infinite scroll, autoplay, and algorithmic push notifications. New Mexico AG Raul Torres, who won his case after an undercover sting instantly flooded a fake 13-year-old's profile with sexual predators, reveals his May hearing strategy: forcing Meta to implement real age verification and algorithm changes that could set a national blueprint. With 2,000 similar lawsuits now unblocked and a 1,600-plaintiff federal case launching this summer, any professional whose business touches tech, media, regulation, or child welfare needs to understand why this litigation wave may finally do what Congress hasn't.

Speakers

Casey Newton
Dan Pfeiffer
Jon Favreau
Raul Torres

Episode Breakdown

Jon Favreau introduces landmark legal verdicts against Meta and YouTube, detailing how juries found the companies negligent in designing addictive platforms harmful to children, citing internal research and specific cases.

Once a juror understands that a company has been researching this, and that the more they looked into it, the worse stuff they found, and then also that research gets canceled or the researchers get moved to other projects, it does start to feel like a big tobacco moment.

This quote draws a powerful, controversial analogy between social media companies and the tobacco industry, suggesting a deliberate cover-up of harm.

Casey Newton
1:39
Instagram is a drug. We're basically pushers. We are causing reward deficit disorder because people are binging on Instagram so much, they can't feel reward anymore.

This shocking internal admission from Meta researchers, quoted here, likens their product to an addictive substance and themselves to drug pushers, revealing a deep ethical conflict.

Jon Favreau
3:22
What matters is that Meta and the rest of the social media giants have now lost the legal shield that has protected them for 30 years. Because Kaylee didn't sue them over the content on their platforms. She sued them because their platforms are defective. Because the product's design isn't safe for all users, especially children.

This highlights a monumental legal shift, arguing that recent verdicts are moving responsibility from user-generated content to platform design, potentially forcing fundamental changes in the tech industry.

Jon Favreau
6:36
For the first time, these verdicts might finally force tech giants to do what no one else has been able to make them do: fix the design. Make it safer. Get rid of social media's most addictive, harmful features: infinite scroll, autoplay, push notifications, beauty filters, even algorithmic recommendations. This is all on the table now for these juries and judges.

This bold prediction outlines concrete, specific features of social media platforms, like infinite scroll and algorithmic recommendations, that are now vulnerable to legal challenge and potential forced removal.

Jon Favreau
6:58