
Deep Questions with Cal Newport • September 22, 2025

Ep. 371: Is it Finally Time to Leave Social Media?

Cal Newport argues that curated conversation platforms like Twitter and Facebook are inherently harmful, creating a "slope of terribleness" that pulls users into distraction, demoderation, and potentially disassociation; he ultimately suggests that people should quit these platforms entirely.
Topics: Creator Economy, Digital Nomad Life, Tech Policy & Ethics
People: Cal Newport, Mark Zuckerberg, Charlie Kirk, Derek Thompson, Yuval Harari

Summary Sections

  • Podcast Summary
  • Speakers
  • Key Takeaways
  • Statistics & Facts
  • Compelling Stories (Premium)
  • Thought-Provoking Quotes (Premium)
  • Strategies & Frameworks (Premium)
  • Similar Strategies (Plus)
  • Additional Context (Premium)
  • Key Takeaways Table (Plus)
  • Critical Analysis (Plus)
  • Books & Articles Mentioned (Plus)
  • Products, Tools & Software Mentioned (Plus)
Timestamps are as accurate as possible but may be slightly off. We encourage you to listen to the episode for full context.

Podcast Summary

In this deep dive, Cal Newport presents his "slope of terribleness" framework to explain how social media platforms systematically harm users through three connected stages: distraction, demoderation, and disassociation. (03:58) Newport argues that these harms aren't isolated issues but form an inevitable downward spiral: users start with mild addiction, progress to tribal thinking and loss of empathy, and can ultimately reach dangerous levels of nihilism or rage that may lead to real-world violence.

• The episode focuses on "curated conversation platforms" like Twitter, Facebook, and Bluesky, distinguishing them from social platforms that don't combine algorithmic curation with mass conversation features (03:13)

Speakers

Cal Newport

Cal Newport is a computer science professor at Georgetown University and bestselling author of books including "Deep Work," "Digital Minimalism," and "Slow Productivity." He has been writing about technology's impact on productivity and well-being for over two decades through his popular newsletter and is a leading voice in the movement toward more intentional technology use. Newport has been analyzing social media's societal effects since the early days of Facebook, often receiving criticism for his contrarian stance that has now gained widespread acceptance.

Key Takeaways

Social Media Harms Are Connected, Not Isolated

Newport's key insight is that distraction, demoderation, and disassociation aren't separate problems but stages on an inevitable "slope of terribleness" where each stage pulls users deeper into dysfunction. (10:10) Most people believe they suffer only from distraction and dismiss the other harms as affecting "other people," but gravity naturally pulls users down this slope. The farther down you slide, the more mental energy you must expend to prevent further descent, leaving less energy for meaningful activities and relationships.

Algorithmic Curation Makes the Slope Inevitable

The combination of algorithmic curation and mass conversation creates unavoidable psychological effects. (16:42) Algorithms optimizing for engagement naturally create echo chambers by learning your preferences and showing increasingly similar, more extreme content. Combined with conversation features, this fires up ancient tribal community circuits in our brains: the same mechanisms that helped humans survive for 300,000 years by creating intense loyalty to our group and suspicion of others. These platforms essentially return us to "paleolithic" tribal thinking in a digital context.

The "Algorithmic Turn" Changed Everything

Social media wasn't always toxic; the problems began around 2012-2015, when platforms shifted from chronological feeds to algorithmic curation. (1:03:31) Early social media was more like "a cool hang" where you could follow interesting people and see their content in order. The algorithmic turn happened when companies went public and needed to maximize user engagement to increase ad revenue. This shift from user growth to engagement optimization is what activated the slope of terribleness and transformed these platforms from useful tools into attention-harvesting machines.

Resistance Requires Constant Energy That Could Be Better Spent

Even if you successfully avoid sliding to the bottom of the slope, stopping your descent requires significant mental energy that depletes your capacity for more meaningful pursuits. (25:19) You may arrest your fall somewhere in the demoderation zone, but you're now living at a lower level of human flourishing while constantly expending willpower to resist further decline. This represents a fundamental trade-off: why accept diminished well-being and waste mental resources fighting an inevitable pull when you could redirect that energy toward people and activities you actually value?

There Are No Technological Fixes for Fundamental Design Problems

The slope of terribleness isn't a bug that can be fixed; it's the core feature of how these platforms operate. (23:17) You cannot have algorithmic curation plus mass conversation without creating echo chambers and tribal dynamics. Asking these companies to fix the problem is like asking Pizza Hut to exist without selling pizza: it's literally their business model. The platforms need massive amounts of engaging content to monetize, and the most engaging content typically involves tribal conflict and emotional manipulation. Any "solution" that eliminated these dynamics would eliminate the fundamental service these companies provide.

Statistics & Facts

  1. The FBI began using a new term in recent years for mass violence incidents: "nihilistic violent extremist," indicating how common violence driven by nihilism has become. (06:46) This represents a new category of violence distinct from traditional ideological extremism.
  2. Over 500 million tweets are posted daily on Twitter, necessitating algorithmic curation since no human system could meaningfully organize that volume of content for individual users. (23:35) This scale makes algorithmic filtering inevitable for these platforms.
  3. The "algorithmic turn" in social media occurred between 2012-2015, coinciding with when more than half of US adults began using smartphones. (63:32) This timing allowed platforms to shift from desktop-based occasional use to mobile-based constant engagement.

