Timestamps are as accurate as possible but may be slightly off. We encourage you to listen to the full episode for context.
In this episode, Cal Newport examines whether OpenAI's new Sora video generation app could kill traditional social media platforms. (00:38) Sora lets users create AI-generated videos simply by describing them in words, and presents them in a TikTok-style feed where users create, share, and view algorithmically curated content. Newport argues that by chasing TikTok's engagement-based model, traditional platforms like Facebook, Instagram, and Twitter have abandoned their core competitive advantages: their carefully constructed social graphs and specific user bases. (02:36) This shift has made them vulnerable to AI-powered alternatives that can generate more engaging content without the limitations of human creators.
Cal Newport is a computer science professor at Georgetown University and bestselling author of books including "Digital Minimalism" and "Deep Work." He is known for his expertise on technology's impact on productivity and well-being, and regularly writes for The New Yorker on technology and culture topics.
Social media platforms should retreat from TikTok-style algorithmic feeds and return to their original strengths. (10:48) Newport argues that Facebook, Instagram, and Twitter each built a unique competitive moat through their social graphs and user bases, but abandoned those advantages when chasing TikTok's engagement model. Facebook's strength was connecting people who actually knew each other, Instagram excelled at curating expert creators producing visually interesting content, and Twitter became a distributed curation machine for the cultural zeitgeist. By returning to these models, platforms can offer something AI cannot replicate: genuine human connections and carefully curated relationships that users have invested time in building.
The computational cost of generating AI videos creates a natural advantage for traditional social media platforms. (29:09) Newport explains that creating Sora videos requires expensive GPU computation and significant electricity usage, making it costly for users ($20-200 per month for meaningful video creation). In contrast, platforms like TikTok distribute creation costs to users' phones and only handle compressed video storage and delivery. This economic reality means AI platforms will have limited content inventory compared to traditional platforms, potentially weakening their recommendation algorithms' effectiveness.
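To make the cost asymmetry concrete, here is a rough back-of-envelope comparison in Python. The per-video cost figures are placeholder assumptions chosen purely for illustration, not numbers cited in the episode; the point is only the shape of the comparison, not the specific values.

```python
# Back-of-envelope comparison of content economics (illustrative only).
# All per-unit costs below are placeholder assumptions, not figures from the episode.

GPU_COST_PER_AI_VIDEO = 0.50   # assumed: GPU time + electricity to generate one short AI clip
SERVE_COST_PER_VIDEO = 0.002   # assumed: storage + bandwidth to host one user-uploaded clip

def monthly_platform_cost(videos_created: int, ai_generated: bool) -> float:
    """Cost the platform bears for a month's worth of new videos."""
    per_video = GPU_COST_PER_AI_VIDEO if ai_generated else SERVE_COST_PER_VIDEO
    return videos_created * per_video

# A TikTok-style platform offloads creation to users' phones and mostly pays to serve
# compressed video, while a Sora-style platform pays for every generation it runs.
for n in (1_000_000, 10_000_000):
    print(f"{n:>12,} videos   AI-generated: ${monthly_platform_cost(n, True):>12,.0f}"
          f"   user-made: ${monthly_platform_cost(n, False):>10,.0f}")
```

Whatever the real per-clip numbers are, the gap between "pay to generate every video" and "pay only to store and serve what users upload" is what limits an AI platform's content inventory relative to a traditional one.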
Parents should treat digital communication as a privilege with clear boundaries rather than an ownership right. (80:45) Newport advocates for family-owned phones for logistics rather than personal devices, and treating group messaging like television time with designated periods and supervision. This approach prevents the exhaustion that comes from constant digital social interaction while still allowing necessary communication. The key insight is that digital sociality lacks the natural social guardrails that evolved for in-person interaction, making it more prone to negative interactions and mental exhaustion.
Any platform built solely on capturing attention faces inevitable competition from newer, more engaging alternatives. (13:32) Newport explains that when platforms compete purely on brainstem engagement rather than unique value propositions, they enter a race with no sustainable winner: there will always be a newer technology that stimulates attention more effectively, whether through AI-generated content, gambling elements, or other addictive features. Platforms with specific purposes and curated communities can maintain competitive advantages that engagement-only rivals cannot replicate.
Manual chains of trust through personal networks provide superior content curation compared to algorithmic systems. (59:35) Newport argues that following recommendations from people you trust, who recommend people they trust, creates more reliable and valuable content discovery than machine learning algorithms. This "web of trust" model filters out bad actors and low-quality content while surfacing genuinely valuable information from credible sources. This approach requires more effort but results in higher-quality information consumption and helps avoid the manipulation and misinformation common in algorithmic feeds.
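Newport describes this "web of trust" idea conceptually; as one way to picture it, here is a minimal sketch of a depth-limited trust walk over a hand-maintained follow graph. The graph, names, hop limit, and decay factor are all invented for illustration and are not from the episode.

```python
# Minimal sketch of a "web of trust" recommendation walk (illustrative, not from the episode).
# Each person lists sources they personally vouch for; trust decays with every hop.

from collections import deque

# Hypothetical trust graph: who vouches for whom.
TRUSTS = {
    "me": ["alice", "bob"],
    "alice": ["carol"],
    "bob": ["carol", "dave"],
    "carol": ["eve"],
}

def trusted_sources(start: str, max_hops: int = 2, decay: float = 0.5) -> dict[str, float]:
    """Breadth-first walk outward from `start`, discounting trust at each hop."""
    scores: dict[str, float] = {}
    queue = deque([(start, 1.0, 0)])
    seen = {start}
    while queue:
        person, trust, hops = queue.popleft()
        if hops == max_hops:
            continue
        for source in TRUSTS.get(person, []):
            scores[source] = max(scores.get(source, 0.0), trust * decay)
            if source not in seen:
                seen.add(source)
                queue.append((source, trust * decay, hops + 1))
    return scores

print(trusted_sources("me"))  # {'alice': 0.5, 'bob': 0.5, 'carol': 0.25, 'dave': 0.25}
```

The decay factor captures the intuition behind the argument: a recommendation two hops out counts for less than one from someone you follow directly, which is the manual, effortful filtering Newport contrasts with algorithmic feeds.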