PodMine
"The Cognitive Revolution" | AI Builders, Researchers, and Live Player Analysis
"The Cognitive Revolution" | AI Builders, Researchers, and Live Player Analysis•December 19, 2025

AI 2025 → 2026 Live Show | Part 2

A live podcast episode featuring conversations with Alex Bores, Dean Ball, and Peter Wildeford exploring AI developments, policy challenges, and forecasts for 2026, covering topics like the RAISE Act, chip sales to China, AI agent capabilities, and potential technological paradigm shifts.
Topics: Creator Economy, Startup Founders, AI & Machine Learning, Tech Policy & Ethics, Developer Culture
People mentioned: Greg Brockman, Bernie Sanders, Josh Hawley

Summary Sections

  • Podcast Summary
  • Speakers
  • Key Takeaways
  • Statistics & Facts
  • Compelling Stories (Premium)
  • Thought-Provoking Quotes (Premium)
  • Strategies & Frameworks (Premium)
  • Similar Strategies (Plus)
  • Additional Context (Premium)
  • Key Takeaways Table (Plus)
  • Critical Analysis (Plus)
  • Books & Articles Mentioned (Plus)
  • Products, Tools & Software Mentioned (Plus)

Timestamps are as accurate as possible but may be slightly off. We encourage you to listen to the episode for full context.


Podcast Summary

This live episode from The Cognitive Revolution features rapid-fire conversations with nine experts analyzing AI developments in 2025 and forecasting what might define 2026. (00:31) The show includes discussions with New York Assemblymember Alex Bores on the RAISE Act and AI safety legislation, former White House AI adviser Dean Ball on emerging political coalitions around AI policy, and forecaster Peter Wildeford on chip bans, agent capabilities, and robotics predictions.

  • Main themes: The episode explores the intersection of AI policy, safety legislation, political dynamics, and technological forecasting, focusing on how regulatory frameworks are evolving alongside rapidly advancing AI capabilities.

Speakers

Alex Bores

Alex Bores is a New York State Assemblymember and computer engineer by training who worked at Palantir for four years. He is the author of the RAISE Act, which focuses on AI safety standards and is currently under negotiation with the New York governor through the chapter amendments process.

Dean Ball

Dean Ball served as a policy adviser on artificial intelligence at the White House Office of Science and Technology Policy before leaving government to focus on writing and analysis. He has been actively commenting on AI policy developments and political coalitions forming around AI regulation.

Peter Wildeford

Peter Wildeford is a policy strategist at the Institute for AI Policy and Strategy and serves on the board of Metaculus. He is recognized as a top 20 globally ranked forecaster and has been actively analyzing AI policy implications, particularly around chip export controls to China.

Key Takeaways

AI Safety Legislation Faces Industry Pushback Despite Moderate Approach

Alex Bores's RAISE Act, which echoes many AI companies' own preparedness frameworks, has come under attack from a $100 million super PAC backed by Andreessen Horowitz and Greg Brockman. (12:32) The legislation requires safety plans for models that could cause catastrophic harm (100+ deaths or $1 billion in damage via CBRN weapons or fully automated crimes), yet even these extreme thresholds have triggered coordinated opposition. Bores notes the irony that he holds a master's degree in computer science and has Palantir experience, yet is still targeted by tech industry opposition. This demonstrates that even technically informed, moderate safety approaches face significant industry resistance, suggesting that any meaningful regulation will require sustained political will regardless of how reasonable the proposals appear.

Chip Export Controls to China May Be Self-Defeating

Peter Wildeford argues that selling chips to China rather than implementing strict export controls is counterproductive because China doesn't operate like a capitalist market. (50:23) He explains that the Chinese government ensures unlimited demand for Huawei chips through policy requirements, while NVIDIA fills the remaining 96% of demand that Huawei can't supply. When Huawei's capacity increases, the government will push out NVIDIA in favor of domestic alternatives, as seen with Tesla being displaced by BYD and Apple losing market share to Huawei. Wildeford advocates for a "rent, don't sell" approach where China can access American AI capabilities through controlled cloud services rather than owning the underlying hardware, allowing economic benefits while maintaining strategic control.

AI Political Coalitions Are Still Forming

Dean Ball identifies that AI political factions remain in flux, distinguishing between traditional AI safety concerns and emerging anti-AI sentiment. (26:17) He notes there are pro-AI industry groups, traditional AI safety advocates like Bores, and a growing anti-AI coalition spanning from Bernie Sanders calling for data center bans to right-wing concerns about corporate power. The critical question is whether middle-ground voters concerned about child safety and consumer protection will align with safety-focused regulation or with broader anti-AI sentiment. Ball emphasizes that this differs from refighting social media's past battles, as many still view AI through a narrow consumer-technology lens rather than recognizing its broader transformational potential.

2026 Could Be the True "Year of the Agent"

Peter Wildeford predicts that 2026 will finally deliver on autonomous AI agents after 2025's disappointments in this area. (83:31) He points to METR's evaluations showing that AI currently achieves 50% reliability on tasks that take humans about two hours, and he expects this to scale to day-long autonomous work by the end of 2026. Combined with improving computer-use capabilities, this could enable AI systems to reliably handle complex workflows while humans sleep or focus elsewhere. Even with 80% failure rates, the economic value would be significant due to cost and availability advantages. This progression could trigger the "ChatGPT moment" for robotics and autonomous systems, fundamentally changing how people perceive AI's practical utility and economic impact.
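As a rough illustration of that trajectory (a minimal sketch; the 7-month doubling time and the 8-hour "day-long" target are assumptions in line with METR's published trend, not figures quoted in the episode), the following projects how long a 2-hour, 50%-reliability task horizon would take to reach a full working day:

```python
# Back-of-the-envelope projection of the 50%-reliability task horizon.
# Assumptions (illustrative, not from the episode): a 2-hour starting
# horizon, a ~7-month doubling time in line with the trend METR has
# reported, and "day-long" treated as an 8-hour working day.

START_HORIZON_HOURS = 2.0
DOUBLING_TIME_MONTHS = 7.0
TARGET_HOURS = 8.0  # roughly one working day of autonomous work

horizon, months = START_HORIZON_HOURS, 0.0
while horizon < TARGET_HOURS:
    horizon *= 2
    months += DOUBLING_TIME_MONTHS

print(f"~{months:.0f} months to reach an {horizon:.0f}-hour horizon")
```

Two doublings, roughly 14 months from late 2025, lands near the end of 2026, which matches the shape of Wildeford's prediction if the trend holds.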

Model Efficiency Gains May Sustain Data Center Economics

The hosts observe that improving model efficiency could justify massive data center investments by enabling older chips to run increasingly capable models over time. (99:09) As an example, a software engineering task requiring a Blackwell chip today might run on an H100 chip in twelve months while delivering the same economic value. This "efficiency dividend" means data centers won't become obsolete as technology advances, but rather will host progressively more capable AI at lower costs. Combined with fallback applications like personalized advertising that guarantee revenue streams, this could sustain the economics of continued massive AI infrastructure investment even if superintelligence timelines extend longer than expected.
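A minimal sketch of that "efficiency dividend" arithmetic, using purely hypothetical numbers (the ~3x-per-year efficiency gain below is an assumption for illustration, not a figure from the episode): if the compute needed per task keeps falling, already-installed hardware serves more work each year rather than depreciating toward zero.

```python
# Illustrative only: assumes compute-per-task falls ~3x per year, a
# hypothetical rate chosen to show the shape of the argument.

EFFICIENCY_GAIN_PER_YEAR = 3.0    # assumed yearly reduction in compute per task
TASKS_PER_DAY_AT_INSTALL = 1_000  # workload one chip can serve when installed

for year in range(4):
    tasks = TASKS_PER_DAY_AT_INSTALL * EFFICIENCY_GAIN_PER_YEAR ** year
    print(f"Year {year}: ~{tasks:,.0f} tasks/day on the same hardware")
```

Under these assumptions, the same accelerator serves 3x, 9x, and 27x its launch-day workload in years one through three, which is the sense in which the hosts argue data centers keep paying for themselves even as newer chips ship.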

Statistics & Facts

  1. A super PAC called Leading the Future was formed with $100 million in funding from Andreessen Horowitz, Greg Brockman, Joe Lonsdale, and Ron Conway to target politicians supporting AI regulation. (12:32)
  2. Huawei can currently supply only 1-4% of Chinese chip demand, with the remaining 96-99% potentially filled by companies like NVIDIA. (50:23)
  3. AI models showed a 390x improvement on the ARC-AGI leaderboard in just one year, demonstrating rapid progress in both capability and cost efficiency. (63:03)

Compelling Stories

Available with a Premium subscription

Thought-Provoking Quotes

Available with a Premium subscription

Strategies & Frameworks

Available with a Premium subscription

Similar Strategies

Available with a Plus subscription

Additional Context

Available with a Premium subscription

Key Takeaways Table

Available with a Plus subscription

Critical Analysis

Available with a Plus subscription

Books & Articles Mentioned

Available with a Plus subscription

Products, Tools & Software Mentioned

Available with a Plus subscription

More episodes like this

  • In Good Company with Nicolai Tangen (January 14, 2026): Figma CEO: From Idea to IPO, Design at Scale and AI’s Impact on Creativity
  • Uncensored CMO (January 14, 2026): Rory Sutherland on why luck beats logic in marketing
  • We Study Billionaires - The Investor’s Podcast Network (January 14, 2026): BTC257: Bitcoin Mastermind Q1 2026 w/ Jeff Ross, Joe Carlasare, and American HODL (Bitcoin Podcast)
  • This Week in Startups (January 13, 2026): How to Make Billions from Exposing Fraud | E2234