The Twenty Minute VC (20VC): Venture Capital | Startup Funding | The Pitch • September 29, 2025

20VC: OpenAI and Anthropic Will Build Their Own Chips | NVIDIA Will Be Worth $10TRN | How to Solve the Energy Required for AI... Nuclear | Why China is Behind the US in the Race for AGI with Jonathan Ross, Groq Founder

An in-depth exploration of the AI compute landscape, highlighting the critical role of energy, chip development, and the transformative potential of AI across industries, with insights from Jonathan Ross, founder of Groq.
AI & Machine Learning
Tech Policy & Ethics
Developer Culture
Hardware & Gadgets
Elon Musk
Sam Altman
Jonathan Ross
OpenAI

Summary Sections

  • Podcast Summary
  • Speakers
  • Key Takeaways
  • Statistics & Facts
  • Compelling Stories (Premium)
  • Thought-Provoking Quotes (Premium)
  • Strategies & Frameworks (Premium)
  • Similar Strategies (Plus)
  • Additional Context (Premium)
  • Key Takeaways Table (Plus)
  • Critical Analysis (Plus)
  • Books & Articles Mentioned (Plus)
  • Products, Tools & Software Mentioned (Plus)

Timestamps are as accurate as they can be but may be slightly off. We encourage you to listen to the full context.


Podcast Summary

Jonathan Ross, founder and CEO of Groq, returns to 20VC for a comprehensive discussion about the current state of AI infrastructure and compute. (00:21) Ross, who previously led the TPU team at Google, explains that while many question whether there's an AI bubble, the real indicator is what smart money is doing: major tech companies continue doubling down on AI spending. (05:26) He argues that demand for compute is insatiable, with companies like OpenAI and Anthropic able to nearly double their revenue if given twice their current inference compute capacity. (12:26) The conversation covers everything from NVIDIA's market position and energy requirements to the geopolitical implications of AI compute control.

  • Main themes include compute scarcity as the primary bottleneck in AI development, the critical importance of speed in AI applications, and the geopolitical race for AI supremacy through energy and compute infrastructure.

Speakers

Jonathan Ross

Jonathan Ross is the founder and CEO of Groq, an AI chip company focused on inference at scale, which has raised over $3 billion with a recent valuation approaching $7 billion. Before founding Groq, Ross led the team that built Google's TPU (Tensor Processing Unit), making him one of the key architects of modern AI hardware infrastructure.

Key Takeaways

Compute Scarcity Is the Real Bottleneck

The most striking insight from Ross is that compute availability, not performance, has become the primary value proposition for AI infrastructure companies. (23:17) Ross describes customers requesting 5x their total capacity, which no one in the industry can fulfill. This scarcity means that if major AI companies like OpenAI or Anthropic doubled their inference compute, their revenue would almost double within a month, since rate limits are currently the binding constraint. The practical implication is that businesses should prioritize securing compute capacity over optimizing for marginal performance gains.

Speed Creates Exponential Value in AI Applications

Ross draws a compelling parallel between AI responsiveness and consumer products, noting that high-margin consumer goods correlate with speed of action: tobacco acts fastest, followed by soft drinks. (13:25) This principle applies directly to AI applications, where every 100-millisecond improvement in response time yields roughly an 8% increase in conversion rates. The lesson for professionals is that investing in faster AI infrastructure isn't just about user experience; it creates measurable business value through improved engagement and brand affinity.

Geographic Energy Access Will Determine AI Leadership

Countries that control compute will control AI, and compute requires energy infrastructure. (36:41) Ross argues that Europe could compete effectively if it leveraged resources like Norway's wind capacity, which could theoretically provide as much energy as the entire United States. (34:01) For business leaders, this means considering geographic location as a strategic advantage, particularly for companies requiring significant compute resources. The message is clear: locate operations where energy is abundant and cheap.

The AI Economy Will Create Labor Shortages, Not Unemployment

Contrary to popular fears about AI-driven unemployment, Ross predicts massive labor shortages due to three factors: deflationary pressure making life cheaper, people opting to work less due to lower costs, and entirely new job categories emerging. (44:02) He draws parallels to how 98% of the workforce moved from agriculture to other sectors over the past century. For professionals, this suggests focusing on skills that complement AI rather than compete with it, and preparing for a world where human creativity and strategic thinking become even more valuable.

Building Your Own Infrastructure Creates Control, Not Just Savings

When Ross explains why companies like OpenAI will build their own chips, the primary benefit isn't cost savings - it's control over destiny. (17:05) Custom infrastructure prevents suppliers from dictating allocation and ensures capacity when needed. Ross shares how Google once built 10,000 AMD servers just to get better Intel pricing, demonstrating that the real value lies in negotiating power and supply chain control. For businesses, this principle applies beyond chips: owning critical infrastructure components provides strategic flexibility that often outweighs pure cost considerations.

Statistics & Facts

  1. 10% of the world's population uses ChatGPT as weekly active users, demonstrating unprecedented adoption speed for an AI application. (40:37)
  2. 35-36 companies are responsible for 99% of all AI token spending and revenue globally, showing extreme concentration in the current AI market. (06:15)
  3. Only 14% of chips work correctly on first fabrication (A0 silicon), meaning roughly 86% of new designs require an expensive re-spin and making chip development extremely risky. (72:24)

