The Next Wave - AI and The Future of Technology • November 12, 2025

The Hidden Industry Powering Every AI Company

A deep dive into the complex world of AI compute infrastructure, exploring how data centers, GPU clusters, and financial engineering are shaping the future of technological innovation and global AI competition.
Tags: Startup Founders · Venture Capital · AI & Machine Learning · Tech Policy & Ethics · Data Centers · Elon Musk · Sam Altman · Evan Conrad

Summary Sections

  • Podcast Summary
  • Speakers
  • Key Takeaways
  • Statistics & Facts
  • Compelling Stories (Premium)
  • Thought-Provoking Quotes (Premium)
  • Strategies & Frameworks (Premium)
  • Similar Strategies (Plus)
  • Additional Context (Premium)
  • Key Takeaways Table (Plus)
  • Critical Analysis (Plus)
  • Books & Articles Mentioned (Plus)
  • Products, Tools & Software Mentioned (Plus)

Timestamps are as accurate as possible but may be slightly off. We encourage you to listen to the full episode for complete context.


Podcast Summary

In this deep dive into AI infrastructure, Nathan Lands interviews Evan Conrad, CEO of SF Compute, about the massive physical build-out powering the AI revolution. (00:19) Conrad explains how the AI industry faces a potential credit risk bubble due to misaligned contract structures between GPU providers and customers. While GPU cloud providers need long-term contracts to secure financing, AI companies prefer short-term flexibility, creating a dangerous financial mismatch that could cascade through the entire ecosystem if venture funding tightens.

  • Main themes: AI compute infrastructure economics, credit risk in GPU financing, US-China competition in data centers, and the transformation of compute into a tradable commodity market

Speakers

Nathan Lands

Nathan Lands is the host of The Next Wave podcast and co-founder of Lore.com. He studied Mandarin in Taiwan and has connections with Chinese government officials, giving him unusual insight into US-China tech competition. Nathan is currently exploring opportunities in data center acquisition and operations.

Evan Conrad

Evan Conrad is the CEO of SF Compute, a company creating a spot market for AI compute that turns supercomputer capacity into a tradable commodity. He previously founded June Lark, an AI audio model company similar to Suno or Udio, before pivoting to the compute financing problem. Conrad has deep expertise in data center economics and GPU cluster management, positioning SF Compute as a critical infrastructure provider for the AI industry.
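The episode doesn't go into how such a spot market is implemented, so the following is a purely hypothetical sketch (all names and numbers invented, not SF Compute's design): a toy price-time-priority order book where buyers bid and sellers ask in dollars per GPU-hour, which is the sense in which compute becomes a tradable commodity.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Order:
    sort_key: float                      # price priority; negated for bids so the best bid pops first
    gpu_hours: int = field(compare=False)
    trader: str = field(compare=False)

class SpotMarket:
    """Toy spot market for GPU-hours (illustrative only)."""
    def __init__(self):
        self.bids: list[Order] = []      # max-heap via negated prices
        self.asks: list[Order] = []      # min-heap

    def place_bid(self, trader: str, price: float, gpu_hours: int):
        heapq.heappush(self.bids, Order(-price, gpu_hours, trader))
        self._match()

    def place_ask(self, trader: str, price: float, gpu_hours: int):
        heapq.heappush(self.asks, Order(price, gpu_hours, trader))
        self._match()

    def _match(self):
        # Cross the book while the best bid meets or beats the best ask.
        while self.bids and self.asks and -self.bids[0].sort_key >= self.asks[0].sort_key:
            bid, ask = self.bids[0], self.asks[0]
            qty = min(bid.gpu_hours, ask.gpu_hours)
            print(f"{bid.trader} buys {qty} GPU-h from {ask.trader} at ${ask.sort_key:.2f}/GPU-h")
            bid.gpu_hours -= qty
            ask.gpu_hours -= qty
            if bid.gpu_hours == 0:
                heapq.heappop(self.bids)
            if ask.gpu_hours == 0:
                heapq.heappop(self.asks)

market = SpotMarket()
market.place_ask("cluster_A", price=2.10, gpu_hours=500)  # a provider lists spare capacity
market.place_bid("ai_lab_B", price=2.25, gpu_hours=300)   # a lab's bid crosses and fills at $2.10
```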

Key Takeaways

AI Infrastructure Operates on Credit Risk, Not Tech Innovation

The AI compute industry faces a fundamental mismatch between financing needs and customer demands. (02:42) GPU cloud providers need long-term contracts (often 3+ years) to secure financing for expensive clusters, but AI customers want short-term flexibility to avoid getting locked into potentially obsolete technology. This creates a credit risk bubble where venture-backed startups with thin margins are essentially backing the entire compute infrastructure through their ability to raise capital. When venture funding tightens, this house of cards could collapse, taking down inference providers, GPU clusters, and their debt providers in sequence.
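To make the mismatch concrete, here is a hedged back-of-the-envelope model with entirely invented numbers: a provider owes a fixed monthly debt payment on its cluster, while its revenue comes from short-term customers whose renewals track the venture funding climate. The structure, not the figures, is the point.

```python
# Hypothetical illustration of the duration mismatch described above.
monthly_debt_service = 8.0      # $M/month, fixed for the full loan term
monthly_revenue_full = 10.0     # $M/month if every short-term contract renews
renewal_rate_by_regime = {"easy_funding": 0.98, "tight_funding": 0.80}

def months_until_default(regime: str, cash_buffer: float = 20.0) -> int:
    """Count months until the provider can no longer cover its fixed debt payment."""
    renewal = renewal_rate_by_regime[regime]
    revenue, cash, month = monthly_revenue_full, cash_buffer, 0
    while cash >= 0:
        month += 1
        revenue *= renewal                       # short-term customers churn off
        cash += revenue - monthly_debt_service   # the debt payment never shrinks
    return month

for regime in renewal_rate_by_regime:
    print(f"{regime}: default after ~{months_until_default(regime)} months")
```

Even a modest drop in renewals flips the provider from comfortably solvent to defaulting within months, which is the cascade the episode warns about.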

Compute Margins Are Fundamentally Different From Traditional Cloud

Unlike traditional CPU-based cloud services where providers enjoy 60-70% margins, GPU compute operates on razor-thin margins of around 20%. (08:57) This happens because AI companies are extremely price-sensitive and care deeply about which specific GPUs they're using, unlike traditional software companies that don't even think about CPU specifications. The thin margins mean GPU providers have little buffer for demand fluctuations, requiring much more careful financial engineering and longer-term contracts to remain viable.
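A quick way to see why 20% margins leave so little buffer: if costs are roughly fixed, break-even utilization is about one minus the gross margin. A minimal sketch using the margins quoted in the episode (the fixed-cost assumption is mine):

```python
def breakeven_utilization(gross_margin: float) -> float:
    """Fraction of capacity that must stay sold before the provider loses money,
    assuming costs are fixed and revenue scales with utilization."""
    return 1.0 - gross_margin

for label, margin in [("traditional CPU cloud", 0.65), ("GPU cloud", 0.20)]:
    print(f"{label}: {margin:.0%} margin -> break-even at {breakeven_utilization(margin):.0%} utilization")
# A CPU cloud can lose roughly a third of its demand and still break even;
# a GPU cloud goes underwater after about a 20% shortfall.
```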

Physical Infrastructure Will Limit AI Growth Before Chip Supply

The biggest constraint facing AI development isn't chip manufacturing but power generation and distribution. (15:24) Current AI expansion plans would require adding 100+ gigawatts of power capacity within just a few years, roughly the combined output of the United States' approximately 100 nuclear power plants, effectively doubling that fleet. This represents a potential 10-20% increase in total US electricity load from a single industry. Countries that can rapidly deploy power infrastructure, particularly China with its centralized decision-making, may gain significant advantages in the AI race regardless of chip technology.
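Taking the episode's figures at face value, the arithmetic works out as follows (a sanity check on the quoted numbers, not a forecast; the ~1 GW per nuclear plant figure is a standard approximation):

```python
us_load_gw = 1300             # ~1.3 TW total US electrical load, per the episode
nuclear_plant_gw = 1.0        # a typical US nuclear plant is on the order of 1 GW
planned_ai_buildout_gw = 100  # the "100+ GW" of proposed AI capacity

share = planned_ai_buildout_gw / us_load_gw
plants_equivalent = planned_ai_buildout_gw / nuclear_plant_gw
print(f"100 GW is ~{share:.0%} of US load, or ~{plants_equivalent:.0f} nuclear plants")
# ~8% at exactly 100 GW; the episode's 10-20% range corresponds to
# 130-260 GW, i.e. the "100+" reaching well past the baseline figure.
```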

Bureaucratic Friction Is America's Biggest AI Disadvantage

Environmental regulations like California's CEQA (California Environmental Quality Act) require extensive documentation for every possible impact of new construction projects, even hypothetical ones like identifying every bird species that might fly into a bridge. (23:54) While these laws had good intentions, they've evolved into bureaucratic obstacles that slow critical infrastructure development. This regulatory burden, combined with cultural shifts toward work-life balance in tech hubs, gives countries like China structural advantages in rapidly deploying AI infrastructure.

Compute Should Be Treated Like Real Estate, Not Startups

Successful compute infrastructure requires thinking like a real estate investor rather than a tech entrepreneur. (26:37) The key is securing long-term off-take agreements (essentially pre-signed rental contracts) before purchasing expensive GPU clusters, then using those contracts to secure favorable financing. Unlike software startups where product differentiation creates margin opportunities, GPU customers only care about hardware costs, making operational efficiency and financial engineering the primary competitive advantages rather than product innovation.
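A hypothetical sketch of the real-estate-style underwriting this implies (all inputs invented): before financing a cluster, a lender checks whether contracted off-take revenue covers the debt payment, i.e. the debt service coverage ratio used in property lending.

```python
def dscr(contracted_monthly_revenue: float,
         opex_monthly: float,
         monthly_debt_service: float) -> float:
    """Debt service coverage ratio: net operating income / debt payment.
    Real-estate lenders typically want this comfortably above 1.0."""
    net_operating_income = contracted_monthly_revenue - opex_monthly
    return net_operating_income / monthly_debt_service

# Invented example: $12M/month of pre-signed off-take agreements, $3M opex,
# $7M/month debt service on the GPU cluster loan.
ratio = dscr(contracted_monthly_revenue=12.0, opex_monthly=3.0, monthly_debt_service=7.0)
print(f"DSCR = {ratio:.2f}")  # ~1.29: financeable, but thin by real-estate standards
```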

Statistics & Facts

  1. The current total electrical load capacity of the United States is approximately 1.3 terawatts, and proposed AI infrastructure expansion could add 100+ gigawatts, representing a 10-20% increase in national electricity demand from a single industry. (15:45)
  2. GPU cloud providers typically operate on margins around 20%, compared to traditional CPU cloud providers who enjoy 60-70% margins, making them much more vulnerable to demand fluctuations. (08:57)
  3. The United States currently has approximately 100 nuclear power plants, and the proposed AI compute expansion would effectively require doubling this capacity to meet power demands. (16:30)

Compelling Stories

Available with a Premium subscription

Thought-Provoking Quotes

Available with a Premium subscription

Strategies & Frameworks

Available with a Premium subscription

Similar Strategies

Available with a Plus subscription

Additional Context

Available with a Premium subscription

Key Takeaways Table

Available with a Plus subscription

Critical Analysis

Available with a Plus subscription

Books & Articles Mentioned

Available with a Plus subscription

Products, Tools & Software Mentioned

Available with a Plus subscription

More episodes like this

The Prof G Pod with Scott Galloway · January 14, 2026
Raging Moderates: Is This a Turning Point for America? (ft. Sarah Longwell)

Young and Profiting with Hala Taha (Entrepreneurship, Sales, Marketing) · January 14, 2026
The Productivity Framework That Eliminates Burnout and Maximizes Output | Productivity | Presented by Working Genius

On Purpose with Jay Shetty · January 14, 2026
MEL ROBBINS: How to Stop People-Pleasing Without Feeling Guilty (Follow THIS Simple Rule to Set Boundaries and Stop Putting Yourself Last!)

Tetragrammaton with Rick Rubin · January 14, 2026
Joseph Nguyen