a16z Podcast • December 8, 2025

The 80-Year Bet: Why Naveen Rao Is Rebuilding the Computer from Scratch

Naveen Rao's Unconventional AI is pursuing analog computing systems inspired by brain physics to create more energy-efficient AI hardware that could fundamentally transform computing and potentially bring us closer to understanding intelligence.
AI & Machine Learning
Tech Policy & Ethics
Hardware & Gadgets
Alex Honnold
Steph Curry
Yann LeCun
Naveen Rao
Matt Bornstein

Summary Sections

  • Podcast Summary
  • Speakers
  • Key Takeaways
  • Statistics & Facts
  • Compelling Stories (Premium)
  • Thought-Provoking Quotes (Premium)
  • Strategies & Frameworks (Premium)
  • Similar Strategies (Plus)
  • Additional Context (Premium)
  • Key Takeaways Table (Plus)
  • Critical Analysis (Plus)
  • Books & Articles Mentioned (Plus)
  • Products, Tools & Software Mentioned (Plus)

Timestamps are as accurate as they can be but may be slightly off. We encourage you to listen to the full context.

Podcast Summary

In this fascinating conversation at NeurIPS, Naveen Rao, CEO of Unconventional AI, sits down with a16z's Matt Bornstein to discuss his audacious mission to revolutionize computing through analog systems. (03:57) Rao argues that 80 years of digital computing may be fundamentally mismatched with AI workloads, pointing to the stark efficiency gap between biological intelligence and current data centers. While a human brain operates on just 20 watts of power, AI data centers now consume 4% of the entire US energy grid, with demand projected to require 400 additional gigawatts over the next decade. (10:48) Rao's company is betting on analog computing systems that mimic the physics of neural networks rather than simulating them digitally, potentially offering a path to both AGI and sustainable AI scaling.

  • Main Theme: The conversation explores why analog computing architectures, specifically designed to mirror biological neural systems, could be the key to achieving both artificial general intelligence and sustainable AI scaling in an energy-constrained world.

Speakers

Naveen Rao

Naveen Rao is cofounder and CEO of Unconventional AI, an AI chip startup building analog computing systems designed specifically for intelligence. Previously, Naveen led AI at Databricks and founded two successful companies: MosaicML (AI training infrastructure, acquired by Databricks) and Nervana (AI accelerators, acquired by Intel). He holds a PhD in neuroscience and has extensive experience across the full technology stack, from silicon design to applications.

Matt Bornstein

Matt Bornstein is a partner at Andreessen Horowitz (a16z), where he focuses on AI and infrastructure investments. He hosts conversations with leading technologists and entrepreneurs at major industry conferences like NeurIPS, exploring the future of artificial intelligence and computing.

Key Takeaways

Energy Efficiency Demands Fundamental Computing Architecture Changes

The current digital computing paradigm is hitting fundamental energy constraints that threaten AI's scalability. (10:48) Rao reveals that US data centers already consume 4% of the national energy grid, with projections requiring 400 additional gigawatts over the next decade just for AI workloads. This represents a species-scale mobilization of resources that our 1970s-era transmission infrastructure may not support. The solution isn't just building more power generation capacity, but fundamentally rethinking how we compute. Biological systems like mammalian brains achieve remarkable intelligence with orders of magnitude less energy - a squirrel brain operates on just a tenth of a watt while performing complex real-time processing. This efficiency gap suggests we need computing architectures that work with physics rather than against it.
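To make the scale concrete, here is a back-of-envelope sketch using only the wattage figures quoted in this section; the arithmetic is mine, not from the episode:

```python
# Back-of-envelope arithmetic using the figures quoted above.
HUMAN_BRAIN_W = 20           # approximate human brain power draw (watts)
SQUIRREL_BRAIN_W = 0.1       # squirrel brain, per Rao's example
PROJECTED_DEMAND_W = 400e9   # 400 GW of projected additional AI demand

# How many human-brain power budgets does 400 GW represent?
brain_equivalents = PROJECTED_DEMAND_W / HUMAN_BRAIN_W
print(f"{brain_equivalents:.0e} human-brain power budgets")  # 2e+10

# Even within biology, intelligence spans a wide efficiency range.
print(round(HUMAN_BRAIN_W / SQUIRREL_BRAIN_W))  # 200
```

In other words, the projected buildout is on the order of twenty billion human-brain power budgets, which is the efficiency gap Rao argues new architectures must close.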

Analog Computing Can Eliminate Lossy Digital Abstractions

Digital computers implement intelligence through multiple layers of abstraction that create inherent inefficiencies and energy losses. (09:53) As Rao explains, in biological brains "the neural network dynamics are implemented physically. So there is no abstraction. Intelligence is the physics. They're one and the same." Digital systems simulate neural networks through numerical approximations, operating systems, APIs, and other software layers that consume energy without adding computational value. Analog systems can potentially implement neural network dynamics directly in electrical circuits, eliminating these abstraction penalties. This approach leverages the inherent physics of the substrate to perform computation, similar to how wind tunnels use actual airflow rather than computational fluid dynamics to model aerodynamic behavior.

Stochastic Intelligence Workloads Don't Require Deterministic Precision

The mismatch between AI's probabilistic nature and digital computing's deterministic precision represents a fundamental inefficiency. (08:42) Rao questions "why are we using the substrate that is highly precise and deterministic for something that's actually stochastic and distributed in nature." Neural networks are inherently probabilistic machines that work with uncertainty and approximations, yet we're running them on systems designed for exact arithmetic calculations like artillery trajectories or financial computations. This precision comes at an enormous energy cost that may be unnecessary for intelligence tasks. Biological neural networks achieve remarkable accuracy in complex, variable environments - like Steph Curry making precise shots under constantly changing game conditions - without requiring the deterministic precision of digital systems.

Time and Causality Are Missing Elements in Current AI

Current AI systems lack genuine understanding of causality because they're built on substrates that don't inherently incorporate time dynamics. (16:39) Rao argues that "anything where the basis is dynamic, which has time and causality as part of it, will be a better basis than something that's not." Digital systems simulate time through numerical approximations, but analog systems can implement time evolution directly through physical processes. This distinction may be crucial for achieving AGI-level understanding, as causality requires temporal relationships that are fundamental to how we perceive and interact with the physical world. Children innately understand cause and effect through their physical interactions with the environment, suggesting that intelligence naturally emerges from time-based dynamic systems rather than static numerical computations.
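The distinction can be sketched in code (a hypothetical illustration, not from the episode): a digital machine approximates a time-evolving system such as a leaky integrator by stepping a difference equation, whereas an analog RC circuit obeys the same differential equation directly in its physics, with no time-stepping at all.

```python
# Digital simulation of a leaky integrator: dV/dt = (I - V) / tau.
# An analog RC circuit implements this equation physically; a digital
# machine can only approximate it with discrete time steps.
tau = 0.02    # time constant in seconds (hypothetical value)
I = 1.0       # constant input drive
dt = 0.001    # the step size is an artifact of digital computing
V = 0.0

for _ in range(200):         # simulate 0.2 s, i.e. ten time constants
    V += dt * (I - V) / tau  # explicit Euler step

print(round(V, 3))  # converges toward the input I
```

Every loop iteration here burns energy to approximate what a capacitor does for free by existing in time, which is the abstraction penalty Rao wants to eliminate.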

Cross-Stack Thinking Enables Breakthrough Innovation

The most transformative innovations come from people who can think across traditional boundaries between hardware, software, and applications. (02:57) Rao emphasizes that "software and hardware is not really a natural boundary. It's just where we decide to draw the line and say, okay, this is something I make configurable or I don't." His ability to work across the full technology stack - from silicon design to neuroscience to applications - enables him to see optimization opportunities that specialists within single domains might miss. (26:32) For young professionals, he recommends gaining breadth early in their careers: "being really good at one thing is probably less valuable than being slightly good at a lot of things" when preparing for future technological changes.

Statistics & Facts

  1. AI data centers currently consume 4% of the entire US energy grid, with the US representing about 50% of global data center capacity. (10:48) This statistic illustrates the massive energy demands of current AI infrastructure and sets the stage for understanding why alternative computing paradigms are urgently needed.
  2. The AI industry will require approximately 400 additional gigawatts of energy capacity over the next decade to meet projected demand. (11:33) Rao emphasizes that current power generation capabilities can only add about 4 gigawatts per year, creating a massive shortfall that infrastructure alone cannot solve.
  3. Mammalian brains achieve remarkable computational efficiency, with a squirrel brain operating on just a tenth of a watt while human brains use approximately 20 watts. (09:35) This comparison highlights the enormous efficiency gap between biological intelligence and current AI systems, suggesting significant room for architectural improvements.
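The shortfall implied by statistic 2 falls out of simple division; the calculation is mine, using only the figures quoted above:

```python
# Figures quoted in this section.
needed_gw = 400            # projected additional demand over the next decade
buildout_gw_per_year = 4   # current rate of added generation capacity

# Years needed to build 400 GW at today's pace, vs. the decade available.
years_required = needed_gw / buildout_gw_per_year
print(years_required)  # 100.0

# Equivalently, a decade of buildout covers only a tenth of the demand.
decade_coverage = buildout_gw_per_year * 10 / needed_gw
print(decade_coverage)  # 0.1
```

A century of buildout compressed into a decade is why Rao argues the answer cannot be generation capacity alone.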
