
Timestamps are as accurate as they can be but may be slightly off. We encourage you to listen to the full conversation for context.
In this fascinating conversation at NeurIPS, Naveen Rao, CEO of Unconventional AI, sits down with a16z's Matt Bornstein to discuss his audacious mission to revolutionize computing through analog systems. (03:57) Rao argues that 80 years of digital computing may be fundamentally mismatched for AI workloads, pointing to the stark efficiency gap between biological intelligence and current data centers. While a human brain operates on just 20 watts of power, AI data centers now consume 4% of the entire US energy grid, with demand projected to require an additional 400 gigawatts over the next decade. (10:48) Rao's company is betting on analog computing systems that mimic the physics of neural networks rather than simulating them digitally, potentially offering a path to both AGI and sustainable AI scaling.
Naveen Rao is co-founder and CEO of Unconventional AI, an AI chip startup building analog computing systems designed specifically for intelligence. Previously, Naveen led AI at Databricks and founded two successful companies: MosaicML (AI model training, acquired by Databricks) and Nervana (AI accelerators, acquired by Intel). He holds a PhD in neuroscience and has extensive experience across the full technology stack, from silicon design to applications.
Matt Bornstein is a partner at Andreessen Horowitz (a16z), where he focuses on AI and infrastructure investments. He hosts conversations with leading technologists and entrepreneurs at major industry conferences like NeurIPS, exploring the future of artificial intelligence and computing.
The current digital computing paradigm is hitting fundamental energy constraints that threaten AI's scalability. (10:48) Rao reveals that US data centers already consume 4% of the national energy grid, with projections calling for an additional 400 gigawatts over the next decade for AI workloads alone. This represents a species-scale mobilization of resources that our 1970s-era transmission infrastructure may not support. The solution isn't just building more power generation capacity, but fundamentally rethinking how we compute. Biological systems like mammalian brains achieve remarkable intelligence with orders of magnitude less energy - a squirrel brain operates on just a tenth of a watt while performing complex real-time processing. This efficiency gap suggests we need computing architectures that work with physics rather than against it.
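As a back-of-envelope check on that efficiency gap, the sketch below works through the numbers in Python. The 20 W human brain, 0.1 W squirrel brain, 4% grid share, and 400 GW figures come from the conversation; the roughly 460 GW average US electrical load (about 4,000 TWh per year) is an outside assumption, not a figure Rao cites.

```python
# Back-of-envelope comparison of biological vs data-center energy budgets.
# From the conversation: 20 W human brain, 0.1 W squirrel brain, data centers
# at 4% of the US grid, +400 GW projected over the next decade.
# Assumption (not from the conversation): average US electrical load ~460 GW,
# i.e. roughly 4,000 TWh/year spread over 8,760 hours.

US_AVG_LOAD_W = 460e9          # assumed average US electrical load, watts
DATA_CENTER_SHARE = 0.04       # 4% of the grid, per the conversation
PROJECTED_EXTRA_W = 400e9      # +400 GW over the next decade
HUMAN_BRAIN_W = 20.0           # watts
SQUIRREL_BRAIN_W = 0.1         # watts

data_center_load_w = US_AVG_LOAD_W * DATA_CENTER_SHARE
print(f"Current data-center load: {data_center_load_w / 1e9:.0f} GW")
print(f"Equivalent human brains:  {data_center_load_w / HUMAN_BRAIN_W:.2e}")
print(f"Projected build-out vs today: {PROJECTED_EXTRA_W / data_center_load_w:.0f}x")
print(f"Squirrel brains per extra GW: {1e9 / SQUIRREL_BRAIN_W:.0e}")
```

Even at these rough numbers, today's data centers draw the equivalent of nearly a billion human brains, and the projected build-out is another twentyfold on top of that.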
Digital computers implement intelligence through multiple layers of abstraction that create inherent inefficiencies and energy losses. (09:53) As Rao explains, in biological brains "the neural network dynamics are implemented physically. So there is no abstraction. Intelligence is the physics. They're one and the same." Digital systems simulate neural networks through numerical approximations, operating systems, APIs, and other software layers that consume energy without adding computational value. Analog systems can potentially implement neural network dynamics directly in electrical circuits, eliminating these abstraction penalties. This approach leverages the inherent physics of the substrate to perform computation, similar to how wind tunnels use actual airflow rather than computational fluid dynamics to model aerodynamic behavior.
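A canonical example of "intelligence is the physics" is the resistive crossbar, where Ohm's law and Kirchhoff's current law compute a vector-matrix product in a single physical settling step. The Python sketch below can only simulate that physics numerically - which is precisely the abstraction penalty Rao describes - and the conductance and voltage values are illustrative, not from the conversation.

```python
import numpy as np

# In an analog crossbar, a weight matrix is stored as conductances G (siemens)
# and an input vector is applied as voltages V (volts). Ohm's law gives each
# device's current G_ij * V_j, and Kirchhoff's current law sums every row
# "for free": the output current vector is I = G @ V, with no instruction
# stream at all. Here we can only emulate that physics digitally.

rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 8))   # illustrative conductances (S)
V = rng.uniform(0.0, 0.5, size=8)          # illustrative input voltages (V)

I = G @ V   # one matrix multiply here; one instantaneous settling step there
print("Output currents (A):", I)
```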
The mismatch between AI's probabilistic nature and digital computing's deterministic precision represents a fundamental inefficiency. (08:42) Rao questions "why are we using the substrate that is highly precise and deterministic for something that's actually stochastic and distributed in nature." Neural networks are inherently probabilistic machines that work with uncertainty and approximations, yet we're running them on systems designed for exact arithmetic calculations like artillery trajectories or financial computations. This precision comes at an enormous energy cost that may be unnecessary for intelligence tasks. Biological neural networks achieve remarkable accuracy in complex, variable environments - like Steph Curry making precise shots under constantly changing game conditions - without requiring the deterministic precision of digital systems.
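To make graceful degradation concrete, here is a small illustration of our own (not an example from the episode): a toy sign-readout evaluated with noisy weights, standing in for an analog substrate whose devices vary. Most decisions survive moderate noise rather than failing outright, which is why exact digital arithmetic may be overkill for this class of computation.

```python
import numpy as np

# A toy readout y = sign(w . x) evaluated with perturbed weights, mimicking
# device variation in an analog substrate. Accuracy degrades gracefully with
# noise instead of failing outright.
# (Our construction for illustration; not an example from the conversation.)

rng = np.random.default_rng(1)
w = rng.normal(size=256)                      # "true" weights
X = rng.normal(size=(10_000, 256))            # random inputs
y_clean = np.sign(X @ w)                      # exact (digital) decisions

for noise_std in [0.0, 0.1, 0.3, 1.0]:
    w_noisy = w + noise_std * rng.normal(size=w.shape)  # device variation
    agreement = np.mean(np.sign(X @ w_noisy) == y_clean)
    print(f"weight noise std {noise_std:.1f}: {agreement:.1%} decisions unchanged")
```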
Current AI systems lack genuine understanding of causality because they're built on substrates that don't inherently incorporate time dynamics. (16:39) Rao argues that "anything where the basis is dynamic, which has time and causality as part of it, will be a better basis than something that's not." Digital systems simulate time through numerical approximations, but analog systems can implement time evolution directly through physical processes. This distinction may be crucial for achieving AGI-level understanding, as causality requires temporal relationships that are fundamental to how we perceive and interact with the physical world. Children innately understand cause and effect through their physical interactions with the environment, suggesting that intelligence naturally emerges from time-based dynamic systems rather than static numerical computations.
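One way to make the time-dynamics distinction concrete: a leaky integrator dV/dt = (-V + I)/tau is just what a capacitor with a leak resistor does continuously in analog hardware, while a digital system must discretize time and pay energy for every step. Below is a minimal Euler-integration sketch of our own, with arbitrary illustrative parameters.

```python
import numpy as np

# A leaky integrator dV/dt = (-V + I(t)) / tau evolves continuously and
# "for free" in an RC circuit. A digital system must approximate that
# evolution step by step, paying for each update.
# (Illustrative parameters; not values from the conversation.)

tau = 20e-3              # time constant, seconds
dt = 1e-4                # digital step size: smaller = more accurate, more work
steps = int(0.2 / dt)    # simulate 200 ms

V = 0.0
for k in range(steps):
    t = k * dt
    I = 1.0 if 0.05 <= t < 0.15 else 0.0   # a 100 ms input pulse
    V += dt * (-V + I) / tau               # explicit Euler update

analytic = (1 - np.exp(-0.1 / tau)) * np.exp(-0.05 / tau)
print(f"Euler result after 200 ms: {V:.4f}  (closed form: {analytic:.4f})")
```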
The most transformative innovations come from people who can think across traditional boundaries between hardware, software, and applications. (02:57) Rao emphasizes that "software and hardware is not really a natural boundary. It's just where we decide to draw the line and say, okay, this is something I make configurable or I don't." His ability to work across the full technology stack - from silicon design to neuroscience to applications - enables him to see optimization opportunities that specialists within single domains might miss. (26:32) For young professionals, he recommends gaining breadth early in their careers: "being really good at one thing is probably less valuable than being slightly good at a lot of things" when preparing for future technological changes.