
Timestamps are as accurate as they can be but may be slightly off. We encourage you to listen to the full context.
In this episode, Nathan Sobo, founder of Zed and creator of Atom at GitHub, challenges the conventional wisdom that terminal-based AI tools spell the death of IDEs. (00:14) Nathan argues that source code is fundamentally a language designed for humans to read, not just for machines to execute, making visual interfaces essential even in an AI-driven future. (03:31) He shares his journey from building Atom with web technologies to creating Zed in Rust with GPU-accelerated rendering, achieving near-zero perceptible lag when typing. (12:57) The conversation explores Zed's positioning as "Switzerland" for different AI agents through the Agent Client Protocol, Nathan's vision for fine-grained edit tracking that enables permanent conversations anchored to code, and why he believes the future of coding involves richer collaboration between humans and AI agents rather than replacing human interaction with source code entirely.
Nathan Sobo is the founder of Zed and a veteran IDE developer with nearly two decades of experience. He was one of the first engineers to work on Atom at GitHub, where he also helped create Electron, which became the foundation for many desktop applications, including VS Code. After hitting the performance limitations of web-based editors, Nathan founded Zed in 2017 and built a new IDE from scratch in Rust with GPU-accelerated rendering; it serves over 170,000 active developers today.
Despite the rise of terminal-based AI coding tools, Nathan argues that human developers will always need to visually inspect and understand code. (02:28) As he puts it, "source code is a language, just like natural language is a language" and programs are written "for people to read and only incidentally for machines to execute." (03:36) Even when using AI agents, developers need to review changes, understand context, and verify outputs: tasks that require a proper visual interface rather than a small terminal window.
Nathan emphasizes that performance isn't a feature you can add later—it's an architectural decision that must be built from the ground up. (10:24) After hitting performance ceilings with Atom's web-based architecture, he rebuilt Zed in Rust with GPU-accelerated rendering to achieve near-zero perceptible lag. (12:57) This focus on performance attracts professional developers who use their tools 40+ hours per week and care about the tactile experience of their development environment.
The traditional Git-based, asynchronous collaboration model breaks down when working with AI agents that make continuous edits. (14:14) Nathan envisions a "fine-grained tracking mechanism that's the equivalent of having a commit on every keystroke," anchoring conversations directly to specific code changes. (15:58) This enables permanent, contextual conversations tied to the code's evolution rather than snapshot-based reviews.
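The anchoring idea can be sketched as an edit log plus positions that are replayed through later edits. The following is a minimal, hypothetical illustration of the concept; the names, types, and transformation rules are assumptions for this sketch, not Zed's actual design or API.

```rust
// Hypothetical sketch of keystroke-granularity edit tracking: every edit is
// logged, and an "anchor" (e.g. a conversation attached to a code location)
// is re-resolved through later edits instead of pointing at a stale offset.

#[derive(Clone, Copy)]
struct Edit {
    at: usize,       // byte offset where the edit occurred
    deleted: usize,  // bytes removed at `at`
    inserted: usize, // bytes inserted at `at`
}

// An anchor remembers the offset it was created at and how many edits
// existed at that moment, so it can be replayed forward at any time.
struct Anchor {
    offset: usize,
    version: usize, // length of the edit log when the anchor was created
}

// Transform the anchor's offset through every edit made since its creation.
fn resolve(anchor: &Anchor, log: &[Edit]) -> usize {
    let mut off = anchor.offset;
    for e in &log[anchor.version..] {
        if off < e.at {
            // Edit happened after the anchor: position is unaffected.
        } else if off < e.at + e.deleted {
            // Anchor fell inside a deleted range: clamp to the edit start.
            off = e.at;
        } else {
            // Edit happened before the anchor: shift by the net change.
            off = off - e.deleted + e.inserted;
        }
    }
    off
}

fn main() {
    // Buffer starts as "hello world"; a conversation is anchored at "world".
    let mut log: Vec<Edit> = Vec::new();
    let anchor = Anchor { offset: 6, version: log.len() };

    // Keystroke-level edits: insert "big " at 6, then delete "hello ".
    log.push(Edit { at: 6, deleted: 0, inserted: 4 }); // "hello big world"
    log.push(Edit { at: 0, deleted: 6, inserted: 0 }); // "big world"

    // The anchor still points at "world" (now offset 4) in the final text.
    assert_eq!(resolve(&anchor, &log), 4);
    println!("anchor resolves to offset {}", resolve(&anchor, &log));
}
```

Because the log is never truncated, a conversation anchored last month can still be resolved against today's buffer, which is what makes the "permanent conversation" idea possible.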
Nathan's experience shows LLMs are excellent "knowledge extruders"—taking well-established patterns and adapting them to specific needs. (28:41) He successfully used AI to generate Rust procedural macros and GPU rendering pipelines by leveraging existing knowledge in new configurations. However, when working on novel problems like Delta DB that require holding multiple complex constraints simultaneously, LLMs provide less value because "the code is not the constraint—the thinking is." (30:01)
Nathan envisions transforming code from a static artifact into a "metadata backbone" where conversations, edits, and context all hang together permanently. (34:54) This would enable LLMs to ask "what are all the conversations that happened behind this code" and provide developers with stable references to code locations that persist through changes. (34:34) This vision transforms the development environment from a simple editing tool into a collaborative knowledge system.