Timestamps are approximate and may be slightly off. We encourage you to listen to the full episode for context.
This episode features Jen Kha, Head of Investor Relations, and David George, General Partner at a16z, examining how AI is fundamentally reshaping late-stage private markets. The discussion centers on AI companies scaling faster than in any previous technology cycle, even as infrastructure buildout reaches unprecedented levels: major tech companies are investing in AI infrastructure at a $400 billion annual run rate. (02:14)
Head of Investor Relations at a16z, responsible for managing relationships with the firm's institutional investors. Kha provides strategic oversight on portfolio construction and market analysis across the late-stage practice.
General Partner at a16z focusing on late-stage investments, particularly in AI and technology companies. George has extensive experience evaluating high-growth private companies and leads the Growth Fund's AI investment strategy, including early positions in companies like xAI.
AI companies are reaching massive scale in a fraction of the time it took previous technology generations. David George notes that ChatGPT reached 365 billion annual searches in just two years, a milestone that took Google eleven years to hit. (09:36) This acceleration is possible because AI products build on existing internet infrastructure and cloud computing, enabling immediate global distribution without new hardware deployment. The lesson for professionals is that AI-powered solutions can scale globally from day one if properly architected, fundamentally changing go-to-market strategies and growth expectations.
When evaluating AI companies, focus on gross retention rates (90%+ signals strong customer stickiness) and organic customer demand rather than obsessing over gross margins in the short term. (23:22) George explains that investors can be more lenient on AI companies' current gross margins because input costs are declining by 100x every two years, and competition among multiple model providers will drive costs down further. For business leaders, this means prioritizing product-market fit and customer love over margin optimization in AI implementations, especially while foundational model costs are falling so rapidly.
The world's strongest technology companies are bearing the burden of massive AI infrastructure buildout, creating extraordinary opportunities for companies building on top of this foundation. (03:24) With companies like Google, Meta, Amazon, and Microsoft investing $400 billion annually in AI infrastructure, the foundational compute and training capabilities are being funded by entities best equipped to handle potential overcapacity. This means entrepreneurs and businesses can focus on application-layer innovation without worrying about infrastructure scalability, representing a rare moment where the infrastructure layer subsidizes innovation above it.
Professionals should expect that 90% of AI's value creation will accrue to end customers as surplus, while only 10% goes to the companies providing AI services. (06:06) Despite this seemingly small capture rate, the total addressable market is enormous: AI could affect roughly 20% of GDP through white-collar payroll augmentation, versus software's current 1% of GDP. A 10% capture of a market that size would amount to roughly 2% of GDP, double software's current share, so even small capture rates from huge markets create substantial value. The strategic implication is to build AI solutions with massive scale potential.
AI applications achieve durability through the same mechanisms as traditional software: integrations, workflows, and company-specific customization rather than model superiority alone. (34:48) George points out that medical scribing, customer support, and financial analysis tools become sticky because they integrate deeply into existing workflows and accumulate company-specific rules and preferences. For business implementation, this means focusing on deep workflow integration and customization rather than simply deploying the latest AI model. Sustainable competitive advantage comes from operational embedding, not algorithmic advancement.