
Microsoft's head of cloud and AI, Scott Guthrie, provides insight into the massive AI infrastructure buildout currently underway across the tech industry. (00:59) The discussion centers on whether the recent $143 billion in combined AI investments from NVIDIA, Oracle, and Anthropic represents overinvestment or necessary infrastructure for the future. (01:16)
Scott Guthrie is the Executive Vice President of Cloud and AI at Microsoft, overseeing Azure and AI initiatives. With nearly three decades at Microsoft, he leads one of the world's largest cloud computing platforms and has been instrumental in Microsoft's AI transformation and its partnership with OpenAI.
Alex Kantrowitz is the host of Big Technology Podcast and a technology journalist focused on providing nuanced analysis of the tech industry. He regularly appears on CNBC discussing technology earnings and market developments.
The AI industry remains more supply-constrained than demand-constrained, indicating continued growth potential. (02:45) Guthrie emphasizes that as people use AI and get value from it, they use it more, creating a positive feedback loop that drives infrastructure needs. This suggests the current buildout is justified by actual demand rather than speculative investment.
Success in AI infrastructure depends on maximizing "tokens per watt per dollar" across multiple applications and timeframes. (07:05) Microsoft leverages its diverse portfolio including Microsoft 365 Copilot, GitHub Copilot, ChatGPT, and enterprise applications to ensure optimal utilization of AI infrastructure investments, reducing risk compared to single-purpose data centers.
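Guthrie's "tokens per watt per dollar" framing can be read as a simple efficiency ratio: useful output divided by the power and money consumed to produce it. A minimal sketch of that idea, assuming a straightforward ratio (the function name and all numbers here are illustrative, not from the episode):

```python
def tokens_per_watt_per_dollar(tokens_served: float,
                               avg_power_watts: float,
                               cost_usd: float) -> float:
    """Illustrative efficiency ratio: tokens produced per unit of
    power draw per dollar of spend. Higher is better."""
    return tokens_served / (avg_power_watts * cost_usd)

# Hypothetical comparison: a shared, well-utilized cluster vs. a
# single-purpose one that sits partly idle (all numbers made up).
shared = tokens_per_watt_per_dollar(1e9, 500, 100)     # 20000.0
dedicated = tokens_per_watt_per_dollar(4e8, 500, 100)  # 8000.0
print(shared > dedicated)  # True: more output for the same power and cost
```

The comparison illustrates the point made above: spreading many workloads (Copilot, ChatGPT, enterprise apps) across the same infrastructure raises the numerator without changing the denominator.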
Geopolitical considerations require AI infrastructure to be distributed globally rather than centralized. (12:07) European customers want AI processing in Europe and Asian customers want it in Asia, creating demand for regional infrastructure that can also provide lower latency for inference workloads while meeting data sovereignty requirements.
Modern AI training encompasses multiple types including pre-training, post-training, reinforcement learning, and fine-tuning. (11:20) This evolution allows for more flexible infrastructure utilization, where post-training can happen on distributed infrastructure during off-peak hours, maximizing resource efficiency and reducing the need for massive centralized training facilities.
Azure's consumption-based revenue model provides real-time validation of AI value delivery. (44:06) With growth accelerating to 39% year over year, the revenue reflects actual AI usage rather than speculative purchases, indicating enterprises are finding genuine value in AI applications despite some studies suggesting poor ROI.