
Timestamps are as accurate as they can be but may be slightly off. We encourage you to listen to the episode for full context.
This GTC DC special episode of the NVIDIA AI Podcast explores the massive infrastructure challenge behind America's AI revolution. (00:37) Industry leaders from Vertiv, Schneider Electric, GE Vernova, and Crusoe discuss the unprecedented scale of data centers, power systems, and cooling infrastructure needed to support AI's exponential growth. The conversation reveals that data centers are becoming the new "AI factories" requiring industrial-scale coordination across energy, cooling, and computing systems. (00:48)
CEO of Vertiv, a global leader in critical digital infrastructure and continuity solutions. Albertazzi leads the company's mission to provide power, cooling, and IT systems for data centers worldwide during a time of unprecedented infrastructure demand.
CEO of Schneider Electric, a multinational corporation specializing in energy management and automation solutions. With 32 years at Schneider Electric, Blum has witnessed the company's evolution from a primarily hardware business to a provider of hardware and digital solutions focused on AI-powered energy efficiency.
CTO of GE Vernova, the energy business spun off from General Electric in 2024. Based in Cambridge, Massachusetts, Janalagagata leads technology strategy spanning gas turbines, nuclear, wind, solar, and grid infrastructure.
Co-Founder and CEO of Crusoe, a company building AI factory data centers with innovative power and cooling architectures. Lochmiller leads the company's mission to transform electrons into tokens through extreme co-design of data center infrastructure.
The traditional approach of building data centers component by component is fundamentally inadequate for AI's scale. (11:11) Gio Albertazzi emphasized that the industry must shift from decades of component-based thinking to comprehensive system-level design. This means integrating power, cooling, and computing infrastructure from the ground up rather than assembling separate pieces. The 800-volt DC power architecture exemplifies this approach, delivering massive efficiency gains by redesigning the entire power delivery system. Companies implementing this systems thinking can achieve higher power densities, better efficiency, and faster deployment times for their AI infrastructure projects.
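For a rough sense of why higher-voltage distribution helps: conduction loss scales with the square of current, so delivering the same rack power at 800 V instead of a lower-voltage bus such as today's 54 V in-rack distribution cuts resistive losses by roughly (800/54)^2, about 220x. The sketch below is a first-principles illustration only; the rack load, bus resistance, and comparison voltages are assumed values, not figures from the episode.

```python
# Rough sketch of why higher-voltage DC distribution cuts conduction losses.
# Assumptions (not from the episode): a fixed rack load, a fixed distribution
# resistance, and simple I^2*R loss; real 800 VDC architectures also gain by
# removing conversion stages.

def conduction_loss_watts(load_watts, bus_voltage, bus_resistance_ohms):
    """I^2 * R loss for delivering load_watts at bus_voltage over bus_resistance_ohms."""
    current = load_watts / bus_voltage
    return current ** 2 * bus_resistance_ohms

rack_load = 140_000        # 140 kW rack, in line with the densities discussed
bus_resistance = 0.001     # 1 milliohm of distribution resistance (illustrative)

for voltage in (54, 400, 800):
    loss = conduction_loss_watts(rack_load, voltage, bus_resistance)
    print(f"{voltage:>4} V bus: {loss / 1000:.2f} kW lost in distribution")

# Loss scales with 1/V^2, so moving from 54 V to 800 V cuts conduction
# losses by roughly (800/54)^2, about 220x, for the same copper.
```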
Meeting AI's power demands necessitates leveraging every available energy source rather than relying on single solutions. (05:54) Krishna Janalagagata revealed that US power demand, flat for 20 years, will grow 50% over the next two decades, with a third coming from data centers. (06:58) GE Vernova is quadrupling gas turbine production by 2028 while simultaneously advancing nuclear, solar, wind, and hydrogen technologies. This strategy of "electrify now, decarbonize later" allows organizations to meet immediate AI power needs while building toward sustainable long-term solutions.
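As a quick sense check on those figures, 50% growth over two decades corresponds to roughly 2% compound annual growth, a sharp break from 20 flat years. The snippet below simply reproduces that arithmetic using the numbers quoted in the episode; applying a flat one-third share to the growth is an illustrative simplification.

```python
# Back-of-the-envelope check on the quoted US power-demand forecast:
# 50% total growth over 20 years, with roughly a third of the new demand
# coming from data centers (figures as quoted in the episode).

total_growth = 0.50
years = 20

implied_cagr = (1 + total_growth) ** (1 / years) - 1
data_center_share_of_growth = 1 / 3

print(f"Implied annual demand growth: {implied_cagr:.2%}")   # ~2.05% per year
print(f"Data-center share of total demand added: {total_growth * data_center_share_of_growth:.0%}")
# i.e. data centers alone would add demand equal to roughly 17% of today's US load.
```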
The relationship between AI and energy infrastructure is bidirectional and self-reinforcing. (03:57) Olivier Blum explained that while AI depends on compute and compute depends on energy, energy efficiency itself increasingly depends on AI. Modern energy systems use AI for predictive maintenance, grid optimization, and real-time power management. Organizations can leverage this feedback loop by implementing AI-driven energy management systems that optimize power usage while generating the data needed to further improve efficiency algorithms.
The most significant performance improvements come from co-designing hardware and infrastructure together from first principles. (10:15) Chase Lochmiller highlighted how rack power densities have grown from 2-4 kilowatts twenty years ago to 130-140 kilowatts today, with future systems reaching one megawatt per rack. (21:43) This increase of several hundred-fold requires revolutionary thinking about cooling, power delivery, and system architecture. Companies succeeding in this environment abandon legacy assumptions and redesign entire systems for maximum efficiency and performance.
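One way to see why this density jump forces a move to liquid cooling: the airflow needed to remove rack heat with air grows linearly with power and becomes physically impractical well before one megawatt. The sketch below is a back-of-the-envelope estimate with assumed values (a 15 °C air temperature rise and standard air properties), not figures from the episode.

```python
# Why the jump in rack density pushes data centers toward liquid cooling:
# the airflow required to carry away rack heat grows linearly with power.
# Assumed values below (delta-T, air properties) are illustrative.

CP_AIR = 1005      # J/(kg*K), specific heat of air
RHO_AIR = 1.2      # kg/m^3, air density at room conditions
DELTA_T = 15       # K, allowed air temperature rise across the rack (assumed)

def airflow_m3_per_s(rack_kw):
    """Volumetric airflow needed to remove rack_kw of heat at a DELTA_T rise."""
    mass_flow = rack_kw * 1000 / (CP_AIR * DELTA_T)   # kg/s
    return mass_flow / RHO_AIR

for rack_kw in (4, 140, 1000):
    flow = airflow_m3_per_s(rack_kw)
    print(f"{rack_kw:>5} kW rack: {flow:6.2f} m^3/s of air (~{flow * 2119:,.0f} CFM)")

# A 1 MW rack would need roughly 55 m^3/s of air (over 100,000 CFM),
# which is physically impractical, hence direct liquid cooling.
```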
Large industrial companies recognize they cannot innovate fast enough alone and actively seek startup partnerships to accelerate development. (16:12) Krishna Janalagagata noted that startups bring essential innovation in both hardware, such as power electronics and cooling systems, and software for grid management and real-time optimization. (19:49) Gio Albertazzi emphasized that innovation must happen both organically within large companies and inorganically through partnerships with creative startups and engineers. Organizations can accelerate their infrastructure capabilities by actively partnering with startups rather than trying to develop all solutions internally.