Listen "Open Compute Project Summit"
Episode Synopsis
The equipment that fills data centers is evolving rapidly, driven by the need to satisfy the seemingly insatiable appetite of AI applications. The Open Compute Project (OCP) was founded by Meta/Facebook to promulgate equipment standards, and its annual Summit has grown from a small, specialized gathering to an event that strains the capacity of the San Jose Convention Center. Senior research analyst Perkins Liu returns to offer his take on this meteoric growth with host Eric Hanselman. AI requirements are pushing ever-greater scale, both logically and physically, with the width of server racks doubling in the Open Rack Wide (ORW) specification to support greater density and better serviceability. The OCP Foundation is also working on silicon interoperability and is setting specifications for chiplet integration. Liquid cooling has moved from a nice-to-have feature to a required capability as a means to dissipate the enormous amount of heat produced by ever-denser GPU arrays. Energy delivery is changing with the advent of higher-voltage DC power: the early OCP efforts on 48-volt DC pale beside new 800-volt designs (see the illustrative sketch after the credits). The OCP Foundation is also expanding its mission to include education with the establishment of the OCP Academy, which aims to raise workforce skills in open hardware and will offer online training in data center technologies. That underscores not only the expansion of the OCP Foundation's mission, but also the increasing scale of the ecosystem that supports data center environments and the complexity and interdependency that AI creates.

More S&P Global Content:
Sustainability continues to drive datacenter infrastructure evolution
Webinar: Talk to the Expert - Artificial intelligence, datacenters and energy: Is APAC ready for th…

For S&P Global subscribers:
Air cooling remains prevalent, but liquid cooling is gaining momentum – Highlights from VotE: Datac…
Adjusted definitions of datacenter markets in China align with socioeconomic processes
Datacenters increasingly use direct current to cope with AI workloads

Credits:
Host/Author: Eric Hanselman
Guest: Perkins Liu
Producer/Editor: Feranmi Adeoshun
Published With Assistance From: Sophie Carr, Kyra Smith
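The episode does not walk through the arithmetic, but the pull toward higher-voltage DC distribution follows from I = P / V and I²R conduction loss. The sketch below is illustrative only: the 100 kW rack figure and the 48 V / 800 V comparison points are assumptions for the sake of the example, not numbers cited in the podcast.

```python
# Illustrative only: why higher-voltage DC distribution appeals for dense racks.
# The 100 kW rack power and the 48 V / 800 V bus voltages are assumed figures,
# not values from the episode.

RACK_POWER_W = 100_000  # hypothetical rack power draw

baseline_current = RACK_POWER_W / 800  # current at the higher bus voltage

for bus_voltage in (48, 800):
    current = RACK_POWER_W / bus_voltage                # I = P / V
    relative_loss = (current / baseline_current) ** 2   # I^2 * R, same R assumed
    print(f"{bus_voltage:>3} V bus: {current:7,.0f} A, "
          f"relative conduction loss x{relative_loss:,.1f}")
```

For the same delivered power, the 48 V bus carries roughly 17 times the current of an 800 V bus, so conduction losses in busbars and connectors (for the same resistance) grow by a factor of nearly 280, which is the basic argument behind the higher-voltage designs discussed in the episode.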
More episodes of the podcast Next in Tech
A Wild Earnings Season (16/01/2026)
The Agentic Enterprise (13/01/2026)
AWS re:Invent conference (23/12/2025)
SC25 Supercomputing Conference (16/12/2025)
Security and Observability (09/12/2025)
Context Engineering (02/12/2025)
The Big Picture Reports (25/11/2025)
Agentic Customer Experience (18/11/2025)
Money 20/20 (11/11/2025)
Security Gravity (28/10/2025)