
AI Expected to Revolutionize Data Centers, According to OCP

The Open Compute Project (OCP), an industry initiative devoted to rethinking hardware for the growing demands of infrastructure, has turned its attention to the hardware requirements of artificial intelligence (AI). The shift is expected to have a significant impact, most visibly in the arrival of liquid-cooled data centers and rising power consumption. Andy Bechtolsheim, a member of the OCP board and co-founder of Sun Microsystems, is among the most prominent advocates of the change.

At the recent OCP Global Summit in San Jose, California, the implications of AI for computer hardware took center stage. Zaid Kahn, OCP Board Chair and General Manager of Microsoft’s silicon, cloud hardware, and infrastructure engineering, stressed that AI is not merely a trend but a fundamental shift in technology that will profoundly influence our lives. Kahn expects AI to drive substantial investment in IT infrastructure and data center expansion in the very near future.

Loi Nguyen, Executive Vice President and General Manager for optics at Marvell, a company specializing in cloud and data center infrastructure technology, agreed. “When we look back ten years from now, most of you will agree with me that 2023 marked the inception of a new era for AI,” he said. “The world will be markedly different a decade from now.”

However, AI introduces numerous system design challenges, noted Zane Ball, Vice President and General Manager for Data Center Platform Engineering and Architecture at Intel. Ball singled out power consumption as the most pressing concern for AI users: with AI models growing roughly tenfold each year, the infrastructure needed to support them is expanding rapidly and consuming vast amounts of electricity. Liquid cooling will become ubiquitous, he argued, cutting power consumption at the data center level by about 30%. Even central processing units (CPUs) will require liquid cooling, he added, and Intel plans to invest in improving CPUs to make AI more practical on standard servers.

Andy Bechtolsheim, Chief Development Officer and Co-founder at Arista Networks, extolled the advantages of water-cooled systems in a presentation that followed the keynote. With the advent of AI, he declared, the era of liquid-cooled data centers has arrived. Bechtolsheim acknowledged that liquid cooling is more complex than traditional air cooling, but said operators increasingly prefer to base large data center designs on it.

Recapping recent AI developments such as ChatGPT and Amazon Bedrock, AWS’s new generative AI service, Marvell’s Nguyen predicted that AI will enable innovations ranging from personalized healthcare to climate change mitigation to communication with marine life such as whales. These innovations, however, will require larger data centers and more power.