Surfing the waves that Nvidia’s kicking up comes Supermicro, a company that’s long had much of its destiny (and stock price) tied to the fortunes of the chipmaking giant. Data centers need server rack solutions; processors need to be mounted. In the middle, Supermicro is making bank. Last quarter, its revenue shot up 200% year over year. Analysts are breathlessly predicting that the company’s top line might even double over the next fiscal year or two, while enterprises pound on the doors, demanding the AI servers that’ll help them grow, transform, revolutionize and other buzzwordy AI verbs, expanding the market at a compound annual rate of 25% through 2029.
Elon Musk, who somehow always manages to weasel his way into the big news of the day, is a big part of the Supermicro parade, recently announcing that Dell and Supermicro will each provide half the servers for AI startup xAI and his superdupercomputer dreams. And, surprising some, Supermicro’s growth has lately been outstripping Dell’s.
Part of the secret, and part of what will make this kind of growth sustainable, is the 5,000 racks full of kit the company will be pumping out each month from its new Malaysian factory in Q4; the other part is the company’s proprietary direct liquid cooling (DLC) tech. During his recent keynote at Taiwan’s Computex event, Supermicro CEO Charles Liang predicted the company’s DLC will rack up 2,900% growth in two years. It’ll be installed in 15% of the racks the company ships this year, and double that by next year. And he predicts we’ll see 20% of data centers adopt liquid cooling pretty quick here. Liquid-cooled data centers consume less energy and allow denser deployments, he added, meaning more productive data centers — and a challenge to the new entrants in the AI inference field who want to ditch GPUs altogether.
Liang has plenty more to say about the critical infrastructure decisions facing enterprises today, and he’s diving into the conversation at VentureBeat’s Transform 2024. He’ll be talking about how specialized solutions purpose-built for AI compute are changing, why enterprises need to keep up, the endlessly delicate balance of data center resources — including managing energy-gobbling GPUs and their cooling and power demands — data center footprints and more. And he’ll take a look at a future where GPUs stand supreme, with the release of the upcoming Nvidia Blackwell GPU architecture alongside technology like direct-to-chip liquid cooling, designed to address all your most pressing “but the environment!” arguments.
Register now for VB Transform 2024 to get in the room with Liang and other industry giants. They’ll be bringing the latest news, the newest gossip and unparalleled opportunities to network away in San Francisco, July 9, 10 and 11. This year the event is all about putting AI to work at scale — and the case studies that demonstrate exactly how it’s done in the real world. Register now!