Powering HPC with next-generation CPUs
Sponsored
From genomics to AI, CPUs continue to lead high-performance computing by offering unmatched flexibility, reliability, and scale, say Microsoft Azure’s Evan Burness and Intersect360 Research’s Addison Snell.
In partnership with Microsoft and AMD
For all the excitement around GPUs—the workhorses of today’s AI revolution—the central processing unit (CPU) remains the backbone of high-performance computing (HPC). CPUs still handle 80% to 90% of HPC workloads globally, powering everything from climate modeling to semiconductor design. Far from being eclipsed, they’re evolving in ways that make them more competitive, flexible, and indispensable than ever.
The competitive landscape around CPUs has intensified. Once dominated almost exclusively by Intel’s x86 chips, the market now includes powerful alternatives based on ARM and even emerging architectures like RISC-V. Flagship examples like Japan’s Fugaku supercomputer demonstrate how CPU innovation is pushing performance to new frontiers. Meanwhile, cloud providers like Microsoft and AWS are developing their own silicon, adding even more diversity to the ecosystem.
What makes CPUs so enduring? Flexibility, compatibility, and cost efficiency are key. As Evan Burness of Microsoft Azure points out, CPUs remain the “it-just-works” technology. Moving complex, proprietary code to GPUs can be an expensive and time-consuming effort, while CPUs typically support software continuity across generations with minimal friction. That reliability matters for businesses and researchers who need results, not just raw power.
Innovation is also reshaping what a CPU can be. Advances in chiplet design, on-package memory, and hybrid CPU-GPU architectures are extending the performance curve well beyond the limits of Moore’s Law. For many organizations, the CPU is the strategic choice that balances speed, efficiency, and cost.
Looking ahead, the relationship between CPUs, GPUs, and specialized processors like NPUs will define the future of HPC. Rather than a zero-sum contest, it’s increasingly a question of fit-for-purpose design. As Addison Snell, co-founder and chief executive officer of Intersect360 Research, notes, science and industry never run out of harder problems to solve.
That means CPUs, far from fading, will remain at the center of the computing ecosystem.
To learn more, read the new report “Designing CPUs for next-generation supercomputing.”
This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff. It was researched, designed, and written by human writers, editors, analysts, and illustrators. AI tools that may have been used were limited to secondary production processes that passed thorough human review.