Nvidia’s Exciting Teaser: Quantum-Accelerated Supercomputers!

At ISC High Performance 2024 in Hamburg, Germany, Nvidia announced that nine new supercomputers around the globe are now powered by its Grace Hopper Superchips, giving them a combined 200 exaflops (200 quintillion calculations per second) of computing power. According to Nvidia, these systems are twice as energy-efficient as x86-plus-GPU systems.

Dion Harris, Nvidia's director of accelerated data center go-to-market, said during a media briefing that Grace Hopper represents 80% of Hopper sales. He emphasized the excitement around this milestone, attributing it to the architecture's tight integration of CPU and GPU, which delivers superior performance for HPC and AI.

The Swiss National Supercomputing Centre hosts the first European Grace Hopper supercomputer, known as Alps. Built by Hewlett Packard Enterprise (HPE) and driven by 10,000 Grace Hopper Superchips, it delivers 20 exaflops of AI computing power and will be used primarily to advance weather and climate modeling and materials science.

In addition, Nvidia announced that its open-source CUDA-Q platform will soon accelerate quantum computing work at national supercomputing centers around the world. The company unveiled plans to deploy it at centers in Germany, Japan, and Poland, where it will power the quantum processing units (QPUs) being integrated into their high-performance computing systems.
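
For context on what programming against CUDA-Q looks like, here is a minimal sketch using the open-source `cudaq` Python package: a small GHZ-state kernel sampled on Nvidia's GPU-accelerated simulator. The `"nvidia"` target name assumes a GPU-enabled local installation; on hybrid systems like those described above, a different target would route the same kernel to an attached QPU. This is an illustrative example, not code from the deployments in the announcement.

```python
import cudaq

# Run on the GPU-accelerated statevector simulator (assumes a GPU-enabled
# CUDA-Q installation); other targets can point the same kernel at a QPU.
cudaq.set_target("nvidia")

@cudaq.kernel
def ghz(qubit_count: int):
    # Prepare a GHZ state: H on the first qubit, then a chain of CNOTs.
    qubits = cudaq.qvector(qubit_count)
    h(qubits[0])
    for i in range(qubit_count - 1):
        x.ctrl(qubits[i], qubits[i + 1])
    mz(qubits)

# Sample the kernel; counts should concentrate on |0000> and |1111>.
counts = cudaq.sample(ghz, 4, shots_count=1000)
print(counts)
```

The same kernel code runs unchanged whether the target is a simulator or quantum hardware, which is the portability argument behind putting CUDA-Q in front of the QPUs at these centers.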

“Quantum accelerated supercomputing, where quantum processors are built into superior supercomputers, holds incredible potential to tackle scientific problems that would otherwise be intractable,” remarked Tim Costa, the director of Quantum and HPC at Nvidia. “However, we face several hurdles today before we can realize the practical utility of quantum accelerated supercomputing. Today’s qubits are noisy and susceptible to errors. The integration with HPC systems is yet to be addressed. Error correction methodologies and infrastructure need to be built. Additionally, we need to invent algorithms that can exponentially speed up operations, along with overcoming a range of other challenges.”

Pointing to the effort already under way to address these challenges, Costa noted that more than 25 national quantum initiatives have been launched, more than 350 quantum startups are operating, over 70% of Fortune 500 companies have established quantum programs of some kind, and more than 48,000 quantum research papers have been published.

“Nevertheless, another vast arena in quantum computing remains unexplored,” Costa pointed out. “That’s the deployment of quantum accelerated supercomputers – supercomputers that incorporate a quantum processor to execute certain tasks optimally suited to quantum operations, with the support and collaboration of AI supercomputing. We are extremely delighted to usher in the world’s first quantum accelerated supercomputers.”

These groundbreaking machines will be located at AIST in Japan, the Jülich Supercomputing Centre in Germany, and PSNC in Poland, where two QPUs have been installed.

The integration of four quantum processing units with three supercomputers marks a new era in quantum innovation, according to Heather West, a research manager for quantum computing in IDC's infrastructure systems, platforms, and technologies group. Quantum computing has long held the promise of scientific advantage, she noted, but the symbiotic pairing of quantum and classical computing will also accelerate the development of quantum systems themselves. That acceleration paves the way for effective, error-corrected quantum supercomputers and the era of quantum utility, a much-anticipated milestone for both quantum researchers and end users.

However, Harris said that AI models are crucial for this to succeed: fault-tolerant systems that don't use AI for large-scale, real-time error correction and device calibration are unlikely to get there. Today, calibrating and maintaining a quantum device is an extremely labor-intensive task for physicists, and the complexity only grows as qubit counts rise, so these processes will have to be automated using the best available technology, including AI.
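
To make the role of AI concrete, here is a deliberately tiny, hypothetical sketch, not anything Nvidia has described: a small neural network trained with PyTorch to decode error syndromes of the 3-qubit bit-flip repetition code. Production decoders operate on far larger codes under hard real-time latency budgets, but the core structure, learning a map from measured syndromes to corrections, is the kind of task being handed to AI models.

```python
import torch
import torch.nn as nn

# Hypothetical illustration: a neural decoder for the 3-qubit bit-flip
# repetition code (stabilizers Z1Z2 and Z2Z3).
# Classes: 0 = no error, 1..3 = bit flip on qubit i-1.
SYNDROMES = torch.tensor([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])

def make_batch(n, flip_prob=0.05):
    """Sample error classes and their syndromes, with noisy syndrome readout."""
    labels = torch.randint(0, 4, (n,))
    syndromes = SYNDROMES[labels]
    noise = (torch.rand_like(syndromes) < flip_prob).float()
    return (syndromes + noise) % 2, labels

decoder = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 4))
optimizer = torch.optim.Adam(decoder.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for step in range(500):
    x, y = make_batch(256)
    loss = loss_fn(decoder(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Decode a clean syndrome: (1, 1) should map to a flip on the middle qubit.
print(decoder(torch.tensor([[1., 1.]])).argmax(dim=1))  # expected: tensor([2])
```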
