BEIT's Blog
NVIDIA GTC 2025: Looking Toward the Hybrid Future of Computing
Mar 23, 2025
We attended NVIDIA GTC 2025 and came away inspired by the convergence of quantum computing, AI, and HPC as the next wave in advanced computation. Key takeaways included using quantum processors alongside classical systems such as GPUs rather than trying to replace them, and the progress that companies such as IonQ and PsiQuantum have made toward real-world quantum applications. Meanwhile, AI innovations, including new models and hybrid platforms, are accelerating rapidly, opening up major opportunities in fields such as drug discovery. Despite the challenges ahead, the conference made it clear that a hybrid approach blending quantum, HPC, and AI is paving the way for breakthroughs across industries.
Using NVIDIA CUDA-Q for Hamiltonian Simulation
Mar 20, 2025
This article showcases BEIT’s implementation of electronic Hamiltonian simulation circuits using NVIDIA’s CUDA-Q, with a focus on accelerating and validating quantum phase estimation for molecular systems. By refining the double-factorization method introduced by Burg et al., BEIT significantly reduced circuit sizes and demonstrated faster, more scalable simulations, achieving up to a 600% speedup over Qiskit Aer in certain benchmarks. The blog highlights how CUDA-Q’s statevector and tensornet-mps backends efficiently handle up to 155 qubits on RTX, A100, and H100 GPUs, enabling rigorous testing of advanced quantum algorithms (including QROM block-encoding schemes) and facilitating early-stage evaluation of complex molecular simulations relevant to industrial chemistry.
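For readers unfamiliar with CUDA-Q, the sketch below shows the general pattern of selecting a GPU-backed simulator target and sampling a kernel on it. It is a minimal illustration only, not BEIT's phase-estimation or block-encoding circuits: the kernel, qubit count, and shot count are placeholders chosen for readability.

```python
import cudaq

# Minimal sketch of switching CUDA-Q simulator backends (not BEIT's actual circuits).
# "nvidia" is the GPU statevector simulator; "tensornet-mps" is the matrix-product-state
# tensor-network simulator, which can reach larger qubit counts for circuits with
# limited entanglement.
cudaq.set_target("tensornet-mps")

@cudaq.kernel
def entangling_chain(num_qubits: int):
    # Placeholder circuit: a GHZ-style chain standing in for a real
    # Hamiltonian-simulation kernel.
    qubits = cudaq.qvector(num_qubits)
    h(qubits[0])
    for i in range(num_qubits - 1):
        x.ctrl(qubits[i], qubits[i + 1])
    mz(qubits)

# Sample the kernel on the selected GPU backend.
counts = cudaq.sample(entangling_chain, 40, shots_count=1000)
counts.dump()
```

Swapping `cudaq.set_target("tensornet-mps")` for `cudaq.set_target("nvidia")` runs the same kernel on the GPU statevector simulator, which is the kind of backend comparison the post's benchmarks are built around.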