KPIs that Drive the Resilient Quantum Computing Era

The quantum computing industry is exiting the Noisy Intermediate-Scale Quantum (NISQ) era and entering the age of Resilient Quantum Computing: a stage marked by advances that mitigate the errors that have held quantum computing systems back.

Resilient Quantum Computing emphasizes not only improving hardware robustness but also enhancing error correction techniques, enabling more reliable and scalable quantum applications. This is a transformative moment for developers and researchers focusing on leveraging quantum capabilities to solve complex real-world problems.

The Key Performance Indicators (KPIs) listed in this article are crucial metrics that help evaluate quantum technologies' efficiency, performance, and progress. These indicators provide essential insights into the development, benchmarking, and commercial viability of quantum systems. Understanding these metrics is important for anyone who wants to follow the advances toward more stable and powerful quantum computers.

Multi-Qubit Gate Fidelity

This KPI measures the accuracy with which quantum gates operate on multiple qubits simultaneously. Higher fidelity values indicate fewer errors during quantum operations, which is crucial for executing complex quantum algorithms. Currently, the best performer in the industry is Quantinuum with its 99.9% 2-qubit physical gate fidelity on the H1-1 system.
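To see why fidelity this high matters, consider a simple back-of-the-envelope model (an illustration, not a rigorous error model): if each two-qubit gate succeeds with probability F and errors are independent, a circuit of n such gates runs error-free with probability roughly F^n.

```python
# Rough illustration: with per-gate fidelity F, a circuit of n two-qubit
# gates runs without any gate error with probability ~ F**n, assuming
# independent, uncorrelated errors (a simplification).

def circuit_success_probability(gate_fidelity: float, n_gates: int) -> float:
    """Estimate the probability that a circuit sees no gate error."""
    return gate_fidelity ** n_gates

# At 99.9% fidelity, a 1,000-gate circuit still succeeds ~37% of the time;
# at 99.0% fidelity, the same circuit succeeds only ~0.004% of the time.
for f in (0.999, 0.99):
    print(f, circuit_success_probability(f, 1000))
```

The gap between 99.0% and 99.9% is therefore not a tenth of a percent of capability: it is the difference between circuits of hundreds of gates being usable or not.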

Quantum Error Correction (QEC)

QEC encompasses a range of techniques and protocols designed to safeguard quantum information against errors from decoherence and other forms of noise. Effective QEC is paramount for reliable quantum computation: it makes a system resilient by detecting and correcting errors that occur during operation. Its success, however, depends heavily on gate fidelity. Only when the quantum gates themselves operate at benchmark-level fidelity can the error-correction codes be executed reliably enough to preserve the integrity of the encoded quantum information. A recent milestone came from Microsoft, whose QEC methods achieved an 800x reduction in error rates.
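The error-suppression arithmetic behind QEC can be seen in the simplest code of all, the 3-qubit bit-flip repetition code, simulated classically below. Real QEC schemes (surface codes, for example) measure stabilizers rather than reading qubits directly, so this is a sketch of the principle, not of any production protocol.

```python
import random

# Minimal sketch: each logical bit is encoded as three copies, each copy
# flips independently with probability p, and majority vote decodes.
# Decoding fails only when two or more copies flip.

def logical_error_rate(p: float, trials: int = 100_000, seed: int = 0) -> float:
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(3))
        if flips >= 2:  # majority vote decodes to the wrong value
            failures += 1
    return failures / trials

# Theory: logical error = 3*p**2*(1-p) + p**3; for p = 0.01 that is
# ~0.0003, roughly a 33x suppression from a single round of encoding.
print(logical_error_rate(0.01))
```

This also shows why gate fidelity is a prerequisite: the quadratic suppression only helps when the physical error rate p is already small, which is exactly the condition high-fidelity gates provide.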

Quantum Volume

Quantum volume is a holistic measure of a quantum computer's capability that accounts for the number of qubits, their connectivity, and gate fidelity. A higher quantum volume represents a more powerful and capable quantum machine. The metric was introduced by IBM. Quantinuum currently holds the record with a quantum volume of 2^20 = 1,048,576.
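The headline number is easier to read once you know it is defined as a power of two: a machine that reliably runs "square" circuits of n qubits and n layers (with heavy-output probability above 2/3, per IBM's protocol) has quantum volume 2^n, so each extra qubit-and-layer doubles the figure.

```python
import math

# Quantum volume = 2**n, where n is the largest square circuit
# (n qubits, n layers) the machine passes the benchmark on.

def quantum_volume(n: int) -> int:
    return 2 ** n

# Quantinuum's reported record of 1,048,576 corresponds to n = 20.
assert quantum_volume(20) == 1_048_576
print(int(math.log2(1_048_576)))  # -> 20
```

In other words, the record represents passing the benchmark on 20-qubit, 20-layer circuits, not a million-fold raw advantage over a QV-of-one machine.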

Qubit Decoherence Time

Decoherence time measures the duration a qubit retains its quantum state before environmental interference causes it to lose coherence. This metric is important because a longer decoherence time allows a qubit to perform more computational steps accurately, enhancing the overall capability and reliability of quantum operations. Essentially, extending decoherence times is key to executing complex algorithms and scaling up quantum systems for practical applications.
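A rough coherence budget makes the stakes concrete: with coherence time T2 and a fixed gate duration, only about T2 / gate_time sequential gates fit before the qubit's state has substantially decayed (amplitude falling roughly as exp(-t / T2)). The numbers below are illustrative, not taken from any specific device.

```python
import math

# Back-of-the-envelope coherence budget. Illustrative numbers only.

def gate_budget(t2_seconds: float, gate_seconds: float) -> int:
    """How many sequential gates fit inside one coherence time."""
    return round(t2_seconds / gate_seconds)

def remaining_coherence(t_seconds: float, t2_seconds: float) -> float:
    """Fraction of coherence remaining after t seconds (exp decay model)."""
    return math.exp(-t_seconds / t2_seconds)

# Example: T2 = 100 microseconds and a 200 ns two-qubit gate give a
# ~500-gate budget; after 50 gates (10 microseconds) coherence has
# decayed to ~90%.
print(gate_budget(100e-6, 200e-9))
print(round(remaining_coherence(10e-6, 100e-6), 3))
```

Doubling T2 doubles the gate budget directly, which is why decoherence time improvements translate so immediately into deeper runnable circuits.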

QPU Instructions per Second

Quantum Processing Unit (QPU) Instructions per Second measures the rate at which a quantum processor can execute quantum operations. Higher values indicate that a QPU can perform more computations in less time, directly improving the efficiency and throughput of quantum workloads. A high instruction rate can also partially compensate for short decoherence times, since more operations fit into the window before the qubits lose coherence.
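The interplay between instruction rate and coherence can be sketched with a single multiplication; all figures below are made up for the sake of the arithmetic.

```python
# Illustrative only: a faster QPU fits more operations inside one
# coherence window, so a higher instruction rate partly offsets a
# short T2.

def ops_within_coherence(ops_per_second: float, t2_seconds: float) -> int:
    """Operations that fit inside one coherence window."""
    return round(ops_per_second * t2_seconds)

# A 1 MHz gate rate with T2 = 100 microseconds fits ~100 ops per
# coherence window; a 10 MHz gate rate fits ~1,000 ops in the same window.
print(ops_within_coherence(1e6, 100e-6))
print(ops_within_coherence(1e7, 100e-6))
```

This is why the two metrics are best read together: a fast QPU with short-lived qubits and a slow QPU with long-lived qubits can offer a similar effective circuit depth.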

Algorithmic Qubit Efficiency

Algorithmic Qubit Efficiency measures how effectively quantum algorithms use qubits to solve problems. The metric matters because error-corrected logical qubits, which are robust against errors and essential for reliable computation, remain scarce. Using them efficiently lets algorithms achieve their intended outcomes with fewer qubits, maximizing the computational potential of available hardware and extending what current systems can do within the constraints of limited, high-quality qubits.
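The scarcity is easiest to appreciate through the physical cost of a logical qubit. As a rough rule of thumb (exact overheads vary by code and implementation), one surface-code logical qubit of distance d consumes on the order of 2*d^2 physical qubits, so trimming even a few logical qubits from an algorithm saves thousands of physical ones. The sketch below uses that assumed rule of thumb with illustrative numbers.

```python
# Hedged sketch: physical-qubit cost of logical qubits under an assumed
# surface-code overhead of ~2 * d**2 physical qubits per logical qubit.

def physical_qubits_needed(logical_qubits: int, distance: int) -> int:
    return logical_qubits * 2 * distance ** 2

# Trimming an algorithm from 120 to 100 logical qubits at distance
# d = 11 saves 20 * 2 * 11**2 = 4,840 physical qubits.
saved = physical_qubits_needed(120, 11) - physical_qubits_needed(100, 11)
print(saved)
```

This is the sense in which algorithmic efficiency compounds: every logical qubit an algorithm avoids is hundreds of physical qubits freed for other work.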

The KPIs outlined above play a pivotal role in shaping the future of computer science and quantum computing architectures. As quantum technologies evolve, these metrics drive the development and design of more sophisticated and robust quantum hardware and software architectures. By continuously refining these KPIs, researchers and developers will enhance the applicability of quantum systems, enabling practical quantum computing solutions (Quantum Advantage) and paving the way for the next era of Scalable Quantum Computing.