Quantum computing has made significant strides over the past five years, but we still have a long way to go before achieving scalable "utility quantum computing."
This refers to a state where quantum computers have thousands of high-quality qubits, making them robust enough to quickly solve complex problems that would take classical computers centuries to handle.
Quantum computers are highly complex and extremely sensitive to external disturbances, or "noise," such as heat, magnetic fields, cosmic radiation, and stray light. These factors disrupt the delicate quantum environment, leading to errors and decoherence of the quantum state. To minimize errors, a significant portion of a quantum computer's hardware is dedicated to protecting its qubits, allowing the quantum state to persist as long as possible.
Error correction in quantum computing is a major challenge. While error protection is common in everyday technologies like telecoms and data centers, quantum error correction is far more complex; it has been likened to juggling soot while herding cats. Logical qubits, sets of physical qubits operating together as a single protected unit, offer a potential solution, but constructing and managing them is extremely difficult.
A single logical qubit might require around 1,000 or more physical qubits, with most dedicated to identifying and correcting errors in real time, leaving only a small fraction of the hardware for actual computation. This leads to significant overhead and energy consumption.
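The core idea behind a logical qubit, encoding one unit of information redundantly across many physical units so that errors can be detected and corrected, can be illustrated with a classical analogy. The sketch below is a simple 3-bit repetition code with majority-vote decoding; it is not real quantum error correction (which must also protect phase information without directly measuring the state), but it shows how redundancy trades overhead for a lower logical error rate. All function names here are illustrative, not from any quantum library.

```python
import random

def encode(bit):
    # Encode one logical bit as three physical bits (3-bit repetition code).
    return [bit, bit, bit]

def apply_noise(codeword, p, rng):
    # Flip each physical bit independently with probability p,
    # a toy stand-in for physical-qubit errors.
    return [b ^ 1 if rng.random() < p else b for b in codeword]

def decode(codeword):
    # Majority vote recovers the logical bit whenever at most
    # one of the three physical bits was flipped.
    return 1 if sum(codeword) >= 2 else 0

def logical_error_rate(p, trials, seed=0):
    # Estimate how often the decoded logical bit is wrong.
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        bit = rng.randint(0, 1)
        if decode(apply_noise(encode(bit), p, rng)) != bit:
            errors += 1
    return errors / trials
```

With a per-bit error probability of p = 0.05, the logical error rate falls to roughly 3p², under one percent, because two or more bits must flip before the majority vote fails. Real quantum codes follow the same logic at far greater cost, which is why a single logical qubit can consume on the order of a thousand physical qubits.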
Efforts to overcome the problem of quantum error correction are ongoing, as is the race to scale quantum computers up to thousands of qubits while maintaining high coherence and minimizing error rates. Additionally, with quantum and classical computers coexisting, the focus is on optimizing data transfer between the two technologies, enabling practical, complementary applications.
To achieve this, standards and protocols for hardware, software, applications, and communication interfaces need to be developed to facilitate interoperability between different quantum computing platforms. Benchmarking standards will also be essential for measuring and comparing the performance of quantum computers.
Lawrence Gasman, a quantum computing expert, highlighted several other challenges in a recent interview. One major hurdle is the lack of a unified approach to developing scalable, fault-tolerant qubit control technology, given the diversity of qubit technologies in use. The software side also faces difficulties: new programming languages and compilers need to be developed, and quantum algorithms remain in their infancy.
Furthermore, the quantum computing field suffers from a shortage of trained quantum scientists and engineers globally, adding to the immense costs of the enterprise. However, despite these challenges, Gasman remains optimistic about the increasing number of applications emerging from quantum computing.
He cited drug discovery, materials design, quantum chemistry, and financial services as sectors where quantum computers are making progress. As quantum devices move from hundreds to thousands of qubits, they become capable of handling highly advanced tasks, finding near-optimal solutions by simulating billions of system scenarios.
Gasman believes that as the cost of quantum computing decreases, the technology will become more accessible to smaller organizations. Eventually, we may see the emergence of end-user mini-quantum computers, heralding a transformative era much sooner than anticipated.