A team of researchers from Japan’s NTT Corporation, the University of Tokyo, and the RIKEN research center has announced the development of a fully photonics-based approach to quantum computing. By taking advantage of the quantum properties of squeezed light sources, the researchers expect their work to pave the way toward faster and easier deployments of quantum computing systems, avoiding many of the practical and scaling pitfalls of other approaches. Furthermore, the team is confident their research can lead to the development of rack-sized, large-scale quantum computing systems that are mostly maintenance-free.
The light-based approach itself brings many advantages over traditional quantum computing architectures, which span a number of designs (trapped ions, silicon quantum dots, and topological superconductors, to name a few). All of these approaches share a physical limitation: they rely on electronic circuits, which suffer from Ohmic heating (the waste heat generated as electrical signals travel through resistive semiconductor wiring). Photonics, by contrast, also enables tremendous improvements in latency, since data travels at the speed of light.
Photonics-based quantum computing takes advantage of emerging quantum properties in light. The technical term here is squeezing: the more squeezed a light source is, the more quantum behavior it demonstrates. While a minimum squeezing level of 65% was previously thought necessary to unlock the required quantum properties, the researchers achieved a higher, 75% squeezing level in their experiments. In practical terms, their quantum system operates over a frequency band wider than 6 THz, reaping the benefits of photonics for quantum computing without reducing the available bandwidth to unusable levels.
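For intuition, squeezing levels quoted as percentages are commonly converted to decibels, where the percentage is read as the fraction by which quantum noise is reduced below the vacuum level. A minimal sketch of that conversion (the formula is the standard dB relation; the specific figures the team used are not detailed in the announcement):

```python
import math

def squeezing_db(fraction):
    """Convert a squeezing level given as a fraction of vacuum-noise
    reduction (e.g. 0.75 for 75%) to the equivalent value in decibels,
    using the standard relation dB = -10 * log10(1 - fraction)."""
    return -10 * math.log10(1 - fraction)

# The previously assumed ~65% threshold versus the 75% level reported:
print(round(squeezing_db(0.65), 2))  # -> 4.56 dB
print(round(squeezing_db(0.75), 2))  # -> 6.02 dB
```

Under this reading, the jump from 65% to 75% squeezing corresponds to crossing the often-cited 6 dB mark.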
The researchers thus expect their photonics-based quantum design to enable easier deployments: there is no need for the exotic temperature controls (essentially sub-zero freezers) usually required to maintain quantum coherence in other systems. Scaling is also simplified: there is no need to increase the number of qubits by interlinking several smaller, coherent quantum computing units. Instead, the number of qubits (and thus the performance of the system) can be increased by continuously dividing light into “time segments” and encoding different information in each of these segments. According to the team, this method allows them to “easily increase the number of qubits on the time axis without increasing the size of the equipment.”
All of these elements combined allow for a reduction in required raw materials while doing away with the complexity of maintaining communication and quantum coherence between multiple small quantum computing units. The researchers will now focus on actually building the photonics-based quantum computer. Given that they estimate their design can scale toward “millions of qubits,” their contributions could enable a revolutionary jump in quantum computation, skipping the expected “long road ahead” toward useful qubit counts.