For the first time, scientists at Paderborn University have used high-performance computing (HPC) at large scales to analyze a quantum photonics experiment. Specifically, they performed the tomographic reconstruction of experimental data from a quantum detector, a device that measures individual photons.
The researchers involved developed new HPC software to achieve this. Their findings have now been published in the journal Quantum Science and Technology.
Quantum tomography on a megascale photonic quantum detector
High-resolution photon detectors are increasingly being used for quantum research. Precisely characterizing these devices is crucial if they are to be put to effective use for measurement purposes—and thus far, doing so has been a challenge. This is because it involves huge volumes of data that need to be analyzed without neglecting their quantum mechanical structure.
Suitable tools for processing these data sets are particularly important for future applications. While conventional approaches cannot simulate quantum systems exactly beyond a certain size, Paderborn’s scientists are using high-performance computing for these characterization and certification tasks.
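At its core, detector tomography of a phase-insensitive detector is a constrained linear inversion: probe states with known photon statistics (typically coherent states) are sent into the device, outcome frequencies are recorded, and the detector’s POVM elements are fitted to the data. The sketch below illustrates that general recipe in Python; the Fock cutoff, probe amplitudes, detector model and efficiency are illustrative assumptions, and the code is not the authors’ open-source HPC software.

# A minimal sketch of quantum detector tomography as constrained linear
# inversion, assuming coherent-state probes and a click/no-click detector.
# Cutoff, probe amplitudes and efficiency are illustrative assumptions;
# this is not the authors' HPC implementation.
import numpy as np
from scipy.optimize import nnls
from scipy.special import factorial

M = 30                                  # Fock-space (photon-number) cutoff
alphas = np.linspace(0.1, 4.0, 60)      # amplitudes of the probe coherent states

# F[i, m]: probability that probe i contains m photons (Poisson statistics)
m = np.arange(M)
nbar = alphas[:, None] ** 2             # mean photon number of each probe
F = np.exp(-nbar) * nbar ** m / factorial(m)

# Simulate data for a click detector with efficiency eta:
# p(click | m photons) = 1 - (1 - eta)^m   (diagonal POVM element)
eta = 0.6
theta_true = 1.0 - (1.0 - eta) ** m
counts = np.random.default_rng(0).binomial(10_000, F @ theta_true)
p_click = counts / 10_000               # measured click frequencies per probe

# Reconstruct the POVM diagonal by nonnegative least squares:
# minimize || F @ theta - p_click ||_2  subject to theta >= 0
theta_est, _ = nnls(F, p_click)
print("max deviation from true POVM:", np.abs(theta_est - theta_true).max())

The fitted vector approximates the diagonal of the “click” POVM element in the photon-number basis. At the scales treated in the paper, the same inversion involves far larger matrices and data volumes, which is where high-performance computing comes in.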
“By developing open-source customized algorithms using HPC, we perform quantum tomography on a megascale quantum photonic detector,” explains physicist Timon Schapeler, who authored the paper with computer scientist Dr. Robert Schade and colleagues from PhoQS (Institute for Photonic Quantum Systems) and PC2 (Paderborn Center for Parallel Computing).
PC2, an interdisciplinary research institute at Paderborn University, operates the HPC systems. The center is one of Germany’s national high-performance computing centers and thus stands at the forefront of university-based high-performance computing.
‘Unprecedented scale’
“The findings are opening up entirely new horizons for the size of systems being analyzed in the field of scalable quantum photonics. This has wider implications, for example, for characterizing photonic quantum computer hardware,” Schapeler continues. The researchers were able to perform the calculations describing a photon detector within just a few minutes, faster than ever before.
The system also managed to complete calculations involving huge quantities of data extremely quickly. Schapeler states, “This shows the unprecedented scale on which this tool can be used with quantum photonic systems. As far as we know, our work is the first contribution to the field of traditional high-performance computing enabling experimental quantum photonics at large scales.
“This field will become increasingly important when it comes to demonstrating quantum supremacy in quantum photonic experiments—and on a scale that cannot be calculated by conventional means.”
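One reason detector tomography lends itself to HPC is that the reconstruction decomposes naturally: the POVM element for each measurement outcome can be fitted independently, so the work can be spread over many cores or nodes. The toy sketch below distributes the independent per-outcome fits across processes with Python’s multiprocessing; at true HPC scale this role would be played by MPI ranks, and none of the names or shapes here come from the paper’s software.

# Toy illustration of the parallel structure of detector tomography: the
# POVM element for each outcome is fitted independently, so the columns of
# the outcome-frequency matrix can be farmed out to worker processes.
# Hypothetical sketch, not the authors' open-source code.
import numpy as np
from functools import partial
from multiprocessing import Pool
from scipy.optimize import nnls

def fit_outcome(k, F, P):
    # Nonnegative least-squares fit of the k-th outcome's POVM diagonal
    theta_k, _ = nnls(F, P[:, k])
    return theta_k

def tomography_parallel(F, P, n_workers=4):
    # One independent fit per outcome, distributed across processes
    with Pool(n_workers) as pool:
        cols = pool.map(partial(fit_outcome, F=F, P=P), range(P.shape[1]))
    return np.column_stack(cols)        # rows: photon number, columns: outcomes

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    F = rng.random((60, 30))                  # stand-in probe matrix (assumed shape)
    P = rng.random((60, 8))
    P /= P.sum(axis=1, keepdims=True)         # fake outcome frequencies
    print(tomography_parallel(F, P).shape)    # (30, 8)

Because each fit touches only one column of the data, the problem is embarrassingly parallel, and the wall-clock time shrinks roughly in proportion to the number of workers available.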
Shaping the future with fundamental research
Schapeler is a doctoral student in the “Mesoscopic Quantum Optics” research group headed by Professor Tim Bartley. This team conducts research into the fundamental physics of quantum states of light and their applications. These states consist of tens, hundreds or thousands of photons.
“The scale is crucial, as this illustrates the fundamental advantage that quantum systems hold over conventional ones. There is a clear benefit in many areas, including measurement technology, data processing and communications,” Bartley explains.
More information: Timon Schapeler et al, Scalable quantum detector tomography by high-performance computing, Quantum Science and Technology (2024). DOI: 10.1088/2058-9565/ad8511
Provided by Universität Paderborn