Researchers in Norway and Denmark have developed a new method to measure how quickly quantum computers lose information, a key obstacle in building stable systems. The study, led by the Norwegian University of Science and Technology and the Niels Bohr Institute, reduces measurement time from about one second to roughly 10 milliseconds. Scientists say the breakthrough allows near real-time tracking of qubit instability, helping identify the causes of information loss.
In the race to build practical quantum computers, one problem keeps resurfacing: memory that fades before it can be used.
At the center of the issue are qubits, the quantum equivalent of bits, which store and process information in quantum systems. Unlike classical bits, qubits are highly sensitive to their environment. That sensitivity allows powerful computation, but it also means information can vanish quickly.
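As background (standard quantum mechanics, not something specific to this study), a qubit's state can be written as a weighted combination of the two classical values, and any uncontrolled interaction with the environment disturbs the weights that carry the information:

$$ |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1 $$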
Researchers at the Norwegian University of Science and Technology (NTNU), a public research university in Trondheim, have been working on a way to measure that loss more precisely. Their latest work, carried out with an international team led by the Niels Bohr Institute, a physics research center in Copenhagen, introduces a faster method to track how long qubits retain information.
New measurement method tracks qubit instability in real time
One of the persistent challenges in quantum computing is not just that information disappears, but that the timing of that loss is unpredictable.
“In quantum computers, information is transmitted and stored using so-called qubits,” said Jeroen Danon, a professor in the Department of Physics at NTNU. “But quantum information can quickly be lost.”
In widely used superconducting qubits, a leading architecture for current quantum processors, the average time before information decays can appear stable. Yet researchers have observed that this timescale fluctuates, sometimes significantly, over short periods.
Until now, measuring that fluctuation has been difficult. Existing methods typically required about one second to determine how long a qubit retains information. In quantum systems, where processes unfold at extremely small timescales, that delay limits visibility into rapid changes.
The new technique reduces that measurement time to approximately 10 milliseconds, a roughly hundredfold speedup that enables what researchers describe as near real-time monitoring of qubit behavior.
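To make the idea concrete, here is a minimal, purely illustrative Python sketch of how a decay time (often labeled T1) can be estimated from repeated measurements and then re-estimated in a loop to follow its fluctuations. The delay values, shot counts, and simple exponential fit are assumptions for illustration only; they are not the protocol described in the paper.

```python
# Illustrative sketch only: simulates how a fluctuating decay time (T1) could be
# tracked by repeating a fast estimate. The numbers and the estimation routine
# are assumptions for illustration, not the NTNU / Niels Bohr Institute method.
import numpy as np

rng = np.random.default_rng(0)

def estimate_t1(true_t1_us, delays_us, shots=50):
    """Estimate T1 from excited-state survival probabilities at a few delays."""
    # Probability the qubit is still excited after each delay: p = exp(-t / T1)
    p_survive = np.exp(-delays_us / true_t1_us)
    # Simulate shot noise: each delay is sampled 'shots' times
    counts = rng.binomial(shots, p_survive)
    p_est = np.clip(counts / shots, 1e-3, 1 - 1e-3)
    # A linear fit of log(p) versus delay gives -1/T1 as the slope
    slope, _ = np.polyfit(delays_us, np.log(p_est), 1)
    return -1.0 / slope

delays = np.array([5.0, 20.0, 50.0, 100.0])   # microseconds
# A slowly drifting "true" T1 that wanders between roughly 40 and 80 microseconds
for step in range(10):
    true_t1 = 60 + 20 * np.sin(step / 2) + rng.normal(0, 3)
    print(f"step {step}: estimated T1 ~ {estimate_t1(true_t1, delays):.1f} us "
          f"(true {true_t1:.1f} us)")
```

The point of the sketch is only the cadence: if each estimate takes about 10 milliseconds instead of a second, the loop can keep pace with drifts that would otherwise be averaged away.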
With faster measurements, scientists can now observe subtle and short-lived variations in how qubits lose information. These variations were previously difficult to detect, leaving gaps in understanding the underlying physics.
It looks like a chandelier, but it's actually a sample holder placed at the bottom of a supercooled research machine at the Niels Bohr Institute at the University of Copenhagen. This is where the quantum computer itself is tested.
Credit: Quantum Machines
Findings could reshape quantum processor calibration
The ability to track information loss as it happens opens new possibilities for improving quantum systems.
By capturing rapid fluctuations, researchers can begin to identify the microscopic processes that cause qubits to lose coherence, the property that allows them to hold quantum information. Understanding these processes is essential for designing more stable quantum hardware.
Danon said the new method changes how scientists can calibrate and test quantum processors. Instead of relying on slower, averaged measurements, engineers can now adjust systems based on more immediate feedback.
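As a rough illustration of what such feedback might look like in software, the hypothetical loop below re-measures a coherence-time estimate on a fast cadence and triggers a re-calibration step when it dips. The function names, threshold, and timing are placeholders, not a real quantum-control API.

```python
# Hypothetical sketch of feedback-based calibration: if a fast coherence-time
# estimate drops below a threshold, trigger a re-tune of the qubit.
# fast_t1_estimate() and retune_qubit() are placeholders, not a real control stack.
import random
import time

T1_THRESHOLD_US = 50.0     # assumed acceptable coherence time, in microseconds

def fast_t1_estimate():
    """Placeholder for a ~10 ms coherence-time measurement."""
    return random.uniform(30.0, 90.0)

def retune_qubit():
    """Placeholder for re-running a calibration routine on the processor."""
    print("  -> re-calibrating qubit parameters")

for cycle in range(5):
    t1 = fast_t1_estimate()
    print(f"cycle {cycle}: T1 estimate {t1:.1f} us")
    if t1 < T1_THRESHOLD_US:
        retune_qubit()
    time.sleep(0.01)   # roughly the 10 ms measurement cadence mentioned above
```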
That shift could influence how next-generation quantum devices are built and optimized. It may also help researchers compare different qubit designs and materials under more precise conditions.
Public discussion around quantum computing often reflects both optimism and caution. In a Reddit thread focused on quantum hardware challenges, user "QubitQuest" (1,200 upvotes) wrote, "The promise is huge, but every small instability like this shows how far we still have to go."
A step toward more reliable quantum computers
Quantum computers are still in an experimental phase, with most systems struggling to maintain stable qubits for extended periods. The phenomenon of information loss, often referred to as decoherence, remains one of the central barriers to scaling up the technology.
Researchers say the new measurement approach does not eliminate the problem, but it provides a clearer lens to study it. With better data, scientists can refine models, test new materials, and explore error-correction techniques more effectively.
The collaboration between NTNU and the Niels Bohr Institute reflects a broader international effort to tackle these challenges. Quantum research groups across Europe, the United States, and Asia are working on similar issues, often focusing on different aspects of qubit design and control.
For now, the breakthrough offers a practical improvement. Measuring qubit memory loss faster means understanding it better. And understanding it better is a necessary step toward building quantum computers that can operate reliably outside the lab.