Quantum Computing
IBM to Build World’s First Large-Scale, Fault-Tolerant Quantum Computer – Starling
IBM is building the world’s first large-scale quantum computer designed to operate without errors. The computer, called Starling, is set to launch by 2029. It will reside in IBM’s new quantum data center in Poughkeepsie, New York, and is expected to perform 20,000 times more operations than today’s quantum computers, the company said in its announcement Tuesday.
Starling will be “fault tolerant,” meaning it will be able to perform quantum operations for tasks like drug discovery, supply chain optimization, semiconductor design, and financial risk analysis without the errors that plague today’s quantum computers and make them less useful than traditional computers.
Representing the computational state of Starling would require the memory of more than a quindecillion (10⁴⁸) of the world’s most powerful supercomputers. With Starling, users will be able to fully explore the complexity of its quantum states, which lie beyond the limited properties accessible to current quantum computers.
IBM, which already operates a large, global fleet of quantum computers, is releasing a new Quantum Roadmap that outlines its plans to build out a practical, fault-tolerant quantum computer.
“IBM is charting the next frontier in quantum computing,” said Arvind Krishna, Chairman and CEO, IBM. “Our expertise across mathematics, physics, and engineering is paving the way for a large-scale, fault-tolerant quantum computer — one that will solve real-world challenges and unlock immense possibilities for business.”
A large-scale, fault-tolerant quantum computer with hundreds or thousands of logical qubits could run hundreds of millions to billions of operations, which could accelerate time and cost efficiencies in fields such as drug development, materials discovery, chemistry, and optimization.
Starling will be able to access the computational power required for these problems by running 100 million quantum operations using 200 logical qubits. It will be the foundation for IBM Quantum Blue Jay, which will be capable of executing 1 billion quantum operations over 2,000 logical qubits.
A logical qubit is a unit of an error-corrected quantum computer tasked with storing one qubit’s worth of quantum information. It is made from multiple physical qubits working together to store this information and monitor each other for errors.
Like classical computers, quantum computers need to be error corrected to run large workloads without faults. To do so, clusters of physical qubits are used to create a smaller number of logical qubits with lower error rates than the underlying physical qubits. Logical qubit error rates are suppressed exponentially with the size of the cluster, enabling them to run greater numbers of operations.
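To make that scaling concrete, here is a minimal sketch of the textbook below-threshold suppression law used throughout the field; it is a generic model, not IBM’s published numbers, and the threshold, prefactor, and physical error rate are hypothetical:

```python
# Minimal sketch of the textbook below-threshold scaling law (not IBM's
# published model): for a code of distance d, the logical error rate is
# roughly  p_logical ~ A * (p_physical / p_threshold) ** ((d + 1) / 2),
# so errors are suppressed exponentially as the qubit cluster grows.
# The threshold, prefactor, and physical error rate here are hypothetical.

def logical_error_rate(p_physical, distance, p_threshold=1e-2, prefactor=0.1):
    """Approximate logical error rate of a distance-d code below threshold."""
    return prefactor * (p_physical / p_threshold) ** ((distance + 1) / 2)

p = 1e-3  # hypothetical physical error rate, 10x below threshold
for d in (3, 5, 7, 9, 11):
    print(f"distance {d:2d}: logical error rate ~ {logical_error_rate(p, d):.1e}")
```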
Creating growing numbers of logical qubits capable of executing quantum circuits, with as few physical qubits as possible, is critical to quantum computing at scale. Until now, no clear path to building such a fault-tolerant system without unrealistic engineering overhead had been published.
The Path to Large-Scale Fault Tolerance:
The success of executing an efficient fault-tolerant architecture is dependent on the choice of its error-correcting code, and how the system is designed and built to enable this code to scale.
Alternative and previous gold-standard error-correcting codes present fundamental engineering challenges. To scale, they would require an unfeasible number of physical qubits to create enough logical qubits to perform complex operations, necessitating impractical amounts of infrastructure and control electronics. This makes them unlikely to be implementable beyond small-scale experiments and devices.
A practical, large-scale, fault-tolerant quantum computer requires an architecture that is:
- Fault-tolerant to suppress enough errors for useful algorithms to succeed.
- Able to prepare and measure logical qubits throughout computation.
- Capable of applying universal instructions to these logical qubits.
- Able to decode measurements from logical qubits in real time and alter subsequent instructions.
- Modular to scale to hundreds or thousands of logical qubits to run more complex algorithms.
- Efficient enough to execute meaningful algorithms with realistic physical resources, such as energy and infrastructure.
Today, IBM is introducing two new technical papers that detail how it will solve the above criteria to build a large-scale, fault-tolerant architecture.
The first paper unveils how such a system will process instructions and run operations effectively with qLDPC codes. This work builds on a groundbreaking approach to error correction featured on the cover of Nature that introduced quantum low-density parity check (qLDPC) codes. This code drastically reduces the number of physical qubits needed for error correction and cuts required overhead by approximately 90 percent, compared to other leading codes. Additionally, it lays out the resources required to reliably run large-scale quantum programs to prove the efficiency of such an architecture over others.
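As a rough illustration of where a figure like 90 percent can come from, the back-of-envelope comparison below uses the qubit counts reported for the qLDPC “gross” code in that Nature work (12 logical qubits encoded in 288 physical qubits) against a textbook estimate of roughly 2d² physical qubits per surface-code logical qubit; the exact savings depend on which codes are compared:

```python
# Back-of-envelope overhead comparison (illustrative; the exact savings
# depend on which codes are compared). The bivariate-bicycle "gross" qLDPC
# code reported in Nature encodes 12 logical qubits in 288 physical qubits;
# a distance-12 surface code needs roughly 2 * d**2 physical qubits for a
# single logical qubit.

d = 12
surface_per_logical = 2 * d ** 2                  # ~288 physical per logical
qldpc_per_logical = 288 / 12                      # 24 physical per logical

savings = 1 - qldpc_per_logical / surface_per_logical
print(f"surface code:  ~{surface_per_logical} physical qubits per logical qubit")
print(f"qLDPC (gross): ~{qldpc_per_logical:.0f} physical qubits per logical qubit")
print(f"overhead cut:  ~{savings:.0%}")           # ~92%, in line with "~90%"
```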
The second paper describes how to efficiently decode the information from the physical qubits and charts a path to identify and correct errors in real-time with conventional computing resources.
From Roadmap to Reality:
The new IBM Quantum Roadmap outlines the key technology milestones that will demonstrate and execute the criteria for fault tolerance. Each new processor in the roadmap addresses specific challenges to build quantum computers that are modular, scalable, and error-corrected:
- IBM Quantum Loon, expected in 2025, is designed to test architecture components for the qLDPC code, including “C-couplers” that connect qubits over longer distances within the same chip.
- IBM Quantum Kookaburra, expected in 2026, will be IBM’s first modular processor designed to store and process encoded information. It will combine quantum memory with logic operations — the basic building block for scaling fault-tolerant systems beyond a single chip.
- IBM Quantum Cockatoo, expected in 2027, will entangle two Kookaburra modules using “L-couplers.” This architecture will link quantum chips together like nodes in a larger system, avoiding the need to build impractically large chips.
Media Contacts
Erin Angelini, IBM Communications
[email protected]
Brittany Forgione, IBM Communications
[email protected]
References:
IBM’s path to scaling fault tolerance: read the IBM blog here, and watch IBM Quantum scientists in this latest video
Bloomberg on Quantum Computing: appeal, who’s building them, how does it work?
Google’s new quantum computer chip Willow infinitely outpaces the world’s fastest supercomputers
Ultra-secure quantum messages sent a record distance over a fiber optic network
Quantum Computers and Qubits: IDTechEx report; Alice & Bob whitepaper & roadmap
China Mobile verifies optimized 5G algorithm based on universal quantum computer
Ultra-secure quantum messages sent a record distance over a fiber optic network
Unlike digital communications based on binary bits, quantum information is transmitted in qubits, which can hold multiple values at once and betray any attempt at interception, making quantum communications more secure. A recently published article in Nature reports that scientists have sent quantum information across a record-breaking 158 miles using ordinary computers and fiber-optic cables. It is the first time coherent quantum communication, an ultra-secure means of transmitting data, has been achieved using existing telecommunications infrastructure, without the expensive cryogenic cooling that is typically required.
“Our equipment was running alongside the fibers that we use for regular communication, literally buried underneath the roads and train stations,” said Mirko Pittaluga, a physicist and lead author of the study. Pittaluga and his colleagues at Toshiba Europe sent quantum information from regular computers hooked into the telecommunications network at data centers in the German cities of Kehl and Frankfurt, relaying it through a detector at a third data center roughly midway between them in Kirchfeld. The three-location setup enabled the group to extend the distance the messages were sent to more than 150 miles, a span previously achieved only in laboratory environments.
Pittaluga said that his team’s work is critical to solving the problem of keeping sensitive data out of the reach of hackers. One means of fixing this problem, Pittaluga said, is through quantum cryptography, which relies on the physics of quantum mechanics rather than mathematical algorithms to generate encryption keys. But to use quantum encryption keys, you have to successfully distribute them across meaningful distances, a task that has stymied researchers outside the lab for decades.
Quantum data was sent over an ordinary telecom network with fiber-optic cables. © Julie Sebadelha/Agence France-Presse/Getty Images
Integrating the technology into existing infrastructure using largely off-the-shelf equipment is a key step in expanding the accessibility of quantum communication and its use in encrypting information for more secure transmission of data, according to multiple physicists and engineers who weren’t involved in the study.
“This is about as real-world as one could imagine,” said David Awschalom, a professor of physics and molecular engineering at the University of Chicago who wasn’t a part of the new work. “It’s an impressive, quite beautiful demonstration.” Working at these types of distances, Awschalom said, means that quantum information could be sent across entire metropolitan areas or between nearby cities, making it useful for hospitals, banks and other institutions, for which secure communications are paramount.
“The likelihood of them being able to reverse engineer a quantum key, which is the number you would need to decrypt your information, is vanishingly small,” according to Awschalom.
Other groups in the U.K. and U.S., including researchers at the University of Pennsylvania, are also working on extending the distances achievable by quantum communication.
Today, bank statements, health records and other data transmitted online are protected using mathematically formulated encryption keys. These keys are the only means of unlocking the data, keeping it secure from cyber thieves. For conventional computers, breaking these keys takes an impractically long time, but quantum computers are up to the task, and as they become more powerful, encryption keys become vulnerable to attack.
“Anything meaningful that’s over the internet can be tapped, recorded and saved for the next decade, and can be decrypted years later,” according to Prem Kumar, a professor of electrical and computer engineering at Northwestern University, who wasn’t a part of the new work. “It’s what’s called harvest now, decrypt later.”
Internet and telecommunications infrastructure around the world is based on optical fibers that carry pulses of light. Classical bits of information are sent as impulses of light carrying tens of millions of photons each. Quantum information, stored in qubits, is sent in a single photon.
Efficiently detecting single photons usually requires expensive superconducting detectors that cost on the order of hundreds of thousands of dollars. These high-efficiency sensors must be cryogenically cooled, using liquid helium, to super low temperatures below minus 454 degrees Fahrenheit, making the technology expensive and incompatible with existing infrastructure.
Pittaluga and his colleagues at Toshiba got around this by using cheaper detectors known as avalanche photodiodes, which cost just thousands of dollars and can run at or just below room temperature, like today’s traditional internet equipment.
Such detectors hadn’t been used for coherent quantum communication before, as they can be nearly an order of magnitude less efficient at detecting single photons and are affected by what is called the afterpulse effect—when the current detection is frustrated by leftover echoes from an earlier transmission. Superconducting detectors aren’t affected by afterpulsing, Pittaluga said.
To address the effect in the more practical and cost-effective photodiodes, his group employed two separate sets of the detectors, using one to read the signal and the other to remove the environmental noise from that signal. The goal of this setup is to bring us one step closer to a quantum internet, with incredibly secure information, Pittaluga added.
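The paper’s actual signal processing is more involved, but the idea of pairing a signal detector with a reference detector can be sketched in a few lines; all the click rates and noise levels below are invented for illustration:

```python
import numpy as np

# Conceptual sketch only, not Toshiba's actual signal chain: one avalanche
# photodiode channel records signal plus environmental noise, while a
# second, matched channel sees (approximately) the same noise and no
# signal. Subtracting the reference channel recovers the sparse
# single-photon clicks. All rates and noise levels here are invented.

rng = np.random.default_rng(0)
n = 10_000
clicks = (rng.random(n) < 0.05).astype(float)              # hypothetical photon clicks
shared_noise = rng.normal(0.0, 0.2, n)                     # environmental noise

signal_channel = clicks + shared_noise                     # detector set 1
reference_channel = shared_noise + rng.normal(0, 0.02, n)  # detector set 2

recovered = signal_channel - reference_channel             # common-mode noise removed
print("noise std before subtraction:", shared_noise.std().round(3))
print("residual error after subtraction:", (recovered - clicks).std().round(3))
```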
Yet despite this innovation, the technology remains expensive and difficult to implement compared with current encryption systems and networks—for now. “My personal view is that we’ll be seeing quantum encryption of data sets and metropolitan-scale quantum networks within a decade,” Awschalom added.
……………………………………………………………………………………………………………………………………………………………………………..
Why quantum computers are faster at solving problems:
Quantum computers can outpace traditional computers on certain optimization and search problems, such as finding the most efficient options for supply chains.
Loosely speaking, a traditional computer tries each combination one at a time, while a quantum computer can explore many combinations at once in superposition (see the sketch below).
Source: Google Quantum AI
Peter Champelli/THE WALL STREET JOURNAL
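A more careful way to state the speed-up, for unstructured search at least, is in query counts: a classical computer needs on the order of N checks to find one marked item among N candidates, while Grover’s algorithm needs on the order of √N quantum queries. Here is a quick sketch of that textbook comparison; the sidebar’s supply-chain framing maps onto it only loosely:

```python
import math

# Query-count comparison for unstructured search, the textbook result
# behind claims like the sidebar's (real supply-chain problems have
# structure, so the mapping is loose). Classical search over N candidates
# needs on the order of N checks; Grover's algorithm needs ~(pi/4)*sqrt(N).

for n_bits in (10, 20, 30):
    N = 2 ** n_bits
    classical_checks = N // 2                        # average-case brute force
    grover_queries = math.ceil(math.pi / 4 * math.sqrt(N))
    print(f"{n_bits}-bit search space (N={N:,}): "
          f"classical ~{classical_checks:,}, Grover ~{grover_queries:,}")
```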
………………………………………………………………………………………………………………………………………………………………………..
References:
https://www.nature.com/articles/s41586-025-08801-w
Google’s new quantum computer chip Willow infinitely outpaces the world’s fastest supercomputers
Quantum Computers and Qubits: IDTechEx report; Alice & Bob whitepaper & roadmap
Bloomberg on Quantum Computing: appeal, who’s building them, how does it work?
SK Telecom and Thales Trial Post-quantum Cryptography to Enhance Users’ Protection on 5G SA Network
Research on quantum communications using a chain of synchronously moving satellites without repeaters
Google’s new quantum computer chip Willow infinitely outpaces the world’s fastest supercomputers
Overview:
In a blog post on Monday, Google unveiled a new quantum computer chip called Willow, which demonstrates error correction and performance that paves the way to a useful, large-scale quantum computer. Willow has state-of-the-art performance across a number of metrics, enabling two major achievements.
- The first is that Willow can reduce errors exponentially as the number of qubits scales up, cracking a key challenge in quantum error correction that the field has pursued for almost 30 years (a sketch of this scaling follows this list).
- Second, Willow performed a standard benchmark computation in under five minutes that would take one of today’s fastest supercomputers 10 septillion (that is, 10²⁵) years, a number that vastly exceeds the age of the Universe.
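For a sense of what “reduce errors exponentially” means here, the sketch below plays out the suppression-factor arithmetic Google reports: the logical error rate falls by a factor Λ of roughly 2 each time the surface-code grid grows from d×d to (d+2)×(d+2). The starting error rate and the exact factor are illustrative:

```python
# Sketch of the suppression arithmetic behind Willow's exponential error
# reduction. Google reports that each step up in surface-code distance
# (3x3 -> 5x5 -> 7x7 grids of qubits) cuts the logical error rate by a
# factor Lambda of roughly 2. The starting rate below is hypothetical.

LAMBDA = 2.14            # assumed suppression factor per distance step
p_logical_d3 = 3e-3      # hypothetical logical error rate at distance 3

for d in (3, 5, 7, 9, 11):
    steps = (d - 3) // 2
    rate = p_logical_d3 / LAMBDA ** steps
    print(f"distance {d:2d} ({d}x{d} data-qubit grid): ~{rate:.2e} per cycle")
```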
Google’s quantum computer chip, Willow. Photo Credit: Google Quantum AI
………………………………………………………………………………………………………………………………………………………………………………………………………………………..
Quantum computing — the result of decades of research into a type of physics called quantum mechanics — is still an experimental technology. But Google’s achievement shows that scientists are steadily improving techniques that could allow quantum computing to live up to the enormous expectations that have surrounded this big idea for decades.
“When quantum computing was originally envisioned, many people — including many leaders in the field — felt that it would never be a practical thing,” said Mikhail Lukin, a professor of physics at Harvard and a co-founder of the quantum computing start-up QuEra. “What has happened over the last year shows that it is no longer science fiction.”
As a measure of Willow’s performance, Google used the random circuit sampling (RCS) benchmark. Pioneered by Google’s team and now widely used as a standard in the field, RCS is the benchmark that is hardest for classical computers yet feasible on a quantum computer today. Think of it as an entry point for quantum computing: it checks whether a quantum computer is doing something that could not be done on a classical computer.
Random circuit sampling (RCS), while extremely challenging for classical computers, has yet to demonstrate practical commercial applications. Image Credit: Google AI.
………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………..
Willow’s performance on this benchmark is astonishing: it performed a computation in under five minutes that would take one of today’s fastest supercomputers 10²⁵, or 10 septillion, years. Written out, that is 10,000,000,000,000,000,000,000,000 years. This mind-boggling number exceeds known timescales in physics and vastly exceeds the age of the universe. It lends credence to the notion that quantum computation occurs in many parallel universes, in line with the idea that we live in a multiverse, a prediction first made by David Deutsch.
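For readers curious what “sampling a random circuit” actually involves, here is a toy version on four qubits: apply layers of random gates, compute the full statevector, and sample bitstrings from it. This is exactly the classical simulation that becomes hopeless at scale, since the statevector doubles with every added qubit; the circuit structure below is a generic sketch, not Google’s circuit family:

```python
import numpy as np

# Toy random-circuit-sampling (RCS) sketch: layers of random single-qubit
# gates and CZ entangling gates, simulated via the full statevector, then
# sampled. This is a generic illustration, not Google's circuit family.
# The classical cost is the statevector itself: 2**n amplitudes, which is
# why RCS at around a hundred qubits defeats classical supercomputers.

rng = np.random.default_rng(42)
n = 4                                  # 2**4 = 16 amplitudes; easy classically
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                         # start in |0000>

def apply_1q(state, gate, q):
    """Apply a 2x2 unitary to qubit q of an n-qubit statevector."""
    psi = state.reshape([2] * n)
    psi = np.moveaxis(np.tensordot(gate, np.moveaxis(psi, q, 0), axes=1), 0, q)
    return psi.reshape(-1)

def apply_cz(state, q1, q2):
    """Apply a controlled-Z between qubits q1 and q2."""
    psi = state.copy()
    for idx in range(psi.size):
        if (idx >> (n - 1 - q1)) & 1 and (idx >> (n - 1 - q2)) & 1:
            psi[idx] *= -1.0
    return psi

def random_unitary_2x2(rng):
    """Haar-random 2x2 unitary via QR decomposition of a Gaussian matrix."""
    m = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, r = np.linalg.qr(m)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

for layer in range(8):                 # 8 layers of random gates
    for q in range(n):
        state = apply_1q(state, random_unitary_2x2(rng), q)
    for q in range(layer % 2, n - 1, 2):   # alternate CZ pairings per layer
        state = apply_cz(state, q, q + 1)

probs = np.abs(state) ** 2
probs /= probs.sum()                   # guard against float rounding
samples = rng.choice(2 ** n, size=5, p=probs)
print("sampled bitstrings:", [format(int(s), f"0{n}b") for s in samples])
```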
Google’s assessment of how Willow outpaces one of the world’s most powerful classical supercomputers, Frontier, was based on conservative assumptions. For example, Google assumed full access to secondary storage, i.e., hard drives, without any bandwidth overhead, a generous and unrealistic allowance for Frontier. As happened after Google announced the first beyond-classical computation in 2019, classical computers are expected to keep improving on this benchmark, but the rapidly growing gap shows that quantum processors are pulling away at a double-exponential rate and will continue to vastly outperform classical computers as they scale up.
In a research paper published on Monday in the science journal Nature, Google said its machine had surpassed the “error correction threshold,” a milestone that scientists have been working toward for decades. That means quantum computers are on a path to a moment, still well into the future, when they can overcome their mistakes and perform calculations that could accelerate the progress of drug discovery. They could also break the encryption that protects computers vital to national security.
“What we really want these machines to do is run applications that people really care about,” said John Preskill, a theoretical physicist at the California Institute of Technology who specializes in quantum computing. “Though it still might be decades away, we will eventually see the impact of quantum computing on our everyday lives.”
Sidebar –Quantum Computing Explained:
A traditional computer like a laptop or a smartphone stores numbers in semiconductor memories or registers and then manipulates those numbers, adding them, multiplying them and so on. It performs these calculations by processing “bits” of information. Each bit holds either a 1 or a 0. But a quantum computer defies common sense. It relies on the mind-bending ways that some objects behave at the subatomic level or when exposed to extreme cold, like the exotic metal that Google chills to nearly 460 degrees below zero inside its quantum computer.
Quantum bits, or “qubits,” behave very differently from normal bits. A single object can behave like two separate objects at the same time when it is either extremely small or extremely cold. By harnessing that behavior, scientists can build a qubit that holds a combination of 1 and 0. This means that two qubits can hold four values at once. And as the number of qubits grows, a quantum computer becomes exponentially more powerful. Google builds “superconducting qubits,” where certain metals are cooled to extremely low temperatures.
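That exponential growth is easy to see in code: an n-qubit state is described by 2ⁿ complex amplitudes, so simulating even a modest quantum computer exhausts classical memory. The qubit counts below are illustrative, and Willow’s size (reportedly around 105 qubits) is an assumption not stated in this article:

```python
import numpy as np

# Why qubit count matters for simulation: an n-qubit state is a vector of
# 2**n complex amplitudes. Two qubits -> four values, as described above;
# a chip on Willow's scale (reportedly ~105 qubits) would need 2**105
# amplitudes, far beyond any classical memory.

for n in (2, 10, 50, 105):
    amplitudes = 2 ** n
    print(f"{n:3d} qubits -> {amplitudes:.3g} amplitudes "
          f"(~{amplitudes * 16:.3g} bytes at complex128)")

# A concrete 2-qubit state holding all four values 00, 01, 10, 11 at once:
state = np.full(4, 0.5)                # equal amplitudes; |a|^2 sums to 1
print({format(i, "02b"): abs(a) ** 2 for i, a in enumerate(state)})
```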
Many other tech giants, including Microsoft, Intel and IBM, are building similar quantum technology as the United States jockeys with China for supremacy in this increasingly important field. As the United States has pushed forward, primarily through corporate giants and start-up companies, the Chinese government has said it is pumping more than $15.2 billion into quantum research.
With its latest superconducting computer, Google has claimed “quantum supremacy,” meaning it has built a machine capable of tasks that are beyond what any traditional computer can do. But these tasks are esoteric. They involve generating random numbers that can’t necessarily be applied to practical applications, like drug discovery.
Google and its rivals are still working toward what scientists call “quantum advantage,” when a quantum computer can accelerate the progress of other fields like chemistry and artificial intelligence or perform tasks that businesses or consumers find useful. The problem is that quantum computers still make too many errors.
Scientists have spent nearly three decades developing techniques — which are mind-bending in their own right — for getting around this problem. Now, Google has shown that as it increases the number of qubits, it can exponentially reduce the number of errors through complex analysis.
Experts believe it is only a matter of time before a quantum computer reaches its vast potential. “People no longer doubt it will be done,” Dr. Lukin said. “The question now is: When?”
References:
https://blog.google/technology/research/google-willow-quantum-chip/
https://www.nytimes.com/2024/12/09/technology/google-quantum-computing.html
Quantum Computers and Qubits: IDTechEx report; Alice & Bob whitepaper & roadmap
Bloomberg on Quantum Computing: appeal, who’s building them, how does it work?
China Mobile verifies optimized 5G algorithm based on universal quantum computer
SK Telecom and Thales Trial Post-quantum Cryptography to Enhance Users’ Protection on 5G SA Network
Quantum Technologies Update: U.S. vs China now and in the future
Can Quantum Technologies Crack RSA Encryption as China Researchers Claim?
Quantum Computers and Qubits: IDTechEx report; Alice & Bob whitepaper & roadmap
Introduction:
In the last decade, the number of companies actively developing quantum computer hardware has quadrupled. Between 2022 and 2024, multiple funding rounds surpassing US$100 million were closed, and the transition from lab-based toys to commercial products began. Competition is building in the quantum computing market, not only between companies but between quantum computing technologies. Focus has intensified on the need for logical, or error-corrected, qubits [1]. The challenge ahead is to scale up hardware and increase qubit numbers while reducing both errors and infrastructure demand. Leaders today have between 1 and 50 logical qubits; thousands are likely needed to provide a meaningful advantage over classical computing alternatives.
Note 1. Quantum computing is based on the use of qubits, the quantum equivalent of classical bits, but the architectures available to create them vary substantially. Many are now familiar with IBM and its superconducting qubits, housed inside large cryostats and cooled to temperatures colder than deep space. Indeed, in 2023 superconducting quantum computers broke the 1,000-qubit milestone, with smaller systems made accessible via the cloud for companies to trial their problems.
However, many agree that the highest value problems – such as drug discovery – need many more qubits, perhaps millions more. As such, alternatives to the superconducting design, many proposing more inherent scalability, have received investment. There are now more than eight technology approaches meaningfully competing to reach the million-qubit milestone.
The quantum computing hardware market today has the unique property of seeing rapid revenue growth despite remaining at a low technology readiness level. National laboratories and supercomputing centers are already investing in the installation of early-stage machines on premises, primarily for research but also to give more local users the ability to ‘pay to play’. This is, in part, a result of the intensifying governmental stake in the technology and its potential to provide significant economic and national security advantages in conjunction with quantum sensing and quantum communications. As a result, while multiple technical challenges remain, the race to commercial advantage could well be paved with gold for some. However, towards the end of the decade, as pressure mounts to deliver commercial value and return on investment, some of those leading the charge today may not prove to be the true winners in the long term.
With so many competing quantum computing technologies across a fragmented landscape, determining which approaches are likely to dominate is essential to identifying opportunities within this exciting industry. IDTechEx uses an in-house framework for quantum commercial readiness level to measure how quantum computer hardware is progressing in comparison with its classical predecessor. Furthermore, as the initial hype around quantum computing begins to cool, investors will increasingly demand demonstrations of practical benefit, such as quantum supremacy for commercially relevant algorithms. As such, hardware developers need to show not only the quality and quantity of their qubits but the entire initialization, manipulation, and readout system. Improving manufacturing scalability and reducing cooling requirements are also important, which will create opportunities for methodology-agnostic providers of infrastructure such as speciality materials and cooling systems. By evaluating both the sector and the competing quantum computing technologies, the report provides insight into the opportunities offered by this potentially transformative technology.
……………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………..
Alice & Bob, a leading innovator in fault-tolerant quantum computing, just released their whitepaper and technology roadmap titled, “Think Inside the Box: Quantum Computing with Cat Qubits.”
Key highlights of the whitepaper:
- Exponential Error Reduction: Cat qubits simplify error correction by reducing it from a 2D to a 1D problem, achieving unmatched fidelity (99.999999%) and reducing hardware requirements by up to 200x compared to traditional approaches.
- Roadmap Milestones: Alice & Bob’s plan moves from mastering single qubits to developing commercially viable quantum computers by 2030, with transformative applications across industries such as finance, healthcare, and cybersecurity.
- Quantum Advantage: Their technology positions them to deliver practical solutions to computational problems that are currently beyond the reach of classical computing.
Image Credit: Alice & Bob
………………………………………………………………………………………………………………………………………………………………………………………………
The roadmap details five key milestones in Alice & Bob’s plan to deliver a universal, fault-tolerant quantum computer by 2030:
- Milestone 1: Master the Cat Qubit. Achieved in 2024 with the Boson chip series, this milestone established a reliable, reproducible cat qubit capable of storing quantum information while resisting bit-flip errors.
- Milestone 2: Build a Logical Qubit. Currently under development with the Helium chip series, this stage focuses on creating the company’s first error-corrected logical qubit operating below the error-correction threshold.
- Milestone 3: Fault-Tolerant Quantum Computing. With the upcoming Lithium chip series, Alice & Bob aims to scale multi-logical-qubit systems and demonstrate the first error-corrected logical gate.
- Milestone 4: Universal Quantum Computing. The Beryllium chip series will enable a universal set of logical gates, supported by magic state factories and live error correction, unlocking the ability to run any quantum algorithm.
- Milestone 5: Useful Quantum Computing. The Graphene chip series, featuring 100 high-fidelity logical qubits, will deliver a quantum computer capable of demonstrating quantum advantage in early industrial use cases by 2030, integrating into existing high-performance computing (HPC) facilities.
“Our roadmap lays out a clear path to solving quantum’s toughest engineering challenges,” said Raphael Lescanne, CTO and Co-Founder of Alice & Bob. “Quantum computing can seem opaque, but it shouldn’t be. This white paper makes our technology and roadmap accessible for engineers, business leaders and tech enthusiasts alike.”
Achieving practical quantum advantage requires overcoming the errors inherent in quantum systems. Quantum error correction typically relies on additional qubits to detect and correct these errors, but the resource requirements grow quadratically with complexity, making large-scale, useful quantum computing a significant challenge.
Alice & Bob’s cat qubits offer a promising solution to this bottleneck. These superconducting chips feature an active stabilization mechanism that effectively shields the qubits from some external errors. This unique approach has enabled cat qubits to set the world record for bit-flip protection, one of the two major types of errors in quantum computing, effectively eliminating them.
This protection reduces error correction from a 2D problem to a simpler, 1D problem, enabling error correction to scale more efficiently. As a result, Alice & Bob can produce high-quality logical qubits with 99.9999% fidelity, what they call a “6-nines” logical qubit, using a fraction of the resources required by other approaches.
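A rough sketch of that “2D to 1D” resource argument: if the hardware itself suppresses bit-flips, a 1D repetition code of distance d (on the order of d physical qubits) can stand in for a 2D surface code (on the order of d² physical qubits) per logical qubit. The counts below are textbook approximations, not Alice & Bob’s published figures:

```python
# Rough sketch of the "2D -> 1D" resource argument (textbook approximations,
# not Alice & Bob's published figures). If the hardware itself suppresses
# bit-flips, only phase-flips need correcting, so a 1D repetition code of
# distance d (~d physical qubits) can replace a 2D surface code
# (~2 * d**2 physical qubits) for each logical qubit.

for d in (5, 11, 21):
    repetition_qubits = d              # 1D line of cat qubits
    surface_qubits = 2 * d ** 2        # 2D grid, data + ancilla
    print(f"distance {d:2d}: repetition ~{repetition_qubits:3d} qubits, "
          f"surface ~{surface_qubits:4d} qubits "
          f"({surface_qubits // repetition_qubits}x more)")
```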
“Quantum computing should be a tool for solving useful problems in science and industry. This white paper shows how Alice & Bob’s cat qubits can bring that vision to life in a practical way by the decade’s end,” said Théau Peronnin, CEO and co-founder of Alice & Bob.
References:
https://alice-bob.com/products/solution-the-box/
Bloomberg on Quantum Computing: appeal, who’s building them, how does it work?
China Mobile verifies optimized 5G algorithm based on universal quantum computer
Can Quantum Technologies Crack RSA Encryption as China Researchers Claim?
Quantum Technologies Update: U.S. vs China now and in the future
AT&T will be “quantum ready” by the year 2025