IBM has just wrapped up its annual Quantum Summit, at which the company checked off several milestones on its quantum roadmap and set itself a new challenge. During these summits, IBM informs the industry of its ongoing efforts to make quantum computing a key part of the future of computing and sets goals for future developments. The theme of this year's summit was “The Next Wave,” as IBM believes quantum is rapidly approaching an inflection point. The summit was held in downtown Manhattan and brought together many of IBM’s business partners, including Boeing, Bosch, Vodafone and many more. It was also well attended by academics and government researchers.
Key to the acceptance and maturation of quantum computing as an industry will be building a healthy ecosystem and set of partnerships. IBM’s Quantum Network now has more than 200 members. The company added new quantum innovation centers at Arizona State University, DESY, IIT Madras, and the recently opened uptownBasel Innovation Center in Switzerland. Its latest industrial partners are Bosch, Crédit Mutuel Alliance Fédérale, Erste Digital, Tokyo Electron, HSBC and Vodafone. IBM offers several plans for its quantum program, including an open plan, a pay-as-you-go plan, and a premium plan. The company also offers a Quantum Accelerator program with resources to support businesses at any point on their journey to quantum readiness.
One of the new partners added to the IBM Quantum Network is multinational telecommunications company Vodafone, which is exploring quantum computing and quantum cryptography as a lead partner under the 3GPP consortium. Another partner is the French bank Crédit Mutuel which is exploring use cases in financial services. In addition, uptownBasel provides skills development and promotes leading innovation projects on quantum and high-performance computing technology.
IBM made 12 major announcements at the summit, too many to condense into one story. A key hardware announcement was Osprey, IBM’s new 433 quantum bit (qubit) quantum processing unit (QPU) and the world’s largest superconducting quantum processor.
Overall, IBM makes continuous improvements to its quantum computers along three axes: scale (more qubits), quality (increased Quantum Volume, which takes into account coherence time and error rates), and speed (CLOPS, or circuit layer operations per second). The qubit count more than triples with the introduction of the 433-qubit Osprey QPU. There is a 4x improvement in Quantum Volume, from 128 to 512, and a more than 10x improvement in CLOPS, from 1.4K to 15K (beating IBM’s target of 10K CLOPS by 50 percent), both measured on its Falcon processors.
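As a rough back-of-envelope, these metrics can be unpacked in a few lines of Python. The Quantum Volume interpretation follows IBM's published definition (QV = 2^n for the largest n-qubit, depth-n "square" circuit the machine runs reliably); the figures are the ones quoted above.

```python
import math

# Quantum Volume (QV) is 2**n, where n is the largest "square" circuit
# (n qubits at depth n) the machine can execute reliably. A 4x jump in
# QV therefore means two more qubits of usable square-circuit size.
qv_old, qv_new = 128, 512
print(int(math.log2(qv_old)), "->", int(math.log2(qv_new)))  # 7 -> 9

# The CLOPS (circuit layer operations per second) speedup reported
# at the summit, both measured on Falcon processors:
clops_old, clops_new = 1_400, 15_000
print(round(clops_new / clops_old, 1))  # 10.7x faster layer execution
```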
IBM has also released full access to dynamic quantum circuits, circuits that incorporate classical processing during execution to enable a richer set of circuit operations. Dynamic circuits, among other benefits, will dramatically reduce the depth of some quantum circuits, making it more practical to design useful ones.
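A minimal sketch of the idea, using a toy single-qubit statevector in plain Python rather than IBM's actual Qiskit API: a mid-circuit measurement whose outcome classically decides whether to apply a gate, something a static circuit cannot express. Active qubit reset, shown here, is a classic use of this feed-forward pattern.

```python
import math
import random

# A single-qubit state is a pair of complex amplitudes (amp0, amp1).
def measure(state):
    """Collapse the state; return (classical outcome, collapsed state)."""
    p1 = abs(state[1]) ** 2
    if random.random() < p1:
        return 1, (0j, 1 + 0j)
    return 0, (1 + 0j, 0j)

def x_gate(state):
    """Bit-flip: swap the |0> and |1> amplitudes."""
    return (state[1], state[0])

def dynamic_reset(state):
    """Mid-circuit measurement with classical feed-forward:
    if the qubit measured |1>, apply X to return it to |0>.
    The classical branch happens *inside* the circuit."""
    outcome, state = measure(state)
    if outcome == 1:  # classical decision based on a quantum measurement
        state = x_gate(state)
    return state

# Start in the superposition (|0> + |1>)/sqrt(2); whatever the random
# measurement outcome, the dynamic reset leaves the qubit in |0>.
plus = (1 / math.sqrt(2) + 0j, 1 / math.sqrt(2) + 0j)
print(dynamic_reset(plus))  # (1+0j, 0j)
```

Without feed-forward, resetting an unknown qubit deterministically would require extra circuit depth or discarding shots, which is one reason dynamic circuits can shorten circuits.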
To solve the scaling problems of a multi-QPU system, IBM needed a new cryostat design. The company had previously announced the IBM Quantum System Two design, but as it got closer to implementation the design evolved, and an updated version was revealed at the summit. With IBM Quantum System Two, multiple cryostats can be positioned next to each other and joined by communication links to form a single system. IBM plans to bring System Two online by the end of 2023. With the next-generation system, IBM plans to build the next wave of quantum computing, integrating quantum middleware to orchestrate quantum and classical workflows in a multicloud environment.
With the development of quantum computing technology comes the threat that it could be used maliciously. Specifically, quantum computing has a good chance of advancing to the point where it can crack public-key encryption protocols using Shor’s algorithm. This future threat is being addressed by the US National Institute of Standards and Technology (NIST) through its next-generation, post-quantum encryption standards. IBM reminded the public that it has already rolled out support for these next-generation standards in its latest IBM z16 mainframe.
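To illustrate the stakes, here is a textbook-scale sketch in plain Python: breaking RSA reduces to factoring the public modulus, which is exactly what Shor's algorithm would do efficiently at scale. Trial division stands in for Shor's algorithm here, and the tiny key is purely illustrative; real keys use ~2048-bit moduli that no classical method can factor in practice.

```python
# Toy RSA key (classic textbook numbers; purely illustrative).
p, q = 61, 53
n = p * q                      # public modulus (3233)
e = 17                         # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)            # private exponent via modular inverse (Python 3.8+)

m = 65                         # a message
c = pow(m, e, n)               # encrypt with the public key

# The attacker only needs to factor n. Trial division works here only
# because n is tiny; Shor's algorithm would factor a real-sized modulus
# in polynomial time on a large, fault-tolerant quantum computer.
def factor(n):
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return None

pa, qa = factor(n)
da = pow(e, -1, (pa - 1) * (qa - 1))   # rebuild the private exponent
recovered = pow(c, da, n)
print(recovered == m)                  # True: plaintext recovered
```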
IBM Fellow and Vice President of Quantum Jay Gambetta ended the list of announcements with a challenge he called “100×100”. As he put it in his blog post: “In 2024, we plan to deliver a tool capable of computing unbiased observables of circuits with 100 qubits and 100 gate depth operations in a reasonable runtime. We are confident in our ability to deliver this tool through Heron: if we can build a Heron processor with error rates below the ‘three nines’ gate fidelity threshold, plus the software infrastructure to read the circuits together with conventional resources, then we can run a 100×100 circuit in less than a day and produce unbiased results. This system will be able to run quantum circuits with complexity and runtime beyond the capabilities of today’s best classical computers.” At this threshold, IBM believes it will be able to demonstrate that quantum computers can solve problems that are impractical on classical computers, often referred to as Quantum Advantage. QPUs with more than 100 nearly error-free qubits allow deeper circuits and therefore more complex operations.
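A rough sanity check shows why the “three nines” fidelity threshold matters for a 100×100 circuit. The calculation below assumes independent gate errors, a simplification for illustration rather than IBM's actual model, but it makes clear why unbiased results at this scale require error mitigation on top of good hardware.

```python
import math

qubits, depth = 100, 100
gates = qubits * depth        # ~10,000 gate slots in a "100x100" circuit
fidelity = 0.999              # "three nines" per-gate fidelity

# Probability that every gate in one shot executes without error,
# assuming independent errors (a back-of-envelope simplification):
p_clean = fidelity ** gates
print(f"{p_clean:.1e}")       # ~4.5e-05: almost every shot is noisy

# Same result via the exponential approximation (1-eps)^N ~ exp(-N*eps):
print(f"{math.exp(-gates * (1 - fidelity)):.1e}")
```

With fewer than one shot in twenty thousand error-free, raw sampling is hopeless; the “tool” Gambetta describes must therefore extract unbiased observables from noisy runs, which is why both the processor and the classical software infrastructure are part of the challenge.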
Even though the concept of a quantum computer has been in development for decades, it’s only recently that we’ve reached the point where we have enough qubits to make things interesting. Much of it is still a science project. The tools are increasingly sophisticated and increasingly accessible, but it’s still an area looking for problems to solve. Moreover, the pioneers of quantum computing still speak in terms that are not easily understood by traditional programmers. We are pretty much in the position where neural networks and deep learning were about ten years ago, and we may be seeing a similar inflection point for quantum computing as it expands its capabilities. The real difference is that AI processing can run on a huge range of computing devices, from microcontrollers to supercomputers, whereas superconducting quantum computing requires specialized equipment that can only be supported at data center scale. There are other types of quantum computing devices that may find applications in certain fields, but for the highest speed and capacity, IBM has bet on superconducting qubits.
IBM is committed to quantum computing because, as Dario Gil, the company’s head of research, told the audience at the start of the event, there are three general areas of computing: bits (classical), neurons (AI) and qubits (quantum). AI and quantum will still require control and connections to classical computing and won’t replace ordinary bits anytime soon. But both AI and quantum can provide specialized functions that are difficult or even impractical with classical computing. While AI has already established itself in the mainstream market, quantum is still a very nascent technology. There are analogies between AI and quantum. For example, both provide probabilistic (stochastic) results: AI provides a probabilistically correct answer, and quantum provides a probability distribution. AI’s journey to mainstream acceptance didn’t happen overnight, but once it was recognized as a powerful tool, applications sprang up everywhere. Quantum is still on the path to acceptance and recognition of its power, and is still looking for its niche. The application of quantum to real-world problems may be limited today, but TIRIAS Research believes that quantum computing will help solve some of the toughest problems that classical computing can only approximate and that AI can only guess at.
TIRIAS Research follows and advises companies across the entire electronics ecosystem, from semiconductors to systems and sensors to the cloud. Members of the TIRIAS Research team have consulted for IBM and other companies in the security, AI and quantum ecosystems.