Since 1987 – Covering the Fastest Computers in the World and the People Who Run Them
October 12, 2022
Most talk about quantum computing today, at least in HPC circles, focuses on advancing technology and the hurdles that remain. There are plenty of the latter. Financial services giant JPMorgan Chase (JPMC) takes a different, distinctly user perspective, generally steering clear of the qubit technology battles and instead focusing on becoming quantum-ready now. Quantum information science (QIS), believes JPMC, isn’t a nice-to-learn area but a must-learn. QIS will upend many existing practices and introduce new ones.
No doubt having resources of its scale ($129 billion in 2021 revenue) helps fund JPMC’s wide-ranging technology research. In the quantum area, JPMC has been busily developing quantum algorithms around optimization, machine learning and natural language processing, and publishing the results. Leading this effort is Marco Pistoia, a former distinguished IBM researcher who joined JPMC in 2020 as managing director of JPMC’s Global Technology Applied Research Center (brief bio at end of article).
Pistoia presented at Tabor Communications’ annual HPC + AI Wall Street conference, held last month. While his comments focused on financial services, they were also representative of the perspectives and actions of potential QIS users now. These companies don’t care what the underlying quantum computing system is; they will use whatever systems become available and are tightly focused on learning how to wring competitive advantage from them. JPMC has worked with different qubit modalities, trapped ions for example.
“At JPMorgan Chase, we don’t do research in isolation. We want to get the problems, the use cases, from the company. But at the same time, we publish our results because we want to contribute and this is very important. [Quantum computing] is not yet at the stage in which it can be used in production. In fact, it cannot be used in production today,” said Pistoia. “The quantum computers are not yet powerful enough. We are at the scientific stage and when we are in a scientific stage with a certain technology, that’s the best moment to actually collaborate with other companies and publish our results.”
JPMC has been doing just that, and a list of some of its recent papers is included at the end of this article. As Pistoia emphasized in his talk, “For people who work at the level of algorithms and applications, the physics of the quantum computer is not so important. We don’t need to understand how qubits work physically, we just need to know the mathematics of the qubits. There are a lot of opportunities for teams to grow with skills like math and computer science – physics, of course, is always welcome. We actually have a very diverse team in terms of skills and background.”
Presented here is a brief summary of Pistoia’s talk along with some of his slides.
Why Bother with Quantum?
Without digging into the mysteries of superposition and entanglement, Pistoia noted simply that quantum promises dramatic speed-up and accuracy improvements in optimization, simulation and machine learning. Interestingly, despite expectations that science areas such as physics and chemistry would be the first to benefit from quantum computing, it turns out financial services will perhaps be first to take advantage of it, said Pistoia.
“Why is that the case? The reason is that in finance we have many use cases [that] have exponential complexity. [The] level of complexity explodes as soon as a dataset becomes big enough and a classical computer cannot solve that problem anymore,” said Pistoia.
The current approach to solving these types of problems in all industries is using approximation techniques. “With approximations, we don’t have the exact answer. Also, in finance, time is of the essence. Unlike other industry sectors where you can afford a little bit more time; [for example] the pharmaceutical industry can afford to run a computer for three days and then get the recipe for a new drug or a new vaccine. [In that case], I think three days is perfectly reasonable. In finance, we need to get answers right away. Because the market is quickly changing. A computation that takes three days is totally useless,” he emphasized.
Pistoia cut through the haze with a simple, crystal-clear example.
“Suppose you have 10 people to sit around a dinner table. It’s surprising to think that you have 3,628,800 possible ways to seat just ten people around a table. Think about how many combinations you can have. Of course, there are also other things to consider like constraints. For example, you want to maximize the chances that people who like each other are sitting next to each other, and people who don’t like each other [will be] sitting far apart one from the other. This problem is very difficult and that’s with just 10 people. Imagine when we do portfolio optimization and we have not 10 assets in a portfolio but thousands of assets. This number [of possible combinations] is gigantic.”
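The arithmetic behind that example is easy to verify. The short Python sketch below (ours, not from the talk) computes the number of orderings for a few group sizes and shows why exhaustive search collapses long before a portfolio reaches thousands of assets.

```python
import math

# Number of ways to order n dinner guests (or assets) grows factorially.
for n in (10, 20, 50):
    print(f"{n:>3} items: {math.factorial(n):,} orderings")

# 10 items -> 3,628,800 orderings (the figure Pistoia cites);
# 50 items already exceeds 3 x 10^64, far beyond exhaustive search --
# and real portfolios hold thousands of assets.
```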
Sufficiently reliable quantum computers – leveraging superposition and entanglement – could solve these problems much faster and more accurately. The current crop of quantum computers varies widely in technology, size and performance, but all of them are generally lumped into the NISQ (noisy intermediate-scale quantum) category. NISQ systems have many drawbacks (qubit count, error rates, speed, etc.) and have so far proven unready for more than proof-of-concept use. Expectations are that higher qubit counts and improved error mitigation/correction will lead to better NISQ machines able to tackle a few narrow applications. Universal, fault-tolerant computers are much further off, perhaps a decade.
It’s the Algorithms!
Even with sufficiently reliable quantum computers, solving these problems isn’t easy. Potential users, such as JPMC, are digging into the use cases and the new and modified algorithms that will be needed for quantum computers. Pistoia reviewed recent work by JPMC on several algorithms; two examples he highlighted were NISQ-HHL, a variant of the HHL linear-systems algorithm applied to portfolio optimization, and quantum approaches to risk analysis and option pricing.
These examples are quantum algorithms run on quantum computers. A different problem occurs at the application layer, where input from the application must be converted into appropriate input for the quantum circuit.
Pistoia said, “Let’s go back to NISQ-HHL. It is an algorithm that solves systems of linear equations. So we need some logic that takes a portfolio’s input and transforms it into a system of linear equations so that NISQ-HHL can solve [it]. For risk analysis and option pricing, we needed logic as well that takes the inputs for a risk analysis problem or a derivative pricing problem and makes them suitable for the algorithm. [We realized] this logic sometimes can itself be the bottleneck – we were happy to have the algorithm that reduced the complexity by cutting the exponent in half, but realized the logic on top of the algorithm was becoming the bottleneck, negating the quantum advantage of the algorithm. So we had to create another algorithm for the input.”
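To make the first of those conversions concrete, the sketch below (a toy formulation of ours, not JPMC’s) reduces a small mean-variance portfolio problem to the linear system Ax = b that an HHL-style solver such as NISQ-HHL takes as input. Here the system is solved classically with NumPy as a stand-in for the quantum step; the data (Sigma, mu, target) is illustrative only.

```python
import numpy as np

# Toy mean-variance portfolio: minimize (1/2) w^T Sigma w subject to
# mu^T w = target return and sum(w) = 1. The KKT (optimality) conditions
# of this constrained quadratic program form a linear system A x = b --
# the kind of input an HHL-style solver expects. Illustrative data only.

rng = np.random.default_rng(0)
n = 4                                    # number of assets
C = rng.normal(size=(n, n))
Sigma = C @ C.T + n * np.eye(n)          # positive-definite covariance matrix
mu = rng.uniform(0.02, 0.10, size=n)     # expected returns
target = 0.06                            # desired portfolio return

# Assemble the KKT system: unknowns are the weights w plus two Lagrange
# multipliers for the return and budget constraints.
ones = np.ones(n)
A = np.block([
    [Sigma,         mu[:, None],      ones[:, None]],
    [mu[None, :],   np.zeros((1, 1)), np.zeros((1, 1))],
    [ones[None, :], np.zeros((1, 1)), np.zeros((1, 1))],
])
b = np.concatenate([np.zeros(n), [target, 1.0]])

x = np.linalg.solve(A, b)                # classical solve stands in for HHL
w = x[:n]
print("weights:", np.round(w, 3), " sum =", round(w.sum(), 3))
```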
Returning to the emphasis on use cases, Pistoia said, “The important thing that I wanted to emphasize today is that these are the real problems of the bank. The algorithms that we design and implement today are the algorithms that we will use in the future when quantum computers become capable of running in production. The only thing that we are not doing is digesting the real datasets that the bank is using every day. So [in that sense] we’re not really solving a portfolio optimization problem that the bank is facing today, because the quantum computers are not yet big enough.”
“But that’s okay. We know that. We just need to wait for the hardware to progress. Meanwhile, we’re not idle, waiting for this hardware to make progress. That’s a very important point. I think it’s important for the financial industry as a whole to realize that if a company doesn’t do anything about quantum right now, just waiting for quantum advantage to become a reality, when quantum advantage becomes real, it might be too late to catch up. Other companies will already be there,” he said.
Dealing with the Post-Quantum Security Headache
No financial services presentation would be complete without a discussion of the threat and opportunity quantum computing presents to data security. The threat, of course, is that when fault-tolerant universal quantum computers arrive, they are expected to be able to decrypt data that was encrypted using current public-key methods, thanks to Shor’s algorithm. (See HPCwire coverage, The Race to Ensure Post Quantum Data Security.)
NIST has cautioned against so-called ‘Harvest Now, Decrypt Later’ attacks, in which bad actors capture encrypted data now and store it – perhaps for years – until quantum computers can decrypt it. Financial services companies, including JPMC, are scrambling to be ready; NIST announced its first selections of post-quantum algorithms this summer.
“People were not super worried about it because, at the beginning, initial estimates showed that it would take a billion qubits to break a public key in one day. We’re very far away from having a billion qubits, so it sounded like we could relax a little bit. However, it was later shown that you can do the same thing – you can compute somebody’s private key from the corresponding public key – with only 20 million qubits, a big difference from 1 billion to 20 million, and that you would be able to do that in eight hours. This year, researchers showed that it would take only 13,436 qubits to break a cryptography system with a public and private key in 177 days, which is less than six months. And six months is nothing, because our public and private keys definitely have a life that is much longer than that,” said Pistoia.
“Now, how far are we from these 13,436 qubits, which will be able to break cryptography in less than six months? We’re not that far away, because if we look at the roadmaps of some companies that are leading in quantum hardware, we see that a number of qubits of that caliber – incorporating error correction, which is another important thing that quantum computers must have – may actually be available in 2026. So I’m not saying that in 2026 our cryptography will be broken. But it might be. Maybe it will be later. I think it doesn’t hurt to be cautious and recognize that this is going to happen at a certain point.”
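Pistoia’s point that the public key alone is enough for a quantum attacker comes down to integer factoring. The toy RSA example below uses textbook-sized numbers purely for illustration: once the modulus is factored – the step Shor’s algorithm speeds up exponentially – the private key falls out immediately.

```python
# Toy RSA with tiny primes to show why factoring the public modulus is
# enough to recover the private key. Real keys use 2048-bit or larger
# moduli; this is illustration only. Requires Python 3.8+ for pow(e, -1, m).

p, q = 61, 53
n, e = p * q, 17                       # (n, e) is the public key
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)                    # private exponent, normally secret

msg = 42
cipher = pow(msg, e, n)                # encrypt with the public key

# An attacker who factors n = p * q can redo the same computation:
p_found = next(f for f in range(2, n) if n % f == 0)
q_found = n // p_found
d_found = pow(e, -1, (p_found - 1) * (q_found - 1))
assert pow(cipher, d_found, n) == msg  # ciphertext recovered without the key
```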
Pistoia noted preparing for post-quantum cryptography is a huge effort. Indeed, many observers have noted how encryption/decryption functions have been scattered throughout IT infrastructure and that just identifying where the code exists in legacy systems is a challenge. There’s a growing call for a complete overhaul that is more modular and allows companies to readily find and swap in new encryption algorithms as they become needed.
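What such a modular overhaul might look like in miniature: the sketch below (a hypothetical pattern, not JPMC’s architecture) routes every encrypt/decrypt call through a named registry, so retiring a quantum-vulnerable scheme becomes a one-line policy change rather than a hunt through legacy code.

```python
from dataclasses import dataclass
from typing import Callable, Dict

# A minimal "crypto-agility" sketch: call sites ask a registry for the
# currently approved scheme by name, so swapping in a post-quantum
# algorithm is a configuration change, not a code search.

@dataclass
class Scheme:
    name: str
    encrypt: Callable[[bytes, bytes], bytes]
    decrypt: Callable[[bytes, bytes], bytes]

_REGISTRY: Dict[str, Scheme] = {}

def register(scheme: Scheme) -> None:
    _REGISTRY[scheme.name] = scheme

def active_scheme(policy: str) -> Scheme:
    # The policy name would normally come from central configuration.
    return _REGISTRY[policy]

# Placeholder XOR "cipher" standing in for a real algorithm -- not secure,
# illustration only.
def _xor(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

register(Scheme("legacy-demo", _xor, _xor))
# A vetted post-quantum scheme would be registered the same way once
# available, e.g. register(Scheme("pqc-demo", pqc_encrypt, pqc_decrypt)).

scheme = active_scheme("legacy-demo")
ciphertext = scheme.encrypt(b"trade ticket", b"k3y")
assert scheme.decrypt(ciphertext, b"k3y") == b"trade ticket"
```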
Perhaps not surprisingly, Pistoia concluded with a call for action. He argued that waiting for quantum advantage to arrive isn’t a good approach.
“I wanted to conclude with this call for action. I think there are three things that every company should do. I mean, this is something that I have seen firsthand at JPMorgan. The first thing I mentioned before: becoming quantum ready is definitely crucial. Because we are in a very privileged moment. We know that quantum computing is arriving, but it’s not there yet. Perfect. This is the time for us to learn about it, and be ready for when quantum advantage becomes a reality.
“Now, when quantum advantage comes, it will not come at the same time for every application. Remember that I said before that for portfolio optimization we were able to achieve a dramatic speed-up I called exponential, while for option pricing we were only able to cut the exponent in half. I think that gives an idea that some applications are going to benefit from quantum computing before others. It makes sense to identify that low-hanging fruit, the first wave of applications, and start from those.
“And another thing that I saw is that it’s crucial to build a team because quantum computing is a very specialized technology [and] it’s crucial to have a team that understands quantum computing. It’s also important that this team is not totally centralized in the sense that it cannot work in isolation but must work with the rest of the company.”
Recent Papers
Constrained Quantum Optimization for Extractive Summarization on a Trapped-ion Quantum Computer, https://arxiv.org/abs/2206.06290
NISQ-HHL: Portfolio Optimization for Near-Term Quantum Hardware, https://arxiv.org/abs/2110.15958
The Efficient Preparation of Normal Distributions in Quantum Registers, https://arxiv.org/abs/2009.06601
Option Pricing using Quantum Computers, https://arxiv.org/abs/1905.02666
Universal Quantum Speedup for Branch-and-Bound, Branch-and-Cut, and Tree-Search Algorithms, https://arxiv.org/abs/2210.03210
Importance of Kernel Bandwidth in Quantum Machine Learning, https://arxiv.org/abs/2111.05451
Bandwidth Enables Generalization in Quantum Kernel Models, https://arxiv.org/abs/2206.06686
Brief Bio of Pistoia
Marco Pistoia, Ph.D., is Managing Director, Distinguished Engineer, and Head of JPMorgan Chase’s Global Technology Applied Research (formerly the Future Lab for Applied Research and Engineering), where he leads research in Quantum Computing, Quantum Communication, Cloud Networking, Augmented and Virtual Reality (AR/VR), Internet of Things (IoT), Blockchain and Cryptography. He joined JPMorgan Chase in January 2020. Formerly, he was a Senior Manager, Distinguished Research Staff Member and Master Inventor at the IBM Thomas J. Watson Research Center in New York, where he managed an international team of researchers responsible for Quantum Computing Algorithms and Applications. He is an inventor on over 250 patents granted by the U.S. Patent and Trademark Office and on over 300 pending patent applications. Over 40 of his patents are in the area of Quantum Computing.
Dr. Pistoia received his Ph.D. in Mathematics from New York University in May 2005. He is the lead author of two printed books: Enterprise Java Security (published by Addison-Wesley in English and by Tsinghua University Press in Chinese) and Java 2 Network Security (published by Prentice Hall), both used as textbooks in many universities worldwide. He is also a coauthor of the online textbook Learn Quantum Computation using Qiskit, published in 2020.