**THE INVENTION OF QUANTUM COMPUTING**

The basic idea can be glimpsed in a corner of fundamental physics born more than a century ago. The name of quantum computers reflects how they build on the principles of quantum mechanics, the theory developed at the start of the 20th century to describe nature at the smallest scales, where the energy of atoms and subatomic particles comes in discrete chunks, called quanta.

In the 1980s, it was realized that quantum theory had implications for computing, extending theoretical work done by the English mathematician Alan Turing in the 1930s. Turing had come up with a mathematical portrait of a universal computer, one capable of simulating the behavior of any other computer.

In 1985, the British physicist David Deutsch pointed out that, because Turing was working with classical physics, his universal computer was not so universal after all. Turing’s theory needed to be extended to quantum mechanics, and on 8th July 1985 Deutsch proposed a universal computer based on quantum physics, one with calculating powers that Turing’s classical computer could not, even in theory, simulate. Others who played a key role were the American physicists Paul Benioff and Richard Feynman, and the Soviet mathematician Yuri Manin.

Quantum theory, which is deeply counterintuitive, states that when we look at particles we unavoidably alter them, and that a particle can be in two places at once, a quality called superposition. Moreover, two particles can be related, or ‘entangled,’ so that their properties remain linked regardless of the distance between them. Albert Einstein termed this ‘spooky action at a distance.’ These strange properties are what allow quantum computers to perform many calculations simultaneously, rather than one at a time.

However, David Deutsch points out: ‘There was a controversy a while ago about whether interference or entanglement was the basic phenomenon that gives quantum computing its power. I had always said interference and I still would. One can argue it either way and it’s kind of a philosophical question.’

**WHAT ABOUT FINANCE?**

Finance can be defined as the science of money management, a discipline almost as old as civilization itself. In a recent paper (http://arxiv.org/abs/1807.03890), the authors of this Medium article analyzed the intriguing idea of applying quantum mechanics, and specifically quantum computing, to solve key financial problems. That analysis covers the current landscape and a list of possible quantum computing use cases in Finance. In this article, we summarize its main ideas.

At its core, Finance deals with uncertainty. Concepts such as interest rates and dividends arise from the impossibility of predicting the profits resulting from an investment. Imagine giving a loan to a customer: we don’t know for sure whether the customer will repay us or pocket the money and disappear.

Financial practitioners (investors, bankers, etc.) traditionally have two ways to cope with these situations. They can either charge a fee for the risk of granting the loan (the higher the risk, the higher the interest rate) or manage the risk statistically, diversifying their loans and trying to offset the losses from a few defaults with relatively larger gains elsewhere.

But our financial practitioner is not alone in the market: he is always competing with others. He must continuously balance the risk of losses from bad investments against the risk of going out of business by not profiting enough from good ones. This is a tough balancing act, in which failure can have a severe economic impact on society, as we painfully learned from the last financial crisis.

Traditionally, financial players have relied on the power of computing to reduce risk. This has led to a computing arms race among them, in which bigger profits go to whoever can analyze a situation faster and in more detail. This environment is ripe for quantum computers: given their promise of immense computational power, they could become the information-processing machines that shape the future of finance. In the following, we discuss some early attempts to tackle financial problems through quantum computing.

**QUANTUM OPTIMIZATION**

Optimization problems are at the core of many financial problems. In portfolio optimization, for instance, the problem consists in selecting the best assets to invest in so as to balance the risk with the expected returns. This is an NP-hard problem, meaning that classical computers are not believed to be able to solve it efficiently in general. Quantum annealing, on the other hand, naturally provides methods to identify good solutions for this type of problem.

In an annealing process, we encode the problem we want to solve into an equivalent physical problem. Specifically, we formulate the physical problem such that its lowest energy state is mathematically equivalent to the solution of the problem we want to solve. We then construct this system in the annealer and let it evolve naturally towards its lowest energy state. Classically, the annealer converges to its lowest energy state through slow cooling. In quantum annealing, quantum tunneling events drive the system to its global minimum in a faster and more reliable fashion.
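The annealing idea can be illustrated classically. Below is a minimal sketch of simulated annealing on a made-up toy QUBO: a four-asset selection problem is encoded as an energy function over binary variables, and the system is cooled toward its lowest-energy state. A quantum annealer plays the same game, but uses tunneling instead of thermal fluctuations to escape local minima.

```python
import math
import random
from itertools import product

# Toy QUBO for selecting among 4 assets (x_i in {0, 1}), minimising
# E(x) = sum over (i, j) of Q[i, j] * x_i * x_j.
# Diagonal entries reward expected return (more negative = better);
# off-diagonal entries penalise picking correlated (risky) pairs.
# All numbers are made up for illustration.
Q = {
    (0, 0): -3.0, (1, 1): -2.0, (2, 2): -1.0, (3, 3): -2.5,
    (0, 1): 2.0, (0, 3): 2.5, (1, 2): 0.5, (2, 3): 1.0,
}

def energy(x):
    return sum(c * x[i] * x[j] for (i, j), c in Q.items())

def simulated_annealing(n=4, steps=5000, t0=2.0, seed=1):
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    e = energy(x)
    best, best_e = list(x), e
    for step in range(steps):
        t = t0 * (1.0 - step / steps) + 1e-3   # linear cooling schedule
        i = rng.randrange(n)
        x[i] ^= 1                              # propose flipping one bit
        e_new = energy(x)
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / t):
            e = e_new                          # accept the move
            if e < best_e:
                best, best_e = list(x), e
        else:
            x[i] ^= 1                          # reject: undo the flip
    return best, best_e

best_x, best_e = simulated_annealing()

# Exhaustive check, feasible only because this instance is tiny (2^4 states):
exact_e = min(energy(list(x)) for x in product([0, 1], repeat=4))
```

For 4 binary variables, brute force is trivial; the point of annealers is that the 2^n state space becomes unsearchable long before n reaches realistic portfolio sizes.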

Some promising methods have already been developed for applying quantum annealing to finance:

**Optimal trading trajectory**: the execution of large trading orders influences the price of the underlying asset, resulting in additional execution costs for the investor placing the trade. To minimize the overall cost of trading, algorithms are used to control the trading rate of a large order, striking a balance between trading fast to minimize exposure to market risk and trading slowly to minimize market impact. The discrete multi-period version of this problem has been solved on a commercially available quantum annealer provided by D-Wave. The prospect is that future versions of the D-Wave machine will be able to handle much bigger instances of the problem, eventually overtaking classical methods.

**Optimal arbitrage opportunities**: arbitrage is the practice of making a profit from differing prices of the same asset in different markets, while bearing no risk. For instance, we could change euros for dollars, dollars for yen, and yen back to euros, and make a small profit in the process. This is cross-currency arbitrage. While the problem of determining the optimal arbitrage opportunity is NP-hard, such opportunities can be detected efficiently using quantum annealers. Specifically, solving this problem is equivalent to finding the lowest-cost trajectory in a directed graph. This was solved for small instance sizes on the D-Wave 2X quantum annealer, producing the same optimal solutions as an exhaustive classical solver.
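To make the graph formulation concrete, here is a classical sketch (with made-up exchange rates) of cross-currency arbitrage detection: taking edge weights w = -log(rate) turns "a cycle of trades whose rates multiply to more than 1" into a negative-weight cycle, which the Bellman-Ford algorithm can detect. Note that this classical routine only detects the existence of an arbitrage cycle; finding the *optimal* one is the hard part targeted by annealers.

```python
import math

# Hypothetical exchange rates (made-up numbers for illustration).
rates = {
    ("EUR", "USD"): 1.10,
    ("USD", "JPY"): 150.0,
    ("JPY", "EUR"): 0.00612,
}

# EUR -> USD -> JPY -> EUR multiplies to 1.10 * 150 * 0.00612 > 1:
# an arbitrage. With w = -log(rate), "product of rates > 1" becomes
# "sum of weights < 0", i.e. a negative cycle in the rate graph.
currencies = sorted({c for pair in rates for c in pair})
edges = [(a, b, -math.log(r)) for (a, b), r in rates.items()]

def has_negative_cycle(nodes, edges):
    """Bellman-Ford from a virtual source connected to every node."""
    dist = {v: 0.0 for v in nodes}   # virtual source: all distances start at 0
    for _ in range(len(nodes) - 1):
        for a, b, w in edges:
            if dist[a] + w < dist[b]:
                dist[b] = dist[a] + w
    # One extra relaxation round: any improvement implies a negative cycle.
    return any(dist[a] + w < dist[b] for a, b, w in edges)

arbitrage = has_negative_cycle(currencies, edges)
cycle_product = 1.10 * 150.0 * 0.00612   # > 1, so arbitrage should be True
```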

**QUANTUM MACHINE LEARNING**

The field of machine learning broadly amounts to the design and implementation of algorithms that can be trained to perform a variety of tasks, including pattern recognition, data classification, and many others. We call training the process of optimizing the algorithm’s parameters to recognize specific inputs (the training data). The trained algorithm can then be applied to assess new inputs. The field of classical machine learning has grown enormously, mainly due to hardware and algorithmic developments (making it possible, for instance, to train deep learning networks). The basic principles of machine learning are at the root of several vastly successful techniques, the most prominent of which today is neural networks.

While many quantum machine learning (QML) algorithms are potentially groundbreaking, many of them require access to universal quantum computers. These are more advanced than quantum annealers, and technologically more challenging. In other words, while optimization problems can already benefit from the first generation of experimental quantum annealers, the implementation of certain QML algorithms won’t be possible until the technology has developed further. At the current rate of experimental progress, however, we believe this will happen sooner rather than later.

**Data classification**

Let us return to the problem of credit scoring. The way this problem is typically dealt with relies on a machine learning method known as classification. Each customer is expressed as a vector whose coordinates represent the customer’s different characteristics. The training set is labeled, such that each vector belongs to a class (the loan risk). When given a new vector, the program must determine the class it most likely belongs to. This area of machine learning is also at the heart of pattern recognition, which is extensively used for voice and facial recognition and, crucially in finance, for fraud detection.

Finding a new vector’s class may mean performing many high-dimensional projections. This can rapidly limit the confidence with which we can assign a class to the new vector, particularly in pattern recognition applications, where the number of attributes considered is gigantic. As a result, methods for running classification algorithms on quantum computers generally focus on performing projection operations efficiently.
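As a toy illustration of classification for credit scoring, the sketch below (pure Python, with made-up customer features) uses a nearest-centroid classifier: each class is summarized by the average of its training vectors, and a new customer is assigned to the class whose centroid is closest. The distance computations here stand in for the projection operations that quantum classifiers aim to accelerate.

```python
import math

# Toy training set: each customer is a feature vector
# (income in k€, debt-to-income ratio), labelled "low" or "high" risk.
# All numbers are made up for illustration.
training = [
    ((80.0, 0.10), "low"), ((65.0, 0.20), "low"), ((90.0, 0.15), "low"),
    ((30.0, 0.70), "high"), ((25.0, 0.90), "high"), ((40.0, 0.60), "high"),
]

def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

# One centroid per class; classifying then reduces to distance
# computations in feature space.
centroids = {}
for label in {lab for _, lab in training}:
    centroids[label] = centroid([v for v, lab in training if lab == label])

def classify(vector):
    return min(centroids, key=lambda lab: math.dist(vector, centroids[lab]))

new_customer = (70.0, 0.25)
predicted = classify(new_customer)   # high income, low debt: low risk
```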

There are several proposals for implementing pattern recognition on quantum computers, two of which were recently realized experimentally. Additionally, an experimental implementation of support vector machines, one of the most widely used classification algorithms, was also realized on a quantum computer.

**Regression**

This problem is qualitatively different from pattern recognition: instead of attributing a class to a new data point, we want to learn a numeric function from the training data set. This process, known as regression, is particularly useful when attempting an interpolation, and is consequently a core tool for economic forecasting, specifically for forecasting demand and predicting non-performing loans. Regression algorithms are generally used to understand how the typical value of a response variable changes as an attribute is varied.

In most cases, the optimal fit parameters are found by minimizing the least-squares error between the training data and the values predicted by the model. This usually involves inverting a matrix built from the training data, a task which can be extremely computationally expensive for typical datasets. There exists, however, a powerful linear-algebra toolbox which allows us to diagonalize matrices on quantum computers exponentially faster than on classical computers. Alternatively, a method for running an altogether different regression algorithm, known as Gaussian process regression, has been proposed and shown to provide an exponential speedup relative to classical algorithms.
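A minimal classical sketch of the procedure: fitting a line by least squares via the normal equations, solved by hand for the 2x2 case (synthetic, noiseless data for clarity). With one feature this is cheap; the cost of forming and solving the normal-equation system is what explodes for large, high-dimensional datasets, and is the step quantum linear-algebra methods target.

```python
# Fit y ≈ slope * x + intercept by least squares, solving the 2x2
# normal equations (A^T A) theta = A^T y by hand.
xs = [i * 0.5 for i in range(20)]
ys = [2.0 * x + 1.0 for x in xs]   # synthetic data from a known line

n = len(xs)
sx = sum(xs)
sy = sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))

# The system is [[sxx, sx], [sx, n]] @ [slope, intercept] = [sxy, sy];
# Cramer's rule gives the solution directly.
det = sxx * n - sx * sx
slope = (sxy * n - sx * sy) / det
intercept = (sxx * sy - sx * sxy) / det
```

Since the synthetic data lies exactly on the line y = 2x + 1, the fit recovers slope 2 and intercept 1 up to floating-point error.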

**Principal component analysis**

In portfolio optimization, it is essential to have a global vision of interest rates paths. A standard tool for doing this is a method known as principal component analysis (PCA). The same tool can be applied to asset selection and performance forecasting, where it is essential to have an idea of what characteristics are the best to predict the future behavior of the asset or its associated interest rates.

In practice, PCA mathematically amounts to finding the dominant eigenvalues and eigenvectors of a very large matrix. Normally this cost is prohibitive, and for real-life data, where there could be millions of stocks, it becomes astronomical.

It was recently shown that a version of this algorithm, the quantum-PCA algorithm, could be run exponentially faster on a quantum processor. Specifically, this algorithm finds approximations to the principal components of a correlation matrix, with a computational cost exponentially lower than on a classical computer. This development should vastly broaden the spectrum of applicability of PCA, allowing us to estimate risk and maximize profit in situations that were not feasible using classical methods.
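A classical sketch of the computational core of PCA, using power iteration (a standard method for finding dominant eigenpairs) on a small made-up correlation matrix: three of the four assets share one "market" factor, so a single principal component dominates.

```python
import math

# Toy correlation matrix for 4 assets: the first three are strongly
# correlated (one "market" factor), the fourth is independent.
# All numbers are made up for illustration.
corr = [
    [1.0, 0.9, 0.9, 0.0],
    [0.9, 1.0, 0.9, 0.0],
    [0.9, 0.9, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
]

def power_iteration(matrix, iters=200):
    """Dominant eigenpair via power iteration: repeatedly apply the
    matrix to a vector and renormalize; the dominant direction wins."""
    n = len(matrix)
    v = [1.0] * n
    eigenvalue = 0.0
    for _ in range(iters):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
        # Rayleigh quotient v^T M v estimates the eigenvalue.
        eigenvalue = sum(v[i] * sum(matrix[i][j] * v[j] for j in range(n))
                         for i in range(n))
    return eigenvalue, v

top_eigenvalue, top_component = power_iteration(corr)
# Share of total variance carried by the top component
# (the trace of a 4x4 correlation matrix is 4).
explained = top_eigenvalue / 4.0
```

Here the dominant eigenvalue is 1 + 2(0.9) = 2.8, so one component explains 70% of the variance, and its eigenvector has essentially no weight on the independent fourth asset. Each matrix-vector product costs O(n²) classically, which is the scaling quantum-PCA aims to beat.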

**Neural networks and associated models**

Neural networks have proved extremely successful at predicting markets and analyzing credit risk. The key to this success lies in their ability to tackle tasks which require intuitive judgment and to draw conclusions even from incomplete datasets. These properties have made neural networks essential for, e.g., financial time-series prediction.

While forecasting from already trained machine learning algorithms is usually extremely efficient, the training itself can be computationally expensive. This overhead could be significantly reduced by training the neural network using a quantum annealer, such as the D-Wave. Once trained, the algorithm may be run on any classical computer. There are some lines of activity in this field:

– In an early implementation, a D-Wave quantum computer was used in a proof of concept to train a specific type of neural network (a Boltzmann machine).

– Alternatively, the training process could be sped up exponentially by using quantum PCA, which was described in the previous section.

Another possibility altogether is to design new, fully quantum neural network algorithms able to learn much more complex data patterns than those identifiable with a classical neural network. This is particularly interesting for Hidden Markov models, which are commonly used in financial forecasting. Quantum Hidden Markov models may deliver better and faster forecasts.

**QUANTUM AMPLITUDE ESTIMATION AND MONTE CARLO**

The Monte Carlo method is a technique used to estimate a system’s properties stochastically, by statistically sampling realizations of the system. It is ubiquitous in science, with applications in physics, chemistry, engineering, and finance. Where it really shines is in dealing with extremely large or complex systems, which cannot be approached analytically or handled through other methods. In finance, the stochastic approach is typically used to simulate the effect of uncertainties affecting the financial object in question, which could be a stock, a portfolio, or an option. This makes Monte Carlo methods applicable to portfolio evaluation, personal finance planning, risk evaluation, and derivatives pricing.

The problem with Monte Carlo methods is that, if we want to obtain the most probable outcome of a wide distribution, or a result with a very small associated error, the required number of simulations can become gigantic. This is the case for stock market simulations, for instance, which routinely run for a whole day. In this situation, obtaining a quantum speedup could make a notable difference.
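The cost of accuracy can be seen in a few lines. The sketch below (a stylised lognormal "stock price", illustrative only) estimates an expectation by sampling: because the standard error falls as 1/sqrt(N), each additional digit of accuracy costs 100x more samples, which is precisely the scaling QAE improves quadratically.

```python
import math
import random

# Estimate E[S] for a stylised lognormal price S = exp(Z), Z ~ N(0, 1),
# whose exact mean is e^{1/2}. The Monte Carlo standard error shrinks
# only as 1/sqrt(N): 1000x more samples buys ~30x more accuracy.
rng = random.Random(7)
true_mean = math.exp(0.5)

def mc_estimate(n_samples):
    total = sum(math.exp(rng.gauss(0.0, 1.0)) for _ in range(n_samples))
    return total / n_samples

err_small = abs(mc_estimate(100) - true_mean)       # noisy estimate
err_large = abs(mc_estimate(100_000) - true_mean)   # far more accurate
```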

An important ingredient of a quantum-accelerated Monte Carlo method was recently suggested: the Quantum Amplitude Estimation (QAE) algorithm. QAE can sample a probability distribution quadratically faster than the best classical method, allowing it to estimate expectation values extremely efficiently. In the following sections, we describe two specific problems in finance which can be tackled using this method.

**Pricing of financial derivatives**

Financial derivatives are contracts whose payoff depends on the future price trajectory of some asset (which can have a stochastic nature). Brokers must know how to assign a fair price to a derivative given the state of the market. This is the pricing problem. The classical approaches to this problem are simplified scenarios, such as the Black-Scholes-Merton model, and Monte Carlo sampling. Due to the growing number of derivative products, only Monte Carlo simulations remain viable, but at a huge computational cost and with lengthy execution times. A quantum-accelerated Monte Carlo algorithm has been formulated which tackles this problem, providing a quadratic speedup.
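As a classical baseline, here is a minimal Monte Carlo pricer for a European call under Black-Scholes-Merton dynamics (illustrative parameters), checked against the closed-form price. The quantum-accelerated algorithm targets exactly this kind of expectation-value estimation.

```python
import math
import random

# Illustrative parameters: spot, strike, risk-free rate, volatility, maturity.
s0, strike, rate, vol, maturity = 100.0, 105.0, 0.01, 0.2, 1.0

def bs_call(s, k, r, sigma, t):
    """Closed-form Black-Scholes-Merton call price, used as a benchmark."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return s * cdf(d1) - k * math.exp(-r * t) * cdf(d2)

def mc_call(s, k, r, sigma, t, n_paths=200_000, seed=3):
    """Monte Carlo estimate: average the discounted payoff over GBM paths."""
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):
        # Terminal price under geometric Brownian motion:
        # S_T = S_0 * exp((r - sigma^2/2) t + sigma sqrt(t) Z), Z ~ N(0, 1).
        z = rng.gauss(0.0, 1.0)
        s_t = s * math.exp((r - 0.5 * sigma**2) * t + sigma * math.sqrt(t) * z)
        payoff_sum += max(s_t - k, 0.0)
    return math.exp(-r * t) * payoff_sum / n_paths

exact = bs_call(s0, strike, rate, vol, maturity)
estimate = mc_call(s0, strike, rate, vol, maturity)
```

Even for this single vanilla option, 200,000 paths are needed to pin the price down to a few cents; exotic products and whole books multiply that cost, which is where a quadratic speedup matters.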

**Risk analysis**

Financial institutions need to be able to accurately manage and compute risk. One way to mathematically quantify risk is through the Value at Risk (VaR) function, which summarizes the distribution of losses of a portfolio by reporting the loss that will not be exceeded at a given confidence level. An equally important quantity is the Conditional Value at Risk (CVaR), the expected loss incurred once the VaR breakpoint has been reached. Both concepts are of importance in portfolio optimization.

Typically, in quantitative finance, VaR and CVaR are estimated using Monte Carlo sampling of the relevant probability distribution. By applying the QAE algorithm, it is possible to determine VaR and CVaR with excellent accuracy and a quadratic speedup relative to classical methods. This has been tested on some small examples on the IBM Q Experience quantum computer.
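A classical sketch of the quantities involved, using a toy portfolio whose one-day loss is standard normal (illustrative numbers, not a real book): VaR is read off as a quantile of the simulated losses, and CVaR as the average loss beyond it.

```python
import random

# Toy portfolio: one-day loss (in millions) drawn from a standard normal.
# All numbers are illustrative.
rng = random.Random(11)
losses = sorted(rng.gauss(0.0, 1.0) for _ in range(100_000))

alpha = 0.95
idx = int(alpha * len(losses))
var = losses[idx]              # 95% Value at Risk: the 95th-percentile loss
tail = losses[idx:]
cvar = sum(tail) / len(tail)   # expected loss, given the VaR level is breached

# For a standard normal, the exact values are VaR(95%) ≈ 1.645
# and CVaR(95%) ≈ 2.063; the Monte Carlo estimates land close to these.
```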

**CONCLUSIONS AND PERSPECTIVES**

We have reviewed ways in which quantum computing could disrupt finance. As we have seen, this field is developing at a striking rate, partly due to experimental developments in quantum hardware, which are surpassing all expectations, and partly due to conceptual leaps, which promise unprecedented speedups for widely applicable algorithms. We believe that before long quantum computers will play a key role in quantitative finance.

We caution the reader that a few experimental breakthroughs will be necessary before we can construct a universal quantum processor capable of surpassing present-day supercomputers. We will need, for instance, to vastly increase the quality of qubits to implement some of the algorithms detailed here. It is possible, however, that imperfect, noisy quantum computers will find interesting applications well before we achieve fault-tolerant quantum computing. We expect that it is in this area that the first real disruptions to finance will occur, and we urge researchers to investigate this fascinating direction of study.