Alex Khan, Clark Alexander Ph.D., and I completed our work on classical portfolio optimization. Given long run times, our classical jobs can optimize more than 32 assets at once. We can brute-force a 32-asset portfolio on an aged laptop, and made progress toward completing 40 assets in one run. We stopped that run, which would require analyzing over 1 trillion portfolios, because our server was producing enough heat to make our furnace redundant. We have developed a concept and associated workflow that allows us to analyze hundreds of assets and create superior portfolios.
We ran the Sharpe Ratio and created our own 'proprietary' classical formulation, which we call the Chicago Quantum Ratio. It achieves similar portfolio optimization results with cleaner math that should run better on a quantum computer.
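A minimal sketch of the Sharpe Ratio for an equal-weight portfolio, assuming synthetic daily returns and a zero risk-free rate (illustrative only; the Chicago Quantum Ratio formula is proprietary and not shown here):

```python
import statistics

def sharpe_ratio(portfolio_returns, risk_free_rate=0.0):
    """Sharpe ratio: mean excess return divided by its standard deviation."""
    excess = [r - risk_free_rate for r in portfolio_returns]
    return statistics.mean(excess) / statistics.stdev(excess)

# Equal-weight portfolio of three hypothetical assets (made-up daily returns)
asset_returns = [
    [0.010, -0.020, 0.015, 0.005],   # asset A
    [0.020, -0.010, 0.000, 0.010],   # asset B
    [-0.005, 0.030, 0.010, 0.002],   # asset C
]
n = len(asset_returns)
# The portfolio's daily return is the equal-weight average across assets
portfolio = [sum(day) / n for day in zip(*asset_returns)]
print(round(sharpe_ratio(portfolio), 4))
```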
Since April 10 we have been running different formulations of the problem (with 20 assets) as a QUBO on D-Wave Systems' DW_2000Q_5 solver, a 2030-qubit system. We continue to run different formulations of the QUBO (or Binary Quadratic Model, BQM) to replicate the optimal portfolios selected using the Sharpe Ratio and the Chicago Quantum Ratio.
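The exact QUBO formulation is not spelled out here, but a common mean-variance style QUBO puts risk-weighted variance minus expected return on the diagonal and risk-weighted covariances off the diagonal. A minimal sketch under that assumption, with made-up returns, covariances, and risk weight `theta`, solved by exhaustive enumeration as a stand-in for the annealer:

```python
from itertools import product

# Hypothetical inputs: expected returns and covariance matrix for 3 assets
mu = [0.10, 0.07, 0.12]
cov = [
    [0.040, 0.006, 0.010],
    [0.006, 0.030, 0.004],
    [0.010, 0.004, 0.050],
]
theta = 0.5  # risk-aversion weight (an assumption)

# QUBO matrix: Q[i][i] = theta*cov[i][i] - mu[i]; Q[i][j] = theta*cov[i][j]
n = len(mu)
Q = [[theta * cov[i][j] for j in range(n)] for i in range(n)]
for i in range(n):
    Q[i][i] -= mu[i]

def energy(x):
    """QUBO objective x^T Q x for a binary asset-selection vector x."""
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

# Exhaustively score all 2^n selections (the annealer samples this instead)
best = min(product([0, 1], repeat=n), key=energy)
print(best, round(energy(best), 4))
```

On a real run the matrix `Q` would be handed to a D-Wave sampler as a BQM rather than enumerated classically.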
- We started with one random sample portfolio and achieved acceptable results.
- We are running a second random sample, which has significant negative correlations between stocks.
- We are setting up a third random sample...
Our goal is to replicate, or deliver results equivalent to, our brute-force solvers, and then increase the problem size that can be run on a quantum system.
As always, please reach out to us for more information, to work with us, or to join the team.
Clients are always welcome!
Thank you, Jeffrey Cohen, President US Advanced Computing Infrastructure, Inc., on behalf of our team.
April 3, 2020: Chicago Quantum:
Quick progress update. We are analyzing 28 highly liquid US equities. We can run them classically, in sections, through a 100% comprehensive 'brute force' analysis to find the best combination of assets to hold in a portfolio. We can then run them in different combinations to find even better (or worse) portfolio combinations. This is delivering insights on asset performance, both good and bad.
We are running these on a dedicated server and our personal computing devices while we push the envelope on classical methods. Our test is how many assets we can run at one time. N assets create 2^N - 1 unique portfolios (every non-empty, equal-weight subset). In other words, 28 assets yield ~268 million unique portfolios (what we are working on classically now), and 30 assets would be ~1 billion. Two somewhat aged PCs each analyzed 28 assets in one run. One is working through 32 assets (~4 billion unique portfolios).
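The counts above follow directly from the 2^N - 1 formula:

```python
# Number of non-empty equal-weight portfolios from N assets: 2^N - 1
for n in (28, 30, 32, 40):
    print(f"{n} assets -> {2**n - 1:,} portfolios")
```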
Our HP Z420 workstation analyzed over 550 million unique stock portfolios on its way to one billion (or more). We expect to hit one billion analyzed by tomorrow. That server has 48GB RAM, a 1TB disk drive, and two SSD drives. That means we calculate the Sharpe Ratio for every portfolio combination of those 28 equities, with standard equal weights, to find the best portfolios for an investor like you or me. This is brute-force analysis: the equivalent of looking for a needle in a haystack by picking up and moving every single piece of hay until we find the one with a needle stuck in it.
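A minimal sketch of that brute-force pass, scoring every non-empty subset at equal weights via bitmasks (four assets here rather than 28; the daily returns and helper names are illustrative assumptions, not our actual data or code):

```python
import statistics

# Hypothetical daily returns for 4 assets (rows = assets, columns = days)
returns = [
    [0.010, -0.005, 0.012, 0.003],
    [0.004, 0.006, -0.002, 0.008],
    [-0.010, 0.015, 0.000, 0.005],
    [0.002, 0.002, 0.003, 0.001],
]
n_assets = len(returns)

def equal_weight_sharpe(mask):
    """Sharpe ratio of the equal-weight portfolio selected by bitmask."""
    members = [returns[i] for i in range(n_assets) if mask >> i & 1]
    port = [sum(day) / len(members) for day in zip(*members)]
    m, s = statistics.mean(port), statistics.stdev(port)
    return m / s if s > 0 else float("inf")

# Enumerate all 2^N - 1 non-empty subsets: every piece of hay in the stack
best_mask = max(range(1, 2**n_assets), key=equal_weight_sharpe)
best = [i for i in range(n_assets) if best_mask >> i & 1]
print(best_mask, best)
```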
We continue to believe that the exponentially growing complexity of portfolio optimization is a good candidate for quantum approaches. By the way, we would like to look at hundreds of stocks at once, which would mean a brute-force search over 2^N portfolios for N in the hundreds.
We are finding valuable insights in the US equity market data in terms of building higher expected returns with lower standard deviation of returns. In other words, we aim to build a portfolio that we think delivers more profit with less risk.
Today our team met and discussed our results running on a quantum annealing system. Our QUBO can point in the direction of our perfect, classical results, but is not quite right: it picks some, but not all, of the assets in our best portfolios. Over the weekend we will update our QUBO, and re-think the classical approach again, to see if we can advance both efforts.
We continue to work on the QUBO to run these as an optimization on a quantum annealing system.
Future steps for our team (meaning maybe next week):
1) Integrate bonds and commodities into our analysis. Does it help generate better portfolios?
2) Look at 'classical, probabilistic' methods to reduce the number of unique portfolios to analyze while still arriving at the best (or almost best) answer. This could include using random sampling to seed a genetic algorithm. This is important if we want to look at all 240 assets in one run, because we cannot afford a server that can execute bootstrap analysis on 2^240-1 unique portfolios.
3) Find a brute force, classical approach to analyzing hundreds of stocks...to make the problem even more challenging for a quantum computer.
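A sketch of point 2 above, assuming 'sampling for seeding a genetic algorithm' means seeding the initial population with random portfolios. The fitness function, population size, and mutation rate are all placeholder assumptions; a real run would score each chromosome with the Sharpe Ratio:

```python
import random

random.seed(7)
N = 20  # assets; a chromosome is a bitstring selecting a portfolio
WEIGHTS = [random.uniform(-1, 1) for _ in range(N)]  # stand-in asset scores

def fitness(bits):
    """Placeholder objective; a real run would compute the Sharpe Ratio
    of the equal-weight portfolio selected by `bits`."""
    return sum(b * w for b, w in zip(bits, WEIGHTS)) - 0.1 * sum(bits) ** 1.5

def mutate(bits, rate=0.05):
    # Flip each bit independently with small probability
    return [b ^ (random.random() < rate) for b in bits]

def crossover(a, b):
    # Single-point crossover between two parent chromosomes
    cut = random.randrange(1, N)
    return a[:cut] + b[cut:]

# Seed the population with random sample portfolios (the 'sampling' step)
pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(50)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]  # elitist selection: keep the 10 best
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(40)]
    pop = parents + children
best = max(pop, key=fitness)
print(sorted(i for i, b in enumerate(best) if b))
```

The point of the seeding step is that each generation only evaluates a few dozen portfolios instead of all 2^N - 1, trading a guarantee of optimality for tractability.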
Contact us for more info...and watch for publications / videos where we start to explain our results.
Strategic IT Management Consultant with a strong interest in quantum computing. I have been consulting for 29 years, and this looks as interesting as cloud computing did in 2010.