April 3, 2020: Chicago Quantum:
Quick progress update. We are analyzing 28 highly liquid US equities. We can run them classically, in sections, through a 100% comprehensive 'brute force' analysis to find the best combination of assets to hold in a portfolio. We can then run them in different combinations to find even better (or worse) portfolio combinations. This is delivering insights on asset performance, both good and bad.

We are running these analyses on a dedicated server and on our personal computing devices while we push the envelope on classical methods. Our test is how many assets we can run at one time. N assets create 2^N - 1 unique portfolios. In other words, 28 assets yield ~268 million unique portfolio options (what we are working on classically now), and 30 assets would be ~1 billion unique portfolios. Two somewhat aged PCs each analyzed 28 assets in one run; one is working through 32 assets (roughly 4 billion unique portfolios). Our HP Z420 workstation analyzed over 550 million unique stock portfolios on its way to one billion (or more), and we expect to hit one billion analyzed by tomorrow. That server has 48GB of RAM, a 1TB disk drive, and two SSD drives.

This means we calculate the Sharpe ratio for every combination of those 28 equities, with standard, equal weights, to find the best portfolios for an investor like you or me (a minimal sketch of this loop appears at the end of this update). This is brute force analysis: the equivalent of looking for a needle in a haystack by picking up and moving every single piece of hay until we find the one with a needle stuck in it. We continue to believe that the exponentially growing complexity of portfolio optimization is a good candidate for quantum approaches. By the way, we would like to look at hundreds of stocks at once, which would be a brute force search over on the order of 2^100 or more unique portfolios.

We are finding valuable insights in the US equity market data in terms of building higher expected returns with lower standard deviation of returns. In other words, a portfolio that we think delivers more profit with less risk.

Today our team met and discussed our results running on a quantum annealing system. Our QUBO can point in the direction of our perfect, classical results, but is not quite right: it picks some, but not all, of the assets in our best portfolios (a generic QUBO sketch is included below as well). Over the weekend we will update our QUBO, and re-think the classical approach again, to see if we can advance both efforts. We continue to work on the QUBO so we can run these problems as an optimization on a quantum annealing system.

Future steps for our team (meaning maybe next week):

1) Integrate bonds and commodities into our analysis. Does it help generate better portfolios?

2) Look at 'classical, probabilistic' methods to reduce the number of unique portfolios to analyze while still arriving at the best (or almost best) answer. This could include using sampling to seed a genetic algorithm (see the sketch at the end of this update). This is important if we want to look at all 240 assets in one run, because we cannot afford a server that can execute a brute force analysis on 2^240 - 1 unique portfolios.

3) Find a brute force, classical approach to analyzing hundreds of stocks, to make the problem even more challenging for a quantum computer.

Contact us for more info, and watch for publications and videos where we start to explain our results.
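For readers who want to see the mechanics, here is a minimal Python sketch of the brute-force, equal-weight Sharpe ratio search described above. It assumes a matrix of periodic returns and a zero risk-free rate, and illustrates the idea rather than our production setup:

```python
import itertools
import numpy as np

def best_equal_weight_portfolio(returns, tickers):
    """Exhaustively score all 2^N - 1 equal-weight subsets of N assets.

    returns: (num_periods, num_assets) array of periodic returns.
    Assumes a zero risk-free rate, so Sharpe = mean / std of the
    portfolio's return series.
    """
    n = returns.shape[1]
    best_sharpe, best_combo = -np.inf, ()
    for k in range(1, n + 1):
        for combo in itertools.combinations(range(n), k):
            port = returns[:, list(combo)].mean(axis=1)  # equal weights
            sd = port.std()
            if sd == 0:
                continue
            sharpe = port.mean() / sd
            if sharpe > best_sharpe:
                best_sharpe, best_combo = sharpe, combo
    return best_sharpe, [tickers[i] for i in best_combo]

# Toy run: 5 assets, 250 days of synthetic returns (2^5 - 1 = 31 subsets).
rng = np.random.default_rng(0)
rets = rng.normal(0.0005, 0.01, size=(250, 5))
sharpe, picks = best_equal_weight_portfolio(rets, ["A", "B", "C", "D", "E"])
print(f"best Sharpe {sharpe:.3f} from {picks}")
```

The loop visits every one of the 2^N - 1 subsets, which is why 28 assets means ~268 million Sharpe calculations and why the runs described above take a workstation many hours.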
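As a generic illustration of the QUBO side (not necessarily the formulation we run), a common Markowitz-style QUBO minimizes risk_weight * x^T Sigma x - mu^T x over binary selection vectors x. Because x_i^2 = x_i for binary variables, the linear return terms can be folded onto the diagonal of the Q matrix, which is the form a quantum annealer (or any QUBO solver) expects:

```python
import numpy as np

def build_portfolio_qubo(mu, sigma, risk_weight=0.5):
    """Return Q such that x^T Q x = risk_weight * x^T Sigma x - mu^T x.

    x is a binary asset-selection vector; since x_i^2 = x_i, the
    linear return terms live on the diagonal.
    """
    q = risk_weight * np.array(sigma, dtype=float)
    q[np.diag_indices_from(q)] -= mu
    return q

def qubo_energy(q, x):
    return float(x @ q @ x)

# Toy check: enumerate every non-empty selection of 4 assets.
mu = np.array([0.08, 0.12, 0.10, 0.07])    # made-up expected returns
sigma = np.diag([0.04, 0.09, 0.05, 0.03])  # made-up (diagonal) covariance
q = build_portfolio_qubo(mu, sigma)
bitstrings = (np.array([(i >> b) & 1 for b in range(4)]) for i in range(1, 16))
best = min(bitstrings, key=lambda x: qubo_energy(q, x))
print("lowest-energy selection:", best)
```

Tuning risk_weight shifts the low-energy solutions between return-seeking and risk-avoiding selections, which is one knob we can adjust when the annealer's picks only partially overlap our brute-force winners.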
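For future step 2, here is one hedged sketch of what 'sampling to seed a genetic algorithm' could look like: random bitstrings seed a population, then selection, crossover, and mutation push it toward high-Sharpe subsets. The operator choices here (truncation selection, uniform crossover, bit-flip mutation) are illustrative assumptions, not a method we have committed to:

```python
import numpy as np

def equal_weight_sharpe(returns, mask):
    """Equal-weight Sharpe (zero risk-free rate) of assets picked by a boolean mask."""
    if not mask.any():
        return -np.inf
    port = returns[:, mask].mean(axis=1)
    sd = port.std()
    return port.mean() / sd if sd > 0 else -np.inf

def ga_search(returns, pop_size=200, generations=50, mutation_rate=0.02, seed=0):
    """Seed a population by uniform random sampling, then evolve it."""
    rng = np.random.default_rng(seed)
    n = returns.shape[1]
    pop = rng.random((pop_size, n)) < 0.5  # random-sampling seed
    for _ in range(generations):
        scores = np.array([equal_weight_sharpe(returns, ind) for ind in pop])
        keep = pop[np.argsort(scores)[-(pop_size // 2):]]  # truncation selection
        parents = keep[rng.integers(0, len(keep), size=(pop_size, 2))]
        cross = rng.random((pop_size, n)) < 0.5            # uniform crossover
        pop = np.where(cross, parents[:, 0], parents[:, 1])
        pop ^= rng.random((pop_size, n)) < mutation_rate   # bit-flip mutation
    scores = np.array([equal_weight_sharpe(returns, ind) for ind in pop])
    best = scores.argmax()
    return scores[best], np.flatnonzero(pop[best])

# Toy run: 30 synthetic assets, already painful for brute force on a PC.
rng = np.random.default_rng(1)
rets = rng.normal(0.0005, 0.01, size=(250, 30))
score, picks = ga_search(rets)
print(f"GA best Sharpe {score:.3f} using assets {picks}")
```

With these defaults the search evaluates on the order of 10,000 portfolios per run, versus the 2^240 - 1 that a brute force pass over 240 assets would require; it trades a guarantee of the best answer for a tractable shot at an almost-best one.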