Google Says Quantum Computer Beat 10,000-Year Task in Minutes
This article by Amy Thomson for Bloomberg may be of interest to subscribers. Here is a section:
Alphabet Inc.’s Google said it’s built a computer that’s reached “quantum supremacy,” performing a computation in 200 seconds that would take the fastest supercomputers about 10,000 years.
The results of Google’s tests, which were conducted using a quantum chip it developed in-house, were published Wednesday in the scientific journal Nature.
“This achievement is the result of years of research and the dedication of many people,” Google engineering director Hartmut Neven said in a blog post. “It’s also the beginning of a new journey: figuring out how to put this technology to work. We’re working with the research community and have open-sourced tools to enable others to work alongside us to identify new applications.”
The question is not whether Google has achieved quantum supremacy or whether IBM will get there first. Rather, the point is that quantum mechanics has gone from philosophy to practicality in less than a century. Consider that the Greeks hypothesised the existence of the atom thousands of years ago, yet the nuclear age did not begin until about 75 years ago. This is a clear example of the exponential pace of technological innovation.
One of the biggest issues with a data-driven economy is the vast quantity of information being created every day. Whether in the social, physical, climate, genomic, geophysical or other fields, the challenge is to analyse the data for useful action points on which to build a strategy or business. Quantum computers’ grounding in many-worlds-style superposition, evaluating many possible states at once, and in Leibniz’s theory of the best of all possible worlds means they are ideally positioned to provide a line of best fit through massive quantities of data which conventional computers have difficulty analysing.
The simultaneous evolution of chaos theory and of algorithms that can predict outcomes where no organised system has ever been observable before is an equally important development. This all but guarantees that big data and machine learning will be the big investment themes of the next decade.
The challenge right now is that the majority of companies purporting to offer that kind of market exposure today are behind the curve of this technological evolution. That suggests some reordering of the constituents of robotics/automation and AI ETFs is inevitable.
The UK-listed but Dollar-denominated L&G ROBO Global Robotics and Automation UCITS ETF (ROBO LN) lost uptrend consistency in January 2018 and shares a similar pattern with the US-listed Robo Global Robotics & Automation Index ETF. It rebounded in early 2019 and has been ranging for much of the last six months. It is now testing the sequence of lower rally highs, and a sustained move above $16 would break the downtrend.
Alphabet/Google is testing the upper side of its range, and a sustained move below $1000 would be required to question medium-term scope for continued upside.
Meanwhile, I’d like to thank a subscriber for this note from George Friedman, which may be described as cautionary. He is not the first to comment that all good things come to an end, but we need to remember the market is forward-looking and prices discount future potential, not the past.