
Quantum computing

Gary Tinnams explores the world of quantum computing and how the efforts of IBM and Google might one day bear fruit. Enter the megacomputer.

In 1965, Intel co-founder Gordon Moore observed that computing was increasing dramatically in power, and decreasing in relative cost, at an exponential pace. Known as Moore’s Law, his observation holds that the number of transistors in a dense integrated circuit doubles approximately every two years. But all that could be about to change.

Intel has suggested that silicon transistors can only keep shrinking for another five years, according to MIT Technology Review, at which point shrinking computers while increasing their processing power will no longer be possible in the traditional sense. In search of further gains, a number of companies have poured significant resources into researching an alternative route: quantum computing. The potential impact on finance is far-reaching – as one quantum computing innovator has speculated, it could completely revolutionise risk modelling.

This is an extract from the Business & Management Magazine, Issue 256, July/August 2017. 

Find out more

The full article is available to Business and Management Faculty members and subscribers of Faculties Online. To read the complete article, join the Business and Management Faculty or subscribe to Faculties Online.