A pair of physicists has shown that computers have a speed limit as unbreakable as the speed of light. If processors keep accelerating as they have in the past, we'll hit that ultimate limit in less than a century.
Intel co-founder Gordon Moore predicted 40 years ago that manufacturers could double computing speed every two years or so by cramming ever-tinier transistors on a chip. His prediction became known as Moore's Law, and it has held true throughout the evolution of computers -- the fastest processor today beats out a ten-year-old competitor by a factor of about 30.
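That "factor of about 30" falls straight out of the doubling rule: ten years at one doubling every two years is five doublings, or 2^5 = 32. A quick back-of-the-envelope check (the function name is my own):

```python
# Sanity-check the Moore's Law arithmetic quoted above: doubling
# every two years over a ten-year span means five doublings,
# i.e. a factor of 2**5 = 32 -- "about 30".

def moore_speedup(years: float, doubling_period: float = 2.0) -> float:
    """Speedup factor implied by doubling every `doubling_period` years."""
    return 2.0 ** (years / doubling_period)

print(moore_speedup(10))  # 32.0
```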
If components are to continue shrinking, physicists must eventually encode bits of information onto ever-smaller particles. Smaller means faster in the microelectronic world, but physicists Lev Levitin and Tommaso Toffoli of Boston University in Massachusetts have slapped a speed limit on computing, no matter how small the components get.
"If we believe in Moore's law ... then it would take about 75 to 80 years to achieve this quantum limit," Levitin said.
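It's worth pausing on what 75 to 80 years of Moore's Law actually means. Assuming the same doubling every two years (my extrapolation, not Levitin's calculation), that's 37 to 40 more doublings before the quantum wall, a sketch:

```python
# How much faster would processors get before hitting the quantum
# limit, if Moore's Law (doubling every two years) held for the
# 75-80 years Levitin mentions? Roughly 37-40 more doublings.

def doublings(years: float, period: float = 2.0) -> float:
    """Number of doublings that fit in `years` at one per `period` years."""
    return years / period

for years in (75, 80):
    factor = 2.0 ** doublings(years)
    print(f"{years} years -> {doublings(years):.1f} doublings, "
          f"~{factor:.2e}x faster")
```

In other words, the remaining headroom before the wall would be a factor of roughly 10^11 to 10^12 over today's processors.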
And it's quite possible that it will happen in my lifetime, too:
Scott Aaronson, an assistant professor of electrical engineering and computer science at the Massachusetts Institute of Technology in Cambridge, thought Levitin's estimate of 75 years extremely optimistic.
Moore's Law, he said, probably won't hold for more than 20 years.
I find it amazing that human ingenuity might take the electronic computer from its birth* in 1943 to its ultimate limit in less than a century.
And that's why I hate prodnoses and Luddites who want to curb human ingenuity. Don't bitch about the carbon emissions of the Saturn V rocket; instead, let man reach for the stars!
*I regard the code-breaking Colossus as the first real electronic computer: binary, programmable and capable of conditional processing.