Wednesday, 14 October 2009

Computers will stop getting faster

It seems that scientists have found an absolute speed limit for computers:

A pair of physicists has shown that computers have a speed limit as unbreakable as the speed of light. If processors continue to accelerate as they have in the past, we'll hit the wall of faster processing in less than a century.

Intel co-founder Gordon Moore predicted 40 years ago that manufacturers could double computing speed every two years or so by cramming ever-tinier transistors on a chip. His prediction became known as Moore's Law, and it has held true throughout the evolution of computers -- the fastest processor today beats out a ten-year-old competitor by a factor of about 30.

If components are to continue shrinking, physicists must eventually code bits of information onto ever smaller particles. Smaller means faster in the microelectronic world, but physicists Lev Levitin and Tommaso Toffoli at Boston University in Massachusetts have slapped a speed limit on computing, no matter how small the components get.

"If we believe in Moore's laW ... then it would take about 75 to 80 years to achieve this quantum limit," Levitin said.


And it's quite possible that it will happen in my lifetime, too:

Scott Aaronson, an assistant professor of electrical engineering and computer science at the Massachusetts Institute of Technology in Cambridge, thought Levitin's estimate of 75 years extremely optimistic.

Moore's Law, he said, probably won't hold for more than 20 years.

I find it amazing that human ingenuity might take the electronic computer from its birth* in 1943 to its ultimate limit in less than a century.

And that's why I hate prodnoses and Luddites who want to curb human ingenuity. Don't bitch about the carbon emissions of the Saturn V rocket; instead, let man reach for the stars!

*I regard the code-breaking Colossus as the first real electronic computer: binary, programmable and capable of conditional processing.

21 comments:

Von Spreuth. said...

"Computers will stop getting faster".

Good. It MAY give me chance to fucking catch up. Every time I buy a "new" one it is fucking "dead" three weeks later.

Tom Paine said...

My daughters had an interesting insight about my (sometimes quite negative) blogging. They said that I am not a pessimist, but a highly-frustrated idealist. I see from your final paragraph that you are too! Most of mankind is still grubbing in the mud, but we are asking them to reach for the stars.

Weston Bay said...

This idea that computer processing speed will eventually reach a fundamental limit has been around for a few years now. I'm no physicist but I think it's something to do with quantum thermodynamics. If there's a brainiac just passing by, please enlighten me on this one.

We won't need to chuck away our PCs/laptops any time soon though, and advancements in superconductivity continue apace, which will also increase speeds dramatically - at least I bloody hope so!

Interesting post Obo.

Roue le Jour said...

Keep this up, Obo, and I'm going to have move you from my "Ranters" folder to my "Technology" folder.

Tommy Flowers, what a man, eh? My hero.

Steve Antony Williams said...

And the ironic thing is when you get your new "fast as possible" PC, it STILL won't run Windows Vista smoothly ahahahahaha

Jim Baxter said...

'let man reach for the stars!'

Exactly. If there is to be any point to the sordidity that passes for human existence, lovely ladies excepted, it is that.

And - the Saturn V only generated water, didn't it? Making it might have been a different matter, I suppose.

JuliaM said...

"Good. It MAY give me chance to fucking catch up. Every time I buy a "new" one it is fucking "dead" three weeks later."

Only if you want to play games or do very heavy-duty graphics or processing, surely?

For browsing the net and running basic software, most bargain-basement PCs will do the job.

Von Spreuth. said...

JuliaM said...

"Good. It MAY give me chance to fucking catch up. Every time I buy a "new" one it is fucking "dead" three weeks later."

Only if you want to play games or do very heavy-duty graphics or processing, surely?

For browsing the net and running basic software, most bargain-basement PCs will do the job.


Perfect example within the hour. Just got a new digital camera. Nowt special, just a €200 cheapy. But my fucking computer will not take the driver software, and as the whole bastarding thing is formatted for that fucking Vista shite, the thing is useless.

Umbongo said...

This kind of prediction is total - and unscientific - junk. Nobody knows what discoveries about the nature of the universe will be made in the future, so no one can predict the limit to computer speeds. The most that can be said is a highly qualified prediction that, given current technology and current scientific knowledge, there is probably a limit to computer speed - but that's as far as you can go.

Ryan said...

By then we will probably have quantum computing, which is a non-deterministic mindfuck, and I will hopefully die before I need to understand it.

We have always had to deal with a hard practical limit on the speed of computation.

Diplodocus Rex said...

Moore's Law has already been broken. Since Intel started producing Duo and Quad core processors all they did was stack the Pentiums one on top of each other in the same package. The geometry did not get smaller as had happened in previous generations.

Roue le Jour said...

Ryan, I wouldn't worry about quantum computing, it's just a scam to pay physicists' and mathematicians' mortgages. Oh, and generate lots of total bollocks articles that start off "You know how bits can be zero or one? Well qubits can be both! At the same time! Isn't that amazing!"

They need to entangle about a thousand of the bloody things to do anything useful. Currently they're up to, oohh, I forget, sixteen or twenty. And adding new bits at the rate of about two a year. Set your alarm for 2500, that should do it.
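
For what it's worth, those figures do land on 2500 (taking them entirely at face value; they are the commenter's numbers, not real qubit-count data):

    # Sketch using only the figures in the comment above.
    current_qubits = 20      # "sixteen or twenty"
    needed_qubits = 1000     # "about a thousand"
    rate_per_year = 2        # "about two a year"

    years_needed = (needed_qubits - current_qubits) / rate_per_year
    print(years_needed)          # 490.0
    print(2009 + years_needed)   # 2499.0 - i.e. "set your alarm for 2500"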

Von Spreuth. said...

Being a techy dinosaur, I may be missing something here, but is the speed of light not a limiting factor in increasing speeds beyond a certain range?

Sperm Lewis said...

http://news.bbc.co.uk/1/hi/england/2541761.stm

'Each of the [sheep] has a word from a poem written on their backs and as they wander about the words take a new poetic form each time they come to rest.

But the exercise is not just an attempt to create living poems, it is also, according to the poet, an exercise in quantum mechanics...

A spokesman for Northern Arts called the scheme "an exciting fusion of poetry and quantum physics". [and wool?]

One of the poems created by the sheep reads:

Warm drift, graze gentle, White below the sky, Soft sheep, mirrors, Snow clouds.'

We put this to the test in Rhondda Cynon Taff, using snippets from this very blog. The result:

'F Cockmonger, FF Firewall, One-eyed F mentalist, F Salmonella Kerry McCarthy'.

Shug Niggurath said...

As long as I get to stream my porn in real time and type my letters to the editor without the screen having to catch up I couldn't give a fuck!

Mitch said...

Deep Thought from The Hitchhiker's Guide couldn't run that malware called Vista smoothly.

microdave said...

This isn't new - some 10 years ago I attended a talk by some boffins from BT's Martlesham laboratories, and they were talking about the speed of light being a limiting factor then. They claimed to be working on ways round it, but wouldn't give any details...

@ Von Spreuth - be thankful, you are probably better off without it. My father recently bought a Samsung 'phone, and the supplied software package was over 165 MB, virtually all of it unneeded for our purposes. I just bought a £4.99 micro SD card adaptor instead!

Von Spreuth said...

Thanks Mitch.

Will do the same I think.

Von Spreuth. said...

Sorry, Microwave Dave, I should have been thanking you.

HEY, it is early, and I have not had my coffee yet.

Clive said...

"Getting round the speed of light":

With a 3GHz CPU, light travels 10cm per tick. SATA-2 is also clocked at 3GHz (one bit is sent down the wire each tick), and the cables are longer than 10cm. You just have several bits of data inside the cable at once.
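
The 10cm figure checks out (a minimal sketch; c is the speed of light in vacuum, and signals in real cables travel a bit slower, so the true distance per tick is shorter still):

    # Distance light travels during one tick of a 3 GHz clock.
    c = 299_792_458      # speed of light in vacuum, metres per second
    clock_hz = 3e9       # 3 GHz CPU clock / SATA-2 line rate

    metres_per_tick = c / clock_hz
    print(metres_per_tick * 100)   # ~10 cm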

The fundamental limit described in the article is based on Heisenberg's Uncertainty Principle, whose basic idea is this:

To observe an electron, you must bounce a photon of light off it. If you use low-energy photons, they have a long wavelength, so you can't measure the position of the electron very accurately. If you use a high-energy photon, it deflects the electron to a greater extent, interfering with any attempt to measure its momentum.

You can't (according to H.U.P.) accurately measure both the position and the momentum. The more accurately you measure one value, the more you mess up your measurement of the other one, so the less accurate it can possibly be.
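
In symbols, the standard textbook form of the relations being paraphrased here (ħ is the reduced Planck constant; the energy-time form is the one usually invoked when deriving limits on how fast a physical system can switch between distinguishable states):

    \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
    \qquad\qquad
    \Delta E \,\Delta t \;\ge\; \frac{\hbar}{2}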


This is a lot more fundamental than the idea of phoning someone on a manned mission to Saturn, and speaking your entire message before you hear his first "hello".

By the way, the theoretically perfect quantum computer will use just one atom for the equivalent of each present-day transistor. You'll be able to fit an awful lot of them on the head of a pin. Onwards and upwards!
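
To put a rough number on "an awful lot", a purely illustrative sketch (the 1.5 mm pinhead and ~0.3 nm atomic spacing are assumptions, not figures from the article):

    # Very rough sketch: a single layer of atoms packed onto a pinhead.
    import math

    pinhead_diameter_m = 1.5e-3   # assumed ~1.5 mm pinhead
    atom_spacing_m = 0.3e-9       # assumed ~0.3 nm between atoms

    pinhead_area = math.pi * (pinhead_diameter_m / 2) ** 2
    atoms = pinhead_area / atom_spacing_m ** 2
    print(f"{atoms:.1e}")         # ~2e13 atoms in a single layer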

Clive said...

"Aaronson called it beautiful that such a limit exists."

Beauty is an important part of modern theoretical physics, when you don't really know what you're doing.

For example:

How far is it from London to Holbeach?

(a) 100 miles
(b) 100 kilometres
(c) 100 yards
(d) 100 acres


(c) is suspiciously small: it would only make sense if Holbeach is a pub. But you can rule out (d) right away, because distances are not measured in units of area. If you can't get hold of a map, that's the best you can do.

The current theories of quantum gravity are a lot like this.