
Intel reveals 80-core processor


Swad

Taking a look into the not-too-distant future, Intel is raising the consumer processor bar pretty high with today's display of raw 80-core power. The International Herald Tribune has the scoop:

SAN FRANCISCO: Intel will demonstrate an experimental computer chip Monday with 80 separate processing engines, or cores, that company executives said provided a model for commercial chips that would be used widely in standard desktop, laptop and server computers within five years.

 

While the chip is not compatible with Intel's current chips, the company said it had already begun design work on a commercial version that would essentially have dozens or even hundreds of Intel-compatible microprocessors laid out in a tiled pattern on a single chip.

 

Already, computer networking companies and the makers of PC graphics cards are moving to processor designs that have hundreds of computing engines. For example, Cisco Systems now uses a chip called Metro with 192 cores in its high-end network routers. In November, Nvidia introduced its most powerful graphics processor, the GeForce 8800, which has 128 cores.

 

The Teraflops chip, which consumes just 62 watts at teraflop speeds and is air-cooled, contains an internal data packet router in each processor tile. It can move data between tiles in as little as 1.25 nanoseconds, making it possible to transfer 80 billion bytes a second between the internal cores.
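Taking those quoted figures at face value, here's a quick back-of-the-envelope calculation (mine, not Intel's) of what they imply per core and per watt:

```python
# Rough numbers taken straight from the quoted article: 1 teraflop from
# 80 cores at 62 watts, with 80 billion bytes/second between the cores.
TOTAL_FLOPS = 1e12
CORES = 80
POWER_WATTS = 62
CORE_BANDWIDTH_BPS = 80e9

flops_per_core = TOTAL_FLOPS / CORES                # ~12.5 GFLOPS per tile
flops_per_watt = TOTAL_FLOPS / POWER_WATTS          # ~16 GFLOPS per watt
bytes_per_flop = CORE_BANDWIDTH_BPS / TOTAL_FLOPS   # ~0.08 bytes per flop

print(f"{flops_per_core / 1e9:.1f} GFLOPS per core")
print(f"{flops_per_watt / 1e9:.1f} GFLOPS per watt")
print(f"{bytes_per_flop:.2f} bytes of inter-core bandwidth per flop")
```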

I just want to see what Intel will name the thing. Intel Core 92 Tera?


User Feedback

Recommended Comments



The average Joe isn't exactly computing the number of possible planets around Proxima Centauri using thousands of petabytes of data. The quantum computer is basically only for research facilities. AMD and Intel don't have anything to worry about as far as losing market share in their main consumer base goes.

That's not to say they don't feel pressure, though. They do. But from each other. It's basically a processor "arms race". :P


So first the MHz race, now the core race...

Does Intel never learn?

It's tricky enough to make programs that fully use 4 cores, let alone 80.

I just feel that this is only part of the way to go, not all the way.

AMD has got a much better idea with Fusion.
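For what it's worth, a quick Amdahl's law sketch shows why feeding 80 cores is so much harder than feeding 4 (the 95% parallel figure below is purely illustrative, not from the article):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the parallel fraction of the program and n the number of cores.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for cores in (4, 80):
    print(f"{cores:>2} cores, 95% parallel code: {amdahl_speedup(0.95, cores):.1f}x speedup")
# 4 cores -> ~3.5x, 80 cores -> only ~16x; the serial 5% dominates.
```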


So first the MHz race, now the core race...

Does Intel never learn?

It's tricky enough to make programs that fully use 4 cores, let alone 80.

I just feel that this is only part of the way to go, not all the way.

AMD has got a much better idea with Fusion.

 

Just because 80 cores aren't of use to a typical consumer today, that hardly warrants any denunciation. There's a huge difference between what's good for the consumer today and what's good for the industry.

 

I provided consultation for a CNC lathe design/mod to cut within .000002 or "2 hundredths of a tenth" ... is this necessary? NO! A product's practicality today is beside the point when doing R&D; it's paving the road for the future.

 

It's like the iPod: when everyone starts designing products around your technology, it helps provide a stable market base.

 

just my :stretcher:

Edited by jgrimes80

The Core Wars are a rehash of the Megahertz battle. Though amazingly cool, this is what will happen: you need a new computer with 2-4 cores. Next year, you need a new computer with 4, 8, or 16 cores. The year following, 16, 32, or 64 cores. This will go on and on until they cannot fit any more cores. Then they will want you to buy a quad-processor computer with 256 cores on each processor. Hopefully by then, biocomputers that calculate using plant life will have emerged.

 

By the way, technology moves exponentially, not linearly. In our minds we think technology changes linearly, but it does not work that way.
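A toy projection makes the point (the starting count and the doubling every generation are just my scenario above, not any roadmap):

```python
# Exponential versus linear growth in core counts, purely illustrative.
exponential, linear = 2, 2
for generation in range(8):
    print(f"gen {generation}: doubling -> {exponential:>3} cores, adding 2 -> {linear:>2} cores")
    exponential *= 2   # doubles each generation
    linear += 2        # adds a fixed two cores each generation, for contrast
```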

 

Go Venus Fly Trap CPU Go! :D

 

gt.

 

Biocomputers could be centuries into the future:

http://retoum12.wordpress.com/2007/01/21/h...cramble-an-egg/

http://www.centres.ex.ac.uk/egenis/events/...Amos-Egenis.pdf

Edited by goodtime

I think AMD is more reality-based than Intel, because they have always looked for better performance through instructions and tweaks of their architecture. Also, I agree with you, Paranoid Marvin; I think AMD has something more appealing with Fusion. Again, graphics is the bottleneck to break, so this seems more important than an 80,000-core CPU.

Barcelona also sounds promising; I hope Intel gets it at last and develops something in cooperation with Nvidia.

Zealot


This thing is actually more of a proof-of-concept silicon to demonstrate that the bus these "cores" use to communicate has the bandwidth and low latency needed to support this kind of setup. The cores themselves are fairly meager, but imagine having, say, 10 full x86 cores for processing, 65 specialized cores dedicated to graphics processing, and another 5 cores for processing physics. It looks feasible with the current setup.
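Here's a toy sketch of the kind of heterogeneous tiling imagined above; the 10/65/5 split and the 8x10 grid are the hypothetical numbers from this comment, not anything Intel has announced:

```python
from collections import Counter

# Hypothetical partition of an 80-tile mesh into roles, following the
# 10 x86 / 65 graphics / 5 physics split above. Purely illustrative.
ROLES = ["x86"] * 10 + ["graphics"] * 65 + ["physics"] * 5
GRID_W, GRID_H = 10, 8                # 80 tiles, one on-die packet router per tile
assert len(ROLES) == GRID_W * GRID_H

tiles = {}
for i, role in enumerate(ROLES):
    x, y = i % GRID_W, i // GRID_W    # lay the roles out across the grid
    tiles[(x, y)] = role

print(Counter(tiles.values()))        # Counter({'graphics': 65, 'x86': 10, 'physics': 5})
```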


The average Joe isn't exactly computing the number of possible planets around Proxima Centauri using thousands of petabytes of data. The quantum computer is basically only for research facilities. AMD and Intel don't have anything to worry about as far as losing market share in their main consumer base goes.

That's not to say they don't feel pressure, though. They do. But from each other. It's basically a processor "arms race". :(

 

I read somewhere that the 80 core processor isn't x86 either, so I thought that the quantum processor could be competing with it.


I read somewhere that the 80 core processor isn't x86 either, so I thought that the quantum processor could be competing with it.

 

It can be anything it needs to be. The cores on this particular demo chip are "mini-cores" which aren't fully x86, I think.


I would want to buy one of those, but it seems so expensive; a four-core Xeon already costs $2,000. How much do you guys expect this thing to cost? $40,000 is my estimate.

 

 

They won't bring it to the consumer market at that kind of price. It won't be more than $5,000 at most, if it ever hits the consumer level.


It does not have that many cores; it has one core with 128 unified shader engines.

 

Great... the stuff isn't out yet and already they're starting to mislead the world.

 

As for this Intel chip: if it's based on the ARM technology they nipped from Digital (StrongARM/XScale), then it doesn't necessarily consume much power. And since they said it isn't x86-compatible, which isn't a requirement for embedded or specialised machinery anyway, that doesn't matter either.

 

We ALL know the fiasco of the NetBurst architecture, so this move to chuck cores together to increase computing power is the logical step and the exact opposite of driving clock speeds up.

 

As for the use of the thing: well, Vista is already stuffed with DRM (and probably other useless {censored}), and its ridiculous de-/encoding engine for so-called "premium content" requires a hefty share of computational power. So the not-yet-available multi-core teraflop power will already be wasted simply by playing an HD-DVD on your HD-TV/monitor setup. Which means that all this technological increase will be reduced to nothing by the time the multi-megacorporations have a go at it, while we the consumers won't benefit at all (except for yet another nice window dress-up, perhaps).

 

Naah, I'm not impressed at all.


The plant computer will take over the world one day. Oh {censored}, my wife's roses are attacking me. It's already begun. hehe

 

gt



