Preview of Intel's Dual-Core Extreme Edition

Hemos posted more than 9 years ago | from the kicking-the-tires dept.

Intel 289

ThinSkin writes "Intel let ExtremeTech.com sneak behind the curtain of its anticipated Dual-Core Pentium Extreme Edition processor for a full performance preview with benchmarks. Essentially two Prescott cores on one die, the Extreme Edition 840 processor clocks at 3.2GHz and contains a beefed-up power management system to keep the CPUs running cool during use. Expect Intel's dual-core line to hit the streets sometime this quarter. No word on pricing yet." Update: 04/04 17:26 GMT by T : Timmus points out FiringSquad's preview, too, writing "The benchmark results are mixed, with a few applications taking advantage of the new CPU, and some that don't." And Kez writes in reference to this article to say: "Our article on HEXUS.net, covering the P4 EE in detail, states the price as £650 (that's what we're looking at in the UK anyway, not sure about the U.S.)."

so (-1)

Anonymous Coward | more than 9 years ago | (#12134556)

what

How well does it do... (5, Funny)

kwoo (641864) | more than 9 years ago | (#12134557)

On SlashMark? Namely, how many seconds does it take to compile the Linux kernel? :P

Re:How well does it do... (1)

ackthpt (218170) | more than 9 years ago | (#12134594)

On SlashMark? Namely, how many seconds does it take to compile the Linux kernel? :P

Early benchmarks should be taken with a rather large salt lick. While it does OK, it clearly should do better once the software catches up with it (the minute they discontinue it for the next advance).

I'm sure it'll do a bang-up job with your email and word processing, though.

Re:How well does it do... (5, Funny)

ShaniaTwain (197446) | more than 9 years ago | (#12134884)

how many seconds does it take to compile the Linux kernel?

if you press the 'turbo' button it goes twice as fast.

Re:How well does it do... (1)

pg110404 (836120) | more than 9 years ago | (#12134899)

Namely, how many seconds does it take to compile the Linux kernel?

You mean without overheating and destroying itself?

Re:How well does it do... (1)

PhotoBoy (684898) | more than 9 years ago | (#12134977)

It's only running at 3.2GHz, too; with two cores it must be producing some serious heat if they've had to clock it back so far.

Re:How well does it do... (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#12134947)

It sucks, because Linux sucks. Nuff' said

Re:How well does it do... (5, Funny)

rubycodez (864176) | more than 9 years ago | (#12135001)

with its advanced predictive branching and speculative execution, the processor will have several kernels with the most commonly used options compiled for you 0.25 seconds before you finish typing "make "

fp (0, Offtopic)

Anonymous Coward | more than 9 years ago | (#12134558)

Nice, because I didn't think their current Extreme Edition was expensive enough. What gamer has the budget for these chips?

Like, Extreme, to the, like, totally max! (5, Insightful)

ackthpt (218170) | more than 9 years ago | (#12134563)

I love superlatives like 'Extreme' in a product name. It's so funny to look at, years later. "Hey, remember this old clunker? It was ' EXTREME !'"
"Yeah, by today's standards it's EXTREMELY slow!"
"Only dual core, ha ha ha ha hah!"

I guess they can't very well call it 840i, as they've already used that for a chipset, but maybe Intel should stick to names ending with -ium and -on instead of something which timelessly proclaims some chunk of doped silicon as superior.

Next up from Intel, the Ultra-Spifftronic-Wowee-Zappo Triple Core, with extra schmaltz!

Re:Like, Extreme, to the, like, totally max! (5, Funny)

mikael (484) | more than 9 years ago | (#12134623)

Somehow, Extremium and Extremon don't seem to have the same rhyme.

Next up from Intel, the Ultra-Spifftronic-Wowee-Zappo Triple Core, with extra schmaltz!

The local ice-cream van used to sell those during the summer holidays - you had to eat them immediately, otherwise they would melt before you got inside.

Re:Like, Extreme, to the, like, totally max! (5, Insightful)

Stevyn (691306) | more than 9 years ago | (#12134717)

Then they'll call it "ExtremeX!"

I feel bad for the engineers who come up with these designs which are then crapped on by their marketing department.

Re:Like, Extreme, to the, like, totally max! (3, Interesting)

ackthpt (218170) | more than 9 years ago | (#12134733)

Then they'll call it "ExtremeX!" I feel bad for the engineers who come up with these designs which are then crapped on by their marketing department.

Which probably has a lot to do with the success of the Dilbert strip.

This morning, on the way in to work, the BBC World Service had another feature on management (flavor-of-the-day) trends. I suppose marketing does the same thing, but nobody has actually put their finger on it, yet.

Re:Like, Extreme, to the, like, totally max! (3, Insightful)

utlemming (654269) | more than 9 years ago | (#12134915)

Worse yet, how many of the people who could truly benefit from the power the Extreme Edition offers don't buy it because of the stigma of the name? I was recently told a story about a guy who had a job offer but refused it because he didn't fit the culture of the company: apparently every workstation had the latest, greatest gadgets, from fancy faddish mice to modded computer cases with flashing neon lights. While those things looked cool, he didn't feel he would fit in with a company that spent money on the cool stuff as opposed to spending money on development. I have to say that I feel the same way. When I am in the market for computing power, I am not interested in the faddish stuff -- I am interested in the raw numbers and whether the computer can do what I need it to do. With names like "Extreme" you're marketing to the gamers and not necessarily to the programming professional. The marketing departments should at least market a similar chip with similar abilities as a "Developer Edition." But I guess the people who would be interested in them are the guys buying the Xeons and the Opterons.

I don't feel bad at all (0, Troll)

Anonymous Coward | more than 9 years ago | (#12134941)

I feel bad for the engineers who come up with these designs which are then crapped on by their marketting department.

You do realize that starting pay for experienced R&D designers is about $120,000 a year. I don't feel bad at all. They could call them "ILoveGayCocK" chips and I still wouldn't feel bad for the guys who get to make huge amounts of money doing what they love.

Re:Like, Extreme, to the, like, totally max! (0)

Anonymous Coward | more than 9 years ago | (#12134855)

Oh what I'd give for a turtle/rabbit switch!

Re:Like, Extreme, to the, like, totally max! (2, Funny)

gfody (514448) | more than 9 years ago | (#12134856)

Well, for the most part they use "EE" in place of Extreme Edition... maybe later on they can give it a better definition, like Extra Expensive.

Re:Like, Extreme, to the, like, totally max! (1)

morgan_greywolf (835522) | more than 9 years ago | (#12134953)

The 'Pentium Pro' comes to mind. I remember thinking, "What will they call the next special Pentium chip? The Pentium Pro GOLD?!" Heh. At 200 MHz, the Pentium Pro isn't so 'Pro' anymore. Now they STILL haven't learned their lesson, calling their next special CPU the 'EXTREME!' That's Intel marketing for ya... EXTREMEly dull.

Anyone for Bitchin' Deluxe? (1)

Cumstien (637803) | more than 9 years ago | (#12134994)

Reminds me of the turbo button on 386 machines. It was sort of like a "don't suck" option you could turn on.

Deluxe is probably my favorite word for greatness that invariably means crappiness. Extreme is so late 1990s.

Sweet! (4, Funny)

kmartshopper (836454) | more than 9 years ago | (#12134567)

... something else we can use to make breakfast with!

Re:Sweet! (1)

ackthpt (218170) | more than 9 years ago | (#12134617)

... something else we can use to make breakfast with!

What? I didn't see the part about frying eggs or bacon on it. Does it emit heat like the first P4s did (they love the 1 lb. heatsink, remember 'em?)

Re:Sweet! (0)

Anonymous Coward | more than 9 years ago | (#12135014)

ROFL!

Holy Cow... (4, Funny)

Robotron23 (832528) | more than 9 years ago | (#12134573)

We recently returned from a road trip to discover a very large box waiting for us.

If the processor's that big, how the heck will I fit it on my motherboard?!

Re:Holy Cow... (5, Funny)

pla (258480) | more than 9 years ago | (#12134688)

If the processor's that big, how the heck will I fit it on my motherboard?!

Well, the processor itself only takes a few square inches - The rest of the box held the liquid nitrogen cooling system needed to keep the thing slightly cooler than the surface of the sun.

Re:Holy Cow... (1)

Mastadex (576985) | more than 9 years ago | (#12134720)

We had the fattest guy in the company jump on it. After that, it fit like a glove!

Re:Holy Cow... (2, Funny)

ackthpt (218170) | more than 9 years ago | (#12134767)

If the processor's that big, how the heck will I fit it on my motherboard?!

That was the heatsink. The processor and motherboard were in a small brown box being crushed beneath it (as dictated by Galactic Shipping Directive 4.07a(7ii)).

Re:Holy Cow... (2, Funny)

pg110404 (836120) | more than 9 years ago | (#12134863)

If the processor's that big, how the heck will I fit it on my motherboard?!

Simple. You don't fit the processor on the motherboard, you fit the motherboard on the processor.

Just don't forget to reinforce the desk.

Cool?!? (5, Insightful)

Cruithne (658153) | more than 9 years ago | (#12134576)

Running cool during use? It seems to me they'll need the power management to keep it from melting itself, judging from the heat output of just one of those beasts...

Re:Cool?!? (2, Interesting)

pg110404 (836120) | more than 9 years ago | (#12134681)

It seems to me they'll need the power management to keep it from melting itself

Don't forget the 50 Gigawatt power supply!

The processor alone consumes (last I heard) about 100 watts, and if it's essentially two processors in one, it will require a really, really good power supply. That means to use this proc, you'll instantly need 100 extra watts out of your power supply.

If they have to have power management to keep it from meltdown, just how much more computing CAN you get out of it anyway? To me the second core would be running at about 20% duty cycle to keep it from catching on fire.

On the plus side, they could always mod the case to throw off that heat like a space heater. Coffee warmer in the summer, foot warmer in the winter.

How about (4, Informative)

Adult film producer (866485) | more than 9 years ago | (#12134583)

we just call it what it is: a two-die module. This is not true dual core but two cores slapped into one chip package... Sure, you'll only be using one socket, but that's about the only difference. Architecturally, you will need to look at AMD's offerings for true dual-core.

Re:How about (2)

ackthpt (218170) | more than 9 years ago | (#12134660)

we just call it what it is: a two-die module. This is not true dual core but two cores slapped into one chip package... Sure, you'll only be using one socket, but that's about the only difference. Architecturally, you will need to look at AMD's offerings for true dual-core.

Shush! You're taking the glimmer off the chrome, just as Intel, in a slap-dash manner, tries to recapture some sort of legitimacy after getting spanked by AMD, right after totally dissing 64 bits.

You hear a tinny voice say, "32 bits should be enough for anyone."

Re:How about (1)

gfody (514448) | more than 9 years ago | (#12134920)

Intel got spanked by AMD's on-chip memory controller. The CPUs support 64-bit, but they handed Intel its ass in 32-bit mode thanks to the lower latency.

It was quite a strategy: showboating x64 while the memory controller silently kicked ass.

Re:How about (2, Funny)

MankyD (567984) | more than 9 years ago | (#12134675)

This is not true dual core but two cores slapped into one chip package...

Care to elaborate on the difference?

Re:How about (1)

tomstdenis (446163) | more than 9 years ago | (#12134750)

In the AMD world the cpus talk across the HT at like "really fast" and then they talk to the northbridge.

In the Intel world they all share the northbridge.

Now think about "cache coherency"...

Tom

Re:How about (4, Informative)

hawkbug (94280) | more than 9 years ago | (#12134838)

I don't think so - AMD boards don't have a northbridge... the memory controller is on the CPU itself.

http://www.anandtech.com/memory/showdoc.aspx?i=2006 [anandtech.com]

See the last paragraph

Re:How about (2, Interesting)

Dink Paisy (823325) | more than 9 years ago | (#12134893)

So what you are saying is that AMD CPUs have more overhead due to cache coherency traffic on the point-to-point CPU links, whereas Intel CPUs don't generate cache coherency traffic except on invalid misses, since they can snoop the shared memory bus? And perhaps you could clear up for me what the northbridge for a newer AMD CPU does. I thought the main function of the northbridge was the memory controller, which is included on die on newer AMD CPUs.

Re:How about (4, Interesting)

tomstdenis (446163) | more than 9 years ago | (#12134926)

No, you got it backwards. The AMD CPUs [as I understand it] have DEDICATED pipes to the other CPUs. They're 8/16 bits wide and run at [I forget, but I think it goes up to 1.6GHz].

So CPU 2 and CPU 3 could talk and not get in the way of CPU 1 and the memory bus. Yes, there is a "northbridge" for memory, but there still is a memory bus. The Intel CPUs have no dedicated bus and ALL talk over the same bus.

Not having either combo of boxes I can't tell you which is faster, but usually AMD is much faster than Intel just on the pure "not being a GHz pusher".

Tom
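
The shared-bus snooping being argued about in this subthread can be sketched as a toy model (class names are mine, and it uses simple write-through invalidation rather than a real MESI protocol; purely illustrative):

```python
class Bus:
    """Shared front-side bus plus main memory (toy model)."""
    def __init__(self):
        self.cores = []
        self.memory = {}
        self.traffic = 0  # count of bus transactions

class Core:
    """A core with a private cache that snoops the shared bus."""
    def __init__(self, bus):
        self.cache = {}
        self.bus = bus
        bus.cores.append(self)

    def read(self, addr):
        if addr in self.cache:           # cache hit: no bus traffic
            return self.cache[addr]
        self.bus.traffic += 1            # miss: fetch over the shared bus
        value = self.bus.memory.get(addr, 0)
        self.cache[addr] = value
        return value

    def write(self, addr, value):
        self.bus.traffic += 1            # write-through to memory
        self.cache[addr] = value
        self.bus.memory[addr] = value
        for other in self.bus.cores:     # snoop: invalidate stale copies
            if other is not self:
                other.cache.pop(addr, None)

bus = Bus()
a, b = Core(bus), Core(bus)
bus.memory[0x10] = 1
a.read(0x10)          # a caches the line
b.write(0x10, 2)      # b's write invalidates a's copy
print(a.read(0x10), bus.traffic)  # a must re-fetch: prints "2 3"
```

Every write is a transaction all cores must snoop on the one shared medium, which is the contention the comments above attribute to Intel's front-side-bus design; dedicated point-to-point links move that coherency chatter off the memory bus.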

Re:How about (1)

hawkbug (94280) | more than 9 years ago | (#12134970)

No, there is not a northbridge for the K8 core. Everything related to memory is on the chip itself.

Re:How about (-1)

tehcrazybob (850194) | more than 9 years ago | (#12134794)

True dual-core systems have two separate processors, in two separate sockets, cooled independently of each other. Dual systems are nice for number-crunching and multitasking because tasks can be assigned to whichever core is currently doing less work.

Intel's new dual-core places two cores on one chip. It's convenient because it offers some of the advantages of true dual systems, but I am sure they also had to make concessions because of heat and more limited connections (since both cores are connected through the same socket). I also doubt there will be much of a price advantage over a typical dual system, because any technology this new and unique will be extremely expensive.

Re:How about (0)

Anonymous Coward | more than 9 years ago | (#12134828)

You are confused. Please be quiet.

Re:How about (3, Informative)

Noehre (16438) | more than 9 years ago | (#12134845)

> True dual-core systems

Nobody has EVER used the term 'dual-core' to describe dual-processor SMP. Dual-core has always been in reference to two cores on one chip.

Re:How about (-1, Flamebait)

tehcrazybob (850194) | more than 9 years ago | (#12134894)

Nobody has EVER used the term 'dual-core' to describe dual-processor SMP.

Pretty sure I just did.

A processor is a processing core mounted to a chunk of silicon which contains all the pins to connect the core to the motherboard. Given this, a computer with two processors also contains two cores. Since the word 'dual' has long been accepted to mean two of something, dual-core is a perfectly accurate description of a computer with two cores. It's totally irrelevant whether they are mounted on one silicon wafer or two.

Re:How about (2, Funny)

MankyD (567984) | more than 9 years ago | (#12134956)

Now I'm confused. From your first post [slashdot.org] :
True dual-core systems have two separate processors, in two separate sockets, cooled independently of each other.

and now [slashdot.org] :
It's totally irrelevant whether they are mounted on one silicon wafer or two.

I was under the impression that dual-core was two processors (two cores) mounted on one chip - i.e. one chip with two cores. Whereas what you referred to in your first post was called dual-processor, albeit also dual core.

Re:How about (0)

Anonymous Coward | more than 9 years ago | (#12134967)

That doesn't make it a "TRUE dual core" system as you called it. Analyzing the words "dual core" based on the definitions of "dual" and "core" ignores the fact that "dual core" has its own definition.

Re:How about (1)

Noehre (16438) | more than 9 years ago | (#12134971)

> Pretty sure I just did.

And I'm pretty sure you aren't a person of interest concerning processor design. This would be akin to me saying that HTTP really means hyper terminal turbo popper. It might work literally, but everybody knows I'm full of shit.

The literal definition of the phrase is different from its widely-accepted meaning. Is that so hard to understand?

Instead of having to repeatedly say "two processor cores on a single processor die," someone coined the term 'dual-core.'

The meaning has always been this way. Just because you decide that it means something else doesn't make what you say correct.

Re:How about (1)

gfody (514448) | more than 9 years ago | (#12134998)

dual core and smp are like apples and oranges. two cpus are two cpus.. the OS sees them and uses them as it does other resources. dual core the OS does not see, the cpu employs the two cores to execute more pipelines in parallel. a car with two engines is not the same as two cars.

Re:How about (0, Troll)

tehcrazybob (850194) | more than 9 years ago | (#12135050)

Look, I'm very sorry I hurt all your little feelings. I am fully aware of the difference between the two technologies, and I apologize for my mistake in my first post and my sarcasm in the second.

Here's my first post, rewritten properly:
True dual-processor systems have two separate processors, in two separate sockets, cooled independently of each other. Dual systems are nice for number-crunching and multitasking because tasks can be assigned to whichever core is currently doing less work.

Intel's new dual-core places two cores on one chip. It's convenient because it offers some of the advantages of true dual systems, but I am sure they also had to make concessions because of heat and more limited connections (since both cores are connected through the same socket). I also doubt there will be much of a price advantage over a typical dual system, because any technology this new and unique will be extremely expensive.


I would also like to say that I don't know anything about AMD's offering of dual-core, so I can't comment on why their way is better. I'm sure it is, because AMD's way is always better, but I don't actually have proof of that.

Re:How about (2, Funny)

mobiux (118006) | more than 9 years ago | (#12134683)

But then you would need to admit that AMD's technology is technically superior.

And I doubt if intel marketing would appreciate that very much.

He would then find himself cut off and unable to make these "preview" articles.

Re:How about (3, Funny)

Jeff DeMaagd (2015) | more than 9 years ago | (#12134819)

A die is a term for a discrete piece of silicon. My understanding is that both cores are on the same piece of silicon, even if they don't share anything other than power and FSB connections. I would say that it is a single die module.

Re:How about (2, Interesting)

jskelly (151002) | more than 9 years ago | (#12134966)

Isn't there also a dual-core PowerPC/G5 in the works? I think it hasn't been announced officially, but it seems to have been accidentally confirmed by IBM [theregister.co.uk] and by Apple [theregister.co.uk] as well.

There is also a new dual-core error correction (5, Funny)

Anonymous Coward | more than 9 years ago | (#12134584)

If one of the cores generates a floating-point error, the other core can be used to correct the problem by adding both errors together to derive a slightly larger error.

Ketchup (2, Interesting)

drivinghighway61 (812488) | more than 9 years ago | (#12134586)

Intel is just playing catch-up now to AMD. With AMD's 64-bit architecture being chosen by the market over Intel's shoddy architecture, Intel is ahead only in name-recognition. As the article says, AMD has been working on their dual-core offering for a year longer than Intel. AMD is a year ahead in development. Their offering is likely to be much more robust than Intel's with that extra year.

But, who knows? Intel seems to be shipping first. And we all know, Real Artists Ship.

Re:Ketchup (1)

chez69 (135760) | more than 9 years ago | (#12134627)

yeah, look at all those dual core chips AMD is selling!

Re:Ketchup (2, Informative)

hawkbug (94280) | more than 9 years ago | (#12134775)

You don't seem to understand... Intel isn't selling dual core chips either - they are selling chips with two normal P4 dies on them, which are now forced to share I/O bandwidth from a single socket. These dies are also very underclocked (3.2GHz) compared to the standard P4, which comes in at around 3.8GHz now. Another tidbit of info for you - the new dual core P4s won't be compatible with a majority of Intel boards on the market... not even BIOS updates can correct a lot of the existing boards out there; new chipsets will be required on new boards.

Now let's talk about AMD's offering... First off, it's true dual core - basically a single die with two cores on it, hence the name dual core. The two cores use HyperTransport to communicate with various system devices. These chips won't be much slower, if not faster even, than the current single core chips on the market.

Now for the best part - anybody with an existing Socket 939 AMD based motherboard will be able to use one. Worst case, you'll have to download a BIOS update to enable it, but it will work. AMD designed the K8 core to be dual ready out of the box, so this whole thing about them having an extra year isn't exactly true - they've had much longer than that.

Re:Ketchup (1, Interesting)

Anonymous Coward | more than 9 years ago | (#12134866)

"this whole thing about them having an extra year isn't exactly true - they've had much longer than that."

Too true. I read an article on www.tomshardware.com the other day comparing Intel and AMD's dual core approaches and it said that AMD had always designed the Athlon to be dual core since 1999...they just never put the second core on yet.

Re:Ketchup (1)

chez69 (135760) | more than 9 years ago | (#12134878)


the original poster said

> AMD has been working on their dual-core offering for a year longer than Intel. AMD is a year ahead in development.

intel is shipping a product, AMD is not.

Most likely the Intel one was rushed and may suck (I'm not really that much of a processor fanboy). To insist that they are following when they were the ones that shipped first is kind of retarded.

Re:Ketchup (2, Insightful)

hawkbug (94280) | more than 9 years ago | (#12134931)

Sigh... I'll say it again: Intel is NOT shipping true dual core chips. They slapped 2 dies onto one package. And if you understand manufacturing, it's much more expensive to do this, and Intel would not do this in large volume without charging a massive amount of cash for each chip.

And by the way, when you say shipping, can you show where you can currently purchase one of these chips? I didn't think so. It's called a paper launch, and Intel, Nvidia, ATI, and AMD are all notorious for using them. Intel might make a few of these chips and provide them to Dell for the high end gaming segment, and Dell might sell 100 of these machines for PR. When Intel can put two cores on a single die, and can actually ship them, and people like us can buy them from places like Newegg.com, then you can claim Intel is shipping dual core chips.

I'm not a processor fanboy either - I'm also telling you AMD is not shipping chips either, and when they do, it won't count until we can actually purchase and use them. What I am saying is that AMD will be first out of the door to ship true dual core chips. But you know what? It doesn't matter who is first - it matters who makes the best chip for the least amount of cash. Then we'll see who succeeds and who doesn't. Paper launches don't count.

Re:Ketchup (0)

Anonymous Coward | more than 9 years ago | (#12135061)

Except that Intel is doing OK, whereas AMD has been losing money on its desktop CPUs for years.

AMD's flash business was the only thing keeping them afloat, and now that is flagging.

Of the two, it is FAR more likely that AMD will go out of the desktop CPU business before Intel does.

The quarterly statements of the two companies involved paint a much clearer picture of how they are doing rather than looking at future product announcements.

Ketchup on their face (3, Insightful)

Blitzenn (554788) | more than 9 years ago | (#12134640)

I think Intel's decision to leave out extensions developed by AMD is going to kill the processor fairly quickly. Granted, they bought the rights to them from AMD, but there must be some royalty-type deal here, because Intel is only including a handful of them. That will make their processor increasingly incompatible with the already accepted AMD architecture. Why is Intel so grudging to admit they are behind? They are going to kill themselves with that attitude. A couple more processor iterations and failures like this, and I expect Intel to make moves to get out of the desktop processor market altogether.

Re:Ketchup on their face (0)

Anonymous Coward | more than 9 years ago | (#12134752)

"A couple more processor iterations and failures like this, and I expect Intel to make moves to get out of the desktop processor market altogether."

God willing! Then we will no longer have to suffer from Dell's shitty computers.

Dell Computers (1)

Blitzenn (554788) | more than 9 years ago | (#12134900)

Some people might see Intel leaving the processor market as a huge jump in conclusions. It's not, really. The desktop processor market has not been among Intel's more profitable centers as of late. Even with their still huge marketshare numbers, they aren't making much money for their efforts in that arena. The Itanium represented a huge loss to them in terms of the amount of R&D, marketing and manufacturing that went into the product. Don't get me wrong, the Itanium is a fabulous processor; it's just that nobody wants it. It actually leaps a generation of processors, and people are not willing to throw out their proven strategies, software and hardware to go in that direction. It's not that they lost their touch in making a great processor, it's that they lost touch with what the market is demanding. It's terribly hard to sell a product no one wants. That's where they really are.

As far as Dell goes, I think they are great for their niche. I would never own one, but I would recommend them to my family members who need the great Dell support when they do stupid things and bust the machine. That's Dell's only strength in my eyes. I much prefer the AMD chips, simply for their price points, if nothing else. That's going to be the new problem for Intel if they don't wake up soon. They could always get away with charging more for their processors than an equivalent AMD because people perceived that they were simply better. With that perception failing, that pricing structure will either have to descend to or below AMD's, or they will see greatly decreased market share over the next 12 to 24 months.

Re:Ketchup on their face (1)

Holi (250190) | more than 9 years ago | (#12134873)

Actually, there is no royalty deal; AMD and Intel have cross-licensing agreements stemming from a lawsuit several years back.

Dear Intel, (4, Funny)

Triumph The Insult C (586706) | more than 9 years ago | (#12134605)

I think it's great that you are developing new products.

However, because of your poor form of not making documentation or firmware freely available, I will instead be sending my personal dollars, and (significantly larger) work budget, to AMD.

Meanwhile in the real world. (0)

Anonymous Coward | more than 9 years ago | (#12134706)

However, because of your poor form of not making documentation or firmware freely available, I will instead be sending my personal dollars, and (significantly larger) work budget, to AMD.

Dear Intel,

I think it's great that you are developing new products.

However, because you are lagging behind we are forced to wait for your next iteration of affordable business machine processors. Please alert us when they become available as we refuse to spend any funds on AMD processors as they do not have the name recognition you possess.

We are patiently waiting to replace our 5000+ machines on a 3 year time scale.

Sincerely,

IT purchasing for State of XX Dept of Labor

Extreme edition (4, Insightful)

thundercatslair (809424) | more than 9 years ago | (#12134613)

Why do intel marketers think that if they name it "extreme edition" it will sell more?

Re:Extreme edition (1)

MynockGuano (164259) | more than 9 years ago | (#12134642)

Tragically, because it probably will.

Re:Extreme edition (1)

dartboard (23261) | more than 9 years ago | (#12134648)

Because it will.

Re:Extreme edition (5, Funny)

drivinghighway61 (812488) | more than 9 years ago | (#12134650)

Extreme editions always sell more. Just look at the adult entertainment industry. Which would you rather buy?

Double Anal Penetration
EXTREME Double Anal Penetration

Chocolate Asian Anal Gangbangs
EXTREME Chocolate Asian Anal Gangbangs

American Heroes Bukkake
EXTREME American Heroes Bukkake

I think we can all agree that Intel is on the right track.

Re:Extreme edition (0)

Anonymous Coward | more than 9 years ago | (#12134747)

EXTREME American Heroes Bukkake

Is that the one where our boys in Iraq express their death fetish by masturbating over the corpses of their fallen comrades?

Re:Extreme edition (1)

hamburger lady (218108) | more than 9 years ago | (#12134880)

throw in some 'extreme tentacles' and you've got a sale!

Re:Extreme edition (1)

pg110404 (836120) | more than 9 years ago | (#12134984)

I think we can all agree that Intel is on the right track.

Does that mean I can surf for porn at extreme speeds?

I'd also need a dedicated OC3 directly to the internet backbone for that.

Extreme edition could also mean extreme smoke generating just before it stops working altogether edition. Spin doctoring at its finest.

Would I need the "Pro" version of XP? (5, Interesting)

Rude Turnip (49495) | more than 9 years ago | (#12134629)

If I wanted to build a Windows system for gaming, would I have to buy Windows XP Pro for multiprocessor support...or is this dual core configuration invisible to the OS, meaning I could get away with XP Home for $100 less.

Re:Would I need the "Pro" version of XP? (5, Informative)

DaHat (247651) | more than 9 years ago | (#12134766)

No, Microsoft has said several times that hyperthreaded CPUs, along with multi-core ones, will only be considered a single unit by the OS. So with XP Home and a dual core chip you are fine, just as XP Pro users are with a pair of dual core chips.
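
The socket-versus-logical-CPU distinction this licensing rule hinges on can be sketched on Linux by parsing /proc/cpuinfo (a minimal sketch; the function name is mine, and the field names are the ones x86 Linux kernels emit):

```python
def count_sockets_and_logical(cpuinfo_text):
    """Return (physical packages, logical processors) from /proc/cpuinfo text."""
    sockets = set()
    logical = 0
    for line in cpuinfo_text.splitlines():
        if line.startswith("processor"):      # one stanza per logical CPU
            logical += 1
        elif line.startswith("physical id"):  # same id => same socket
            sockets.add(line.split(":")[1].strip())
    return len(sockets), logical

# A dual-core (or hyperthreaded) chip: two logical CPUs sharing one
# physical id, i.e. the "single unit" that per-socket licensing counts.
sample = "processor\t: 0\nphysical id\t: 0\nprocessor\t: 1\nphysical id\t: 0\n"
print(count_sockets_and_logical(sample))  # (1, 2)
```

So XP Home's one-processor limit is a one-socket limit: both cores (and any hyperthreads) behind that single physical id remain usable.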

Re:Would I need the "Pro" version of XP? (3, Interesting)

Esion Modnar (632431) | more than 9 years ago | (#12134772)

You can get OEM versions of XP Pro for as little as $125. I'd buy Pro over Home, even if I had a single CPU. Too many times I have gone to do something on a Home box (which I was able to do all day long on Pro), only to find out, "What do you mean I can't do that?!?!"

It's just irritating.

Re:Would I need the "Pro" version of XP? (1)

MankyD (567984) | more than 9 years ago | (#12134788)

I could be wrong, but I don't believe there is a version of Windows compiled for dual processors.

Instead, you will see performance gains when running multiple applications. Even with one app, it allows multiple threads (some for the OS and some for the app) to run simultaneously, giving you a boost. Running more than one app, or running apps written with multi-threading in mind, will show a performance boost no matter which Windows version you are running.
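The parent's point can be sketched roughly like this; a minimal modern illustration (not something you'd have run in 2005), where `busy_sum` and `N` are made-up names. Worker processes are used rather than threads, since Python threads can't parallelize CPU-bound work; the idea is the same: a dual-core chip can run the two workers at once, a single core runs them back to back.

```python
import multiprocessing as mp
import time

def busy_sum(n):
    # CPU-bound work: sum of squares 0..n-1
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    N = 2_000_000

    # Serial: both chunks run one after the other, so one core does everything
    t0 = time.perf_counter()
    serial = [busy_sum(N), busy_sum(N)]
    t_serial = time.perf_counter() - t0

    # Parallel: two worker processes, which a dual-core CPU can run simultaneously
    t0 = time.perf_counter()
    with mp.Pool(processes=2) as pool:
        parallel = pool.map(busy_sum, [N, N])
    t_parallel = time.perf_counter() - t0

    assert serial == parallel  # same answers, different wall-clock time
    print(f"serial: {t_serial:.2f}s, parallel: {t_parallel:.2f}s")
```

On a single-core machine the two timings come out roughly equal; on a dual core the parallel run is close to twice as fast, which is the whole pitch.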

Re:Would I need the "Pro" version of XP? (0)

Anonymous Coward | more than 9 years ago | (#12134830)

Windows XP Pro has support for multiple processors. XP Home does not.

Re:Would I need the "Pro" version of XP? (0)

Anonymous Coward | more than 9 years ago | (#12134954)

That is possibly correct, but the applications you run determine the use of more than one CPU. Games, for example, are generally designed for a single CPU. Business or networking products, on the other hand, are better adapted to dual and quad CPU systems, and more commonly to servers.

Re:Would I need the "Pro" version of XP? (1)

Rude Turnip (49495) | more than 9 years ago | (#12134955)

"I could be wrong, but I don't believe there is a version of Windows compiled for dual processors."

Microsoft put a limit on the number of processors that would be supported by XP Home: 1. If you want dual processor support (as well as other, advanced networking options) you need to shell out extra $ for XP Pro. Hence, the purpose of my parent post.

Re:Would I need the "Pro" version of XP? (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#12134804)

Dude, just get a ripped off Corporate copy of XP Pro for free like everyone else on /.

Re:Would I need the "Pro" version of XP? (0)

Anonymous Coward | more than 9 years ago | (#12135053)

You could save even more if you used Linux or *BSD. Then you could use as many processors as you want and if you're able to support it (or a friend is able) it's very low cost.

Dual P4 = Back to the old days? (0)

imstanny (722685) | more than 9 years ago | (#12134633)

I think the value of a dual-core processor is not shown well in these benchmarks since the dual cores are each at 1.6GHz. Since the P4 is renowned to be a bad performer at lower clock speeds, I think it's best we wait for AMD's dual-core solution to get a better idea of dual-core performance.

Re:Dual P4 = Back to the old days? (3, Informative)

unts (754160) | more than 9 years ago | (#12134869)

Each core runs at 3.2GHz, RTFA ;), or read this one:

http://www.hexus.net/content/reviews/review.php?dXJsX3Jldmlld19JRD0xMDg1 [hexus.net]

Going to be about £650 in the UK according to HEXUS.

Re:Dual P4 = Back to the old days? (1)

unts (754160) | more than 9 years ago | (#12135057)

Further to this comment: if, for example, you run a game that doesn't support multithreading, you're essentially getting single-core 3.2GHz Prescott performance from a very expensive CPU.

How much it'll cost? ha! (2, Funny)

LordKazan (558383) | more than 9 years ago | (#12134647)

People are actually asking how much it's going to cost?

The Answer is simple

An arm, a leg and your left testicle* - it's Intel after all

--------
*or ovary if you're a woman

Re:How much it'll cost? ha! (1)

macaulay805 (823467) | more than 9 years ago | (#12134713)

An arm, a leg and your left testicle* - it's Intel afterall

And your firstborn son. After all, that's how they recruit engineers!

Gamers won't be interested (5, Insightful)

LiENUS (207736) | more than 9 years ago | (#12134663)

It looks like gamers won't be all that interested in this offering. Even once games support mutli-threading, this won't end up boosting their framerate much. Instead it will raise the minimum framerate and give them smoother gameplay. While that is a great improvement, unfortunately most gamers seem interested only in their max fps and not the minimum. For workstations, however, this will be great: lower cost than dual processors means graphics design companies and advertising agencies can get their jobs done quicker and more efficiently.

Re:Gamers won't be interested (0)

Anonymous Coward | more than 9 years ago | (#12134937)

'Tis a pity.

Gamers should care more about their minimum (usually the time it drops the most is the time you most need it: a heavy firefight, etc.)

It seems they care mostly about "OMFG!! 300FPS!!!! l33t!" (while looking at a wall in 640x480 with everything on low)

Re:Gamers won't be interested (1)

LiENUS (207736) | more than 9 years ago | (#12134974)

Well, right now gaming seems to be more of a new thing. I think we are suffering from "fast and furious" syndrome, wherein people think: if I do this and this and this, then I get x peak performance, never mind that y minimum performance is worse than stock. Over time I think a second subculture will appear that focuses on the entire range.

Re:Gamers won't be interested (1)

johnw (3725) | more than 9 years ago | (#12135020)

Even once games support mutli-threading
So that's one thread for Mutli and one for Dick Dastardly is it?

Re:Gamers won't be interested (0)

Anonymous Coward | more than 9 years ago | (#12135075)

For my graphics workstation I'd rather have quad Opterons and a hefty gfx card :P

Uh, right.... (3, Insightful)

imroy (755) | more than 9 years ago | (#12134695)

...contains a beefed-up power management system to keep the CPUs running cool during use

So in other words... unless you have extreme cooling, this thing will never run at full speed for long. When it does, it will quickly heat up, and this power management will throttle the clock speed and core voltage. Apps may start up a little faster, but long-running consumers of CPU cycles (e.g. media encoding, some games) won't see much improvement. But I'm sure lots of clueless consumers will go for this new eXtreme CPU. Can't wait to see what bullshit analogy Intel will come up with for the TV ads...

MOD PARENT DOWN! (0)

Anonymous Coward | more than 9 years ago | (#12135006)

Insightful?

How?

This same lame argument was made before with single-core SpeedStep processors and was proven to be completely wrong!

This is just more baseless, idiotic bashing of a product the poster has zero experience with or knowledge about.

Only on /. could a post like this be "insightful".

Buuuuut (2, Interesting)

skomes (868255) | more than 9 years ago | (#12134809)

Why do we have dual cores? Everybody's admitted they are going to be prohibitively expensive, so is it just for show? Let's see some AFFORDABLE dual cores before we start heralding them as the future of processors.

Dual core ? (4, Funny)

Pop69 (700500) | more than 9 years ago | (#12134820)

Does that mean I'll be able to fry two eggs at once ?

Re:Dual core ? (1)

pg110404 (836120) | more than 9 years ago | (#12135056)

Does that mean I'll be able to fry two eggs at once ?

Or you'll be able to burn a single egg twice as fast.

Just have to ask... (5, Funny)

IdJit (78604) | more than 9 years ago | (#12134896)

Does it have a hemi?

This is so exciting! (5, Funny)

alta (1263) | more than 9 years ago | (#12134924)

Now the spyware on all my users' machines will have a processor all to itself. That means the users will have the second processor to run Word, Excel, et al...

That means they'll leave me alone and quit bitching about slow machines for a while! Woohoo! Oh, and it will help with that winword.exe that keeps crashing and staying backgrounded. Woot!

(Yes, I know the spyware will take over both proc's. Let me dream)

Intel(VHS) vs AMD(Beta) (0)

Anonymous Coward | more than 9 years ago | (#12135000)

two of these (4 CPUs)
compared to two AMD (4 CPUs)
and then 8

one's gonna scale MUCH better than the other.

but who cares for desktop.

Intel is playing the VHS side of the VHS-Beta game
and I fear ubiquity and hype will win out over the better product.

1)Intel EMT-64T vs Athlon 64
2)Intel DC (which BTW is not 2 x EMT-64) vs Opteron DC which is 2 x 64bit!

But the bulk of the market doesn't give a shit as long as it shines, uncompresses porn, plays video games, and runs a web browser

Long term solution? (2, Interesting)

Jugalator (259273) | more than 9 years ago | (#12135041)

Excuse me if this sounds unusually stupid for Slashdot, but in other words they will release 3.2 GHz dual-core models initially? Won't they then have developed a new technology only to hit the problematic ~4 GHz clock-frequency wall almost immediately? I was expecting something like two 1.6 GHz cores, possibly with some tricks to achieve speeds similar to a current 3.2 GHz P4... Am I missing something here, or is this just an unusually short-term solution?

Apple n Oranges (2, Interesting)

zioncity (862007) | more than 9 years ago | (#12135076)

I wonder how it will compare to a dual-core G5 chip from Apple... whenever they get it out, which, with all this dual-core news from Intel, I would think will be soon.

WWDC perhaps?