
AMD Overclocks New Phenom II X4 To 7 GHz

CWmike writes "Advanced Micro Devices on Thursday introduced the latest member of its Phenom II X4 family of high-performance quad-core CPUs, which the No. 2 chip maker said it had run as fast as 7 GHz in extreme overclocking tests. Out of the box, the new X4 955 Black Edition, which is aimed at gamers and hobbyists, runs at 3.2 GHz, giving it similar performance to Intel's fastest desktop chips at lower cost, AMD says. The company was able to more than double the CPU's speed during its tests using extreme cooling technology that is not safe to use at home, said Brent Barry, an AMD product manager. The Web site Ripping.org notes that hobbyists with early access to the X4 955 chip have been able to clock it at up to 6.7 GHz. AMD said the chip was safe with fan cooling at up to 3.8 GHz."
  • by captnbmoore ( 911895 ) on Thursday April 23, 2009 @04:09PM (#27693351)
    So what am I supposed to do with the tank of liquid nitrogen I have in my back yard?
    • Re: (Score:2, Informative)

      by buckadude ( 926560 )
      Well, not exactly... FTA - "Key to achieving such speeds is the use of exotic cooling materials, primarily liquid nitrogen and liquid helium" But your point remains valid.
    • "Liquid helium, however, is much trickier -- and more dangerous -- to work with than liquid nitrogen and other more conventional coolants used by home overclockers, including water or air, said Davis."

      So you're free to use your N2 to blow up 2L bottles, shatter racquetballs and such.

    • by Ungrounded Lightning ( 62228 ) on Thursday April 23, 2009 @04:50PM (#27693993) Journal

      If you use that nitrogen indoors - for cooling or anything else - be sure to install an oxygen level alarm.

      A nitrogen leak will dilute the oxygen content of the air to the point that you'll pass out - then die - without noticing what is happening.

      Nitrogen is the bulk of normal air, so it has no smell. Your breathing is controlled by the CO2 level, not the oxygen content, so you don't notice when both are being diluted (and the dilution of the CO2 slows your breathing, exacerbating the problem with the oxygen level).

      This made evolutionary sense because the O2 and CO2 levels are normally related - CO2 goes up as oxygen is consumed - and the CO2 level starts from a low baseline and affects pH, making it FAR easier to detect. But it doesn't work very well when people start taking the atmosphere apart into its components and remixing them differently.

      • by Shadow of Eternity ( 795165 ) on Thursday April 23, 2009 @05:20PM (#27694395)

        Candle. If it goes out stick your head out a window.

      • I heard about this a while back, but it was in a proposal to use nitrogen gas for the death penalty.
      • by Twinbee ( 767046 )

        Is the 'passing out' phase instant or drawn out somewhat? In other words, would one be able to notice feeling as though they might pass out?

  • by mc1138 ( 718275 ) on Thursday April 23, 2009 @04:09PM (#27693353) Homepage
    With new devices on the horizon capable of recharging via heat [http://hardware.slashdot.org/article.pl?sid=09/04/20/1915223], how long till we're able to capture the heat from processors and use it to cut power requirements for computers exponentially?
    • You mean the same way that wasted heat grows exponentially with higher frequencies? ^^

    • by GaratNW ( 978516 ) on Thursday April 23, 2009 @05:01PM (#27694151)

      Oh my god! You killed entropy!
      You bastards!

    • by Ungrounded Lightning ( 62228 ) on Thursday April 23, 2009 @05:03PM (#27694175) Journal

      ... how long till we're able to capture the heat from processors and use it to cut power requirements for computers exponentially?

      Look up the second law of thermodynamics.

      Power goes in on the "work" side of the Carnot Cycle and comes out on the "heat" side. You can salvage a small percentage by running the heat through a heat engine on the way to the heat sink - more if you let the chip get hotter. But not a lot.
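      To put a rough number on it (my assumptions, not anything from TFA): the Carnot bound on recovering waste heat is 1 - T_cold/T_hot, so a die at around 90 C dumping into a 25 C room caps out near 18%, and real thermoelectric recovery gets only a small fraction of that. A quick sketch in Python:

def carnot_efficiency(t_hot_c, t_cold_c):
    """Upper bound on the fraction of heat any engine can turn back into work."""
    t_hot = t_hot_c + 273.15    # convert Celsius to kelvin
    t_cold = t_cold_c + 273.15
    return 1.0 - t_cold / t_hot

# Assumed temperatures for illustration: ~90 C die, ~25 C room.
eta = carnot_efficiency(90.0, 25.0)
print(f"Theoretical maximum recovery: {eta:.1%}")   # about 17.9%
# Real thermoelectric generators recover only a small slice of even this bound.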

      Further, the current technology can't stand being allowed to heat up - and its power consumption per unit of computation goes UP when it gets hotter. So even if you COULD put a bottleneck in the cooling (where you're normally spending more power to pump the heat away faster) to try to salvage some of the energy, you'll be running at a net loss.

      Now if somebody wants to use ceramic, high temperature metal alloys, and low work-function oxides to build integrated circuits based on vacuum-tube technology they might be able to get away with it. But electrons tend to be even larger and fuzzier in vacuum than in condensed matter so you might not be able to get your scale down to that of even current integrated circuits, limiting your speed due to signal propagation time.

      • That law explicitly applies only to closed systems. However, because there is no possible way to fully close any system, that law does not apply in full to approximately 100% of all applications.

        In other words, it is really useful as a rule of thumb for checking whether one's calculations are correct, but it cannot be used as a proof.

    • In my household we obey the laws of thermodynamics!

    • Idunno, you'd have to ask the manufacturer [lhup.edu]

  • AMD has been going belly up for so long now it was easy to write them off for dead. Yet, I'm tempted to pick up their stock. Has to do better than my NBFAQ.

    I think there's still some brand loyalty in Opteron - I love mine and I still think an Opteron will be my next pick of CPU.

    And the newest go-around of Ubuntu Linux has some new drivers for ATI cards that should improve matters there.

    A 7 GHz chip is a very healthy prize for AMD. I wouldn't expect them to advertise the power usage on such a thing, but hey, it's engineering; you can't have everything at once.

    I like AMD a lot, and I just hope they succeed. I know that Nehalem from Intel is a strong series of parts, and AMD has a lot of work to do, but the capital costs are so high in chipmaking that it is doubtful we would see another competitor to Intel emerge in a generation if AMD goes out.

    • Re: (Score:2, Insightful)

      by ausekilis ( 1513635 )
      I like AMD too, they've always been affordable, have pretty powerful chips, and amazing customer support. I picked up a Phenom 9550 around the time they hit the market, and either the mainboard or the CPU was flawed. I called them up, told them my trials, they sent me a new one within a week.

      Now, concerning the AMD/Intel battle that's going on: I'd have to say that Intel would be in for a bunch of monopoly lawsuits if AMD were to ever go belly up. It's really in their best interests to maintain competition.
      • by Penguinoflight ( 517245 ) on Thursday April 23, 2009 @04:33PM (#27693707) Journal

        I'd have to say that Intel would be in for a bunch of monopoly lawsuits if AMD were to ever go belly up. It's really in their best interests to maintain competition.

        That's really not true. Intel already maintains a monopoly-sized market share in CPUs, and they've been caught abusing it already (the Intel compiler disables a lot of optimizations if code built with it doesn't detect a genuine Intel CPU, for example). Maintaining that competition is still certainly in the best interest of the market, especially with AMD's subsidiary ATI being the only real competitor to nVidia as well.

        • by Gerzel ( 240421 ) *

          I disagree.

          Legislators/executive gov types will only act on monopolies when they are glaring and they can no longer afford not to act. Being the only chip maker would make things glaring enough that someone might just make a career out of that fact, whereas if Intel were not the only one, that same person would just be shouting hot air, since he'd have to make the case that they are a monopoly despite some apparent competition.

        • the intel compiler disables a lot of optimizations if code built on it doesn't detect a cpuid that it recognizes

          Fixed that for you.

          When that story broke years ago, Intel's compiler group didn't test with or target AMD chips, or most others like Cyrix. They also didn't use the feature bits that let a processor claim specific capabilities (like SSE, SSE2, MMX, etc.). The dispatch was purely CPUID-driven, because Intel knew that some chips (including their own) had bugs in their implementations of SSE and the like, so checking CPUID allowed them to work around those kinds of bugs.
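          For anyone who hasn't seen what that kind of vendor-string dispatch looks like, here's a toy sketch. It is not Intel's actual code; it just reads the vendor_id that CPUID leaf 0 reports, via /proc/cpuinfo (Linux only), and the "fast path"/"generic path" labels are made up for illustration:

# Toy illustration of dispatch by CPU vendor string, NOT the compiler's real logic.
# Linux-only: /proc/cpuinfo exposes the vendor_id reported by CPUID leaf 0.
def cpu_vendor():
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("vendor_id"):
                    return line.split(":", 1)[1].strip()
    except OSError:
        pass
    return "unknown"

vendor = cpu_vendor()
if vendor == "GenuineIntel":
    print("Vendor check passed: take the SSE-optimized code path.")
else:
    # An "AuthenticAMD" chip with full SSE support still lands here,
    # which is exactly the behaviour people objected to.
    print(f"Vendor is {vendor}: fall back to the generic code path.")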

          So, if you took code compiled back then and ran it on an AMD

        • I would disagree that the compiler disabling optimizations is anti-competitive.
          Simply put: if it is your CPU, apply your optimizations; if it is not, use the defaults. Frankly, the CPU could be AMD, Via, OpenCore, a Chinese knockoff, etc., not all of which support the same optimizations. Intel can only control their own CPUs; that is what their compiler is marketed for, designed for, etc.

          What if they left everything on and code compiled by the tool chain didn't load on a non Intel CPU? Even worse what if it caused

        • by nxtw ( 866177 )

          they've been caught abusing it already (the intel compiler disables a lot of optimizations if code built on it doesn't detect an intel genuine cpu, for example.)

          But the Intel compiler is not at all a monopoly in the x86 compiler market, and Intel has not done anything to discourage the use of other compilers on their CPUs or force people to use the Intel compiler on Intel CPUs.

          Compare to Microsoft, which *did* have a monopoly in the PC OS market and *did* use this position to unfairly promote their own prod

    • AMD has no dividend. INTC's is a nice phat 3.6%.

      Intel takes care of its owners; AMD scoffs at them. To me as an investor, that's more important than who has a slightly faster chip at a given point in time. If AMD starts paying its shareholders their money, maybe it will get some buying interest.

      Cisco and NVidia, you guys listening? Your shareholders are why you exist. Give us our money.

      • by Ramze ( 640788 )
        Your post is a bit simplistic. I'm not sure where you got the idea that providing dividends is "taking care of the shareholders."

        Really, most MBAs would tell you that providing a dividend is the exact opposite.

        Company management generally assumes that people invest in company stock for the long term in the hopes that the stock price will rise. This is known as "increasing shareholder value" which is the optimum theoretical prime motivation for every business decision. Also, the idea is that sharehold

    • And then there's that Intel cache overrun SMM code promotion bug [slashdot.org] we talked about yesterday. Unless AMD has an equivalent problem, Intel might be in trouble once the crackers get around to exploiting it against Windows boxen.

  • Honestly though (Score:5, Insightful)

    by moniker127 ( 1290002 ) on Thursday April 23, 2009 @04:16PM (#27693459)
    Who cares what kind of rates you can get with a vat of liquid nitrogen on the damned thing? You're not going to be using that for anything practical.
    • Re: (Score:2, Insightful)

      by Icegryphon ( 715550 )
      It just shows that it has the potential, which means that without liquid nitrogen it still has a lot of potential, just not as much.
      Even a small bump in overclocking can reduce huge jobs by hours (e.g. video encoding).
      OVERCLOCKER FTW!!!
      • Re: (Score:3, Insightful)

        by DiegoBravo ( 324012 )

        I disagree. For example, the Pentium 4 could be overclocked to 8 GHz, but that fact was of little practical use, so Intel dumped the architecture altogether.

        from http://en.wikipedia.org/wiki/Pentium_4 [wikipedia.org] :
        BEGIN EXCERPT --
        Overclockers did not break the 8 GHz barrier until the end of the Pentium 4 line on 3.0-3.6 GHz CPUs, which by then had a dwindling enthusiast user base.
        END --

        Honestly, GP was insightful.

    • Re:Honestly though (Score:5, Insightful)

      by Kjella ( 173770 ) on Thursday April 23, 2009 @04:46PM (#27693919) Homepage

      Who cares what kind of rates you can get with a vat of liquid nitrogen on the damned thing?

      It's usually more honest. Despite what their release schedule says, the CPU producers don't actually get speed increases in even 100 MHz steps. New architectures often make for big bumps, but if they maxed out immediately they could only sell the big bump once, and they don't want that. Sometimes they've got headroom, sometimes they're pushing the last MHz out of the chip to keep up a steady release of slightly faster processors for a healthy profit and a steady cash stream. These tests push the chips to their real maximum, making it very tough to throw up a marketing smokescreen. If your chip isn't overclocking well, or at all, you're in deep shit. This is basically just showing off that the architecture is good and has room to grow, nothing more.

  • Oblig. (Score:2, Funny)

    by Vertana ( 1094987 )

    Yeah, but can it run Windows 7? //Burn the Karma baby!

  • by AK Dave ( 1459433 ) on Thursday April 23, 2009 @04:21PM (#27693531)
    With that much helium for coolant, all your audio will sound like Alvin and the Chipmunks
  • Great, until... (Score:5, Insightful)

    by Rayeth ( 1335201 ) on Thursday April 23, 2009 @04:22PM (#27693541)
    you realize that most of the applications you use are actually constrained by something other than your CPU speed (probably memory bandwidth or hard drive write speed).
    • Overclocking usually bumps the FSB speed as well as the multiplier, so memory goes up, too.

      • Re: (Score:3, Insightful)

        I think they probably locked the memory clocks though, as most good overclocking motherboards will let you do. I'm not sure there actually exists RAM that will run stably at twice its normal clock speed, since on AMD chips the RAM speed is directly proportional to the CPU clock speed. The same probably goes for the PCI-E buses; those probably had to be locked as well.
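        The arithmetic behind that, with made-up numbers just to show why boards let you lock or re-divide the memory clock (the reference clock, multipliers, and divider below are illustrative, not the 955's actual values):

# Illustrative only: reference clock, multipliers and divider are assumed values.
ref_clock_mhz = 200       # base/reference clock
mem_divider = 8           # memory clock = CPU clock / divider (e.g. DDR2-800 at stock)

for cpu_multiplier in (16, 20, 24):          # stock, mild OC, heavy OC
    cpu_mhz = ref_clock_mhz * cpu_multiplier
    mem_mhz = cpu_mhz / mem_divider          # memory tracks the CPU clock...
    print(f"CPU {cpu_mhz} MHz -> memory {mem_mhz:.0f} MHz with a fixed divider")
# ...so at extreme CPU clocks you either raise the divider (or lock the memory
# speed) or push the RAM far beyond its rated clock.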
    • With home computers this is very true. Most server class machines have 10K+ rpm drives with a large cache on the drive and the controllers. Most home class computers do not have this.

      I upgraded my parents' PC's hard drive. It is an old Dell machine. RAM was maxed at 2GB, and the 40GB hard drive was 80% full (of pictures; I keep telling them to burn the pictures to CD/DVD, who needs 24GB of pictures?). I put in a 300GB drive. I got a call 3 days later asking what I did to the machine. I was thinking the repla

    • I noticed this on a system I put together a year ago - an Intel Skulltrail 8-core machine (2 quad-core Xeons) - really, really fast until it goes to disk, and then it's just as slow as any PC.

  • by jmorris42 ( 1458 ) * <{jmorris} {at} {beau.org}> on Thursday April 23, 2009 @04:33PM (#27693709)

    To me the takeaway is just how little progress chipmakers are making.

    Compare to the 1990s. x86 processors started the decade with the 80486 at 33 MHz and ended it with the Athlon at the 1 GHz mark, which was also doing more per clock, for even more improvement than pure clock ratings would indicate. Now, in the decade we are about to close out, we have managed to push that to around 3.5 GHz, and by the end of '10 we might hit 4 GHz and eight cores (for those willing to spend serious coin), but work per clock doesn't seem to have improved at all and if anything has even slid back a bit.

    RAM improvements have slowed down as well, probably because Windows' inability to get 64-bit editions widely deployed is limiting demand. The 1990s saw average RAM go from 1-4MB to 64-128MB. It has only been recently that 2GB sticks went from exotic server stuff to mainstream.

    Speed also isn't increasing as fast as capacity is growing. Compare how many seconds it would take a 1990-vintage 486 to write to every memory location vs. a modern machine. The same goes for disk access. Hibernation on a modern laptop is pretty much a dead issue, since the time to write the whole memory load to disk is unworkable.
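    Back-of-envelope numbers for that comparison (all figures below are rough assumptions, not measurements):

# Rough, assumed figures: RAM capacity and sustained memory write bandwidth.
systems = {
    "1990 486 (4 MB RAM, ~20 MB/s)": (4 * 2**20, 20 * 10**6),
    "2009 desktop (4 GB RAM, ~8 GB/s)": (4 * 2**30, 8 * 10**9),
}
for name, (capacity_bytes, bandwidth_bps) in systems.items():
    seconds = capacity_bytes / bandwidth_bps
    print(f"{name}: ~{seconds:.2f} s to touch every byte")
# Capacity grew ~1000x while bandwidth grew ~400x, so sweeping all of memory
# takes longer now; against disk bandwidth the gap is worse still, which is
# the hibernation complaint above.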

    • by Lord Ender ( 156273 ) on Thursday April 23, 2009 @04:39PM (#27693817) Homepage

      GPUs are where the real action is. Look at video games ten years ago. Then look at Left 4 Dead on a GTX280. WOW.

      • Very few people actually need to crunch numbers - which is pretty much all high GHz chips are any good for.
        Looking back to old PCs I built, I'd choose the CPU first - then just bits around it to make the CPU work (the endless procession of beige plastic boxes I randomly bought to house these machines still litters my family's attics, cast off over the years).
        Gaming is the only thing that needs power, and when building a gaming system the CPU requirement is "high enough for it not to be the bottleneck holding
      • Heh... It's funny you should mention Left 4 Dead. It's playable on an Athlon XP from 2003!

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      The MHz argument is lame. The difference between the top-end 486 of 1990 and a top-end 1 GHz Athlon is similar to the difference between that same top-end 1 GHz Athlon and a top-end Core i7 right now. That 1 GHz Athlon is one core, while a Core i7 (to set aside the vast improvement of a single core) has 4 and soon 6.

      MHz isn't everything, and it does generate a LOT of heat, which is why NetBurst didn't scale up to 10 GHz like Intel thought it would. I think you lack an understanding of how fast modern CPUs are

    • I personally am not doing anything with my computer that I didn't do 10 years ago. Play a few games, watch movies, listen to music, do some work. I could watch the same movies (encoded at the same quality), listen to the same music, and do the same work on my old machine that I do now. The only real driving force is games. Sure it's nice to be able to encode a movie in real time, but hasn't really given me any new abilities. And even most games that are out now don't require the top of the line machine
    • by i.of.the.storm ( 907783 ) on Thursday April 23, 2009 @04:50PM (#27693989) Homepage

      Are you serious? CPUs are doing a lot more per clock than they did in the past. In case you haven't noticed the sort of invisible 4 GHz wall that we've been staring up at for the past 4 or 5 years, clock speeds actually have stayed pretty constant, but raw performance as measured by benchmarks and such has been improving drastically - look at Core i7 benchmarks vs Core 2 Duo, or Phenom II vs Phenom vs Athlon X2. Really though, most people don't need more processing power than what a 2 GHz dual core provides, if that, so it seems like things aren't improving, but they really have been making significant strides each year.

      I do agree on the hibernation bit, though; it takes forever for my laptop's 3 GB to get written to disk. Now I just resort to sleep mode in Vista, which actually works, so it's not too big of an issue.

      • by jd ( 1658 )

        Yeah, but who trusts dhrystone benchmarks any more? I wanna see how the overclocked version does on LINPACK tests.

        • Ack! I'm being sandwiched by 4 digit UIDs! But seriously, I didn't even mention the worst case of bad performance per clock, the abomination that is the Pentium 4 Prescott.
    • Hmm, CPUs are no longer single-core though, and that's a major paradigm shift. We're just now retooling to have an easier time writing multi-core code; the hardware has pretty much evolved even faster than our software here. Even Windows 7 will still not be fully ready for this; it's more interesting to look toward Mac OS X 10.6.

      • by nxtw ( 866177 )

        Hmm, CPUs are no longer single-core though, and that's a major paradigm shift. We're just now retooling to have an easier time writing multi-core code; the hardware has pretty much evolved even faster than our software here. Even Windows 7 will still not be fully ready for this; it's more interesting to look toward Mac OS X 10.6.

        Windows (and OS X, and Linux) have been ready for multi-core systems for quite some time - since before multi-core x86 CPUs existed. SMP support is nothing new.

    • Sorry, but work per clock rose dramatically with the Core series of chips. And look at the improvement in the (GP)GPU sector.
      The best way to find out the work per clock is to look at what big-iron supercomputers use, because there nobody cares about the MHz; everybody cares about the work per unit of energy.

      But in general, you are right about there being a major slowdown.
      It is mainly because we have gotten closer to, and now reached, the physical limits for certain things. That's why we have multicore systems now.
      It will tak

    • but work per clock doesn't seem to have improved at all and if anything has even slid back a bit.

      What are you talking about? A processor sold today as mainstream beats the crap out of a processor that was mainstream 3 years ago at the same GHz, per core. Processor speeds have considerably improved even for single-threaded applications!

      Also, the ongoing miniaturization efforts are great, ensuring that processing power per watt goes up.

    • RAM improvements have slowed down as well, probably because Windows' inability to get 64-bit editions widely deployed is limiting demand.

      I'm going to have to disagree on the reason RAM capacities haven't increased with CPU speeds. I think the real reason is that the vast majority of PC users simply don't need to address more than 3GB of memory, as they are generally only surfing or writing something up in Word or Excel. There is very little benefit for them in having lots of RAM sitting idle (other than allowing them to run more malware before noticing the impact).

    • That is a fairly ignorant analysis.

      If you look at overall system performance, not just MHz, you will see that roughly the same performance increases have been maintained over the past 2 decades. Furthermore, for certain algorithms, things like GPUs obliterate that trend.

      If you were to plot performance per buck, you would see a dramatic ramp-up, nowhere near as dramatic during the 90s. That is, at the end of this decade, relatively speaking, you will be able to buy much more performance per dollar acro

    • I think you may have just contradicted yourself.

      1990-2000: 33MHz x 1 core to 1GHz x 1 core = 30x improvement

      2000-2010: 1GHz x 1 core to 4GHz x 8 cores = 32x improvement

      For linear tasks the new decade only brings 4x improvement, but for multitasking and multimedia, we are seeing 32x improvement.

  • While the summary read

    955 Black Edition

    I saw it to say

    955 Brick Edition

    Which I think is a CPU I would prefer to stay away from...

  • by SnarfQuest ( 469614 ) on Thursday April 23, 2009 @04:54PM (#27694071)

    Could someone help me? I just tried licking my processor, and now I can't get unstuck...

    • Re: (Score:3, Funny)

      by cjfs ( 1253208 )

      Could someone help me? I just tried licking my processor, and now I can't get unstuck...

      Sure, just turn on the computer and fire up SETI. It'll fix it right up :)

  • by GlobalColding ( 1239712 ) on Thursday April 23, 2009 @05:01PM (#27694159) Journal
    Oh Snap!
  • In not really wanting a faster machine? The last two decades, I've been eager to upgrade my machine, so I could get stuff done faster. Now... everything is fast enough. Compiles and even rendering take only moments. I can re-encode video faster than I play it. And I've been chased away from using PCs for games by all the bugs, patches, DRM, and expenses.

    Maybe once I upgrade from XP I'll desperately want a faster processor.
  • by Dunbal ( 464142 ) on Thursday April 23, 2009 @05:24PM (#27694447)

    From TFA:

    To cool a PC for 90 minutes requires 250 liters of liquid helium inside an aluminum vat the "size of a VW Beetle,"

          Once again the "technical" journalism community reminds us of that indispensable unit of volume measurement, the Volkswagen Beetle. As a purist, however, I must ask if that is in "new" Beetles or "old" Beetles.

  • The bigger issue (Score:5, Interesting)

    by MaXintosh ( 159753 ) on Thursday April 23, 2009 @05:32PM (#27694569)
    The bigger issue here is that cycles are getting cranked out faster than is useful (or we are getting to the point where an increase in speed is useless). Here's a little equation for you:
    (speed of light) * (1 / 7 GHz)
    That solves to about 4.28 cm. That's roughly 1.7 inches for people who don't speak metric. In the time the processor completes a single clock cycle, light in a vacuum can only travel 4.28 cm, and a signal on a circuit can't propagate any farther or faster than that.
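    If you want to check the arithmetic, a quick sketch:

# Distance light travels in a vacuum during one clock period.
# On-chip signals move slower still (some fraction of c), so this is an upper bound.
C = 299_792_458  # speed of light, m/s

for freq_ghz in (3.2, 7.0):
    period_s = 1.0 / (freq_ghz * 1e9)
    distance_cm = C * period_s * 100
    print(f"{freq_ghz} GHz: ~{distance_cm:.2f} cm per cycle")
# 7 GHz works out to about 4.28 cm (~1.7 in) per cycle.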
  • Does anybody know the details of the i7 running at 8.22 GHz in the overclocking record database?

    http://www.ripping.org/index.php [ripping.org]

    And how fast have people gotten these things going using water-based systems?

    • Re: (Score:2, Informative)

      by Microlith ( 54737 )

      Click the link, it's a P4. His i7 topped out at 5.6GHz.

      If anything could go that high, it'd be the P4. That ridiculously long pipeline is what they were designed for.

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...