NVIDIA Announces GeForce GTX 560M and GT 520MX Mobile GPUs

MojoKid writes "NVIDIA just took the wraps off a couple of new mobile GPUs at Computex and announced a slew of notebook designs that will feature the new chips. The new GeForce GTX 560M and GT 520MX will be arriving very soon in notebooks from Asus, Alienware, Clevo, Toshiba, MSI, Samsung and others. The GeForce GT 520MX is an entry-level DirectX 11 GPU designed for thin, light, highly mobile platforms. It sports 48 CUDA cores with a 900MHz graphics clock, 1800MHz shader clock, and 900MHz memory clock. Decidedly more powerful, the GeForce GTX 560M is outfitted with 192 CUDA cores and clocks in at 775MHz, with 1550MHz shaders and 1250MHz GDDR5 memory."
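
For readers who want to see how figures like these show up on real hardware, here is a minimal sketch (not from the submission; it assumes a machine with the CUDA toolkit installed and builds with nvcc) that queries the same properties through the CUDA runtime:

    /* Minimal CUDA runtime query: prints SM count, an estimated CUDA core count,
       and the clocks the driver reports.  Note that on Fermi parts clockRate is
       the shader clock (e.g. ~1550 MHz for a GTX 560M), not the graphics clock. */
    #include <stdio.h>
    #include <cuda_runtime.h>

    int main(void)
    {
        int count = 0;
        cudaGetDeviceCount(&count);

        for (int i = 0; i < count; ++i) {
            struct cudaDeviceProp p;
            cudaGetDeviceProperties(&p, i);

            /* Compute-capability 2.1 chips (GF116, GF119) have 48 cores per SM,
               so 4 SMs -> 192 cores (GTX 560M) and 1 SM -> 48 cores (GT 520MX). */
            int cores_per_sm = (p.major == 2 && p.minor == 1) ? 48 : 32;

            printf("%s: %d SMs (~%d CUDA cores), shader clock %d MHz, memory clock %d MHz\n",
                   p.name, p.multiProcessorCount,
                   p.multiProcessorCount * cores_per_sm,
                   p.clockRate / 1000,          /* reported in kHz */
                   p.memoryClockRate / 1000);   /* reported in kHz */
        }
        return 0;
    }
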
  • by Anonymous Coward

    How does this compare to Intel's and AMD's CPUs with integrated graphics?

    • Re: (Score:1, Interesting)

      by Anonymous Coward

      How does this compare to Intel's and AMD's CPUs with integrated graphics?

      The 560M is way higher-end than integrated graphics. It's on par with desktop performance; without benchmarks yet, I'd guess that it's probably similar to the non-mobile GTX 560.

    • Comment removed based on user account deletion
      • I think NVidia should do an x64 chip. The patents on x86 have mostly expired now, and AMD have said they will license the x64 extension to anyone; they've already done so for Transmeta and Via.

        An Atom class x64 chip would mean they could do a combined CPU/GPU.

        The other option would be to buy Via, who've already got an x86 licence. Or even just team up with them to put NVidia GPUs on the same die as Via CPUs, which would actually be an interesting combination. You could scale the performance from Intel Atom to AMD B

      • by Nursie ( 632944 )

        CUDA and OpenCL are a niche, but one that's increasing in importance for scientific computing. We're already seeing nVidia components in the top 10 supercomputers list.

        However, I doubt very much that they're going anywhere in the next five years. The capabilities of the GPU are on the cusp of being exploited in ordinary (i.e. not super) computing and can only grow.

        They're attacking the high-end market with Tesla/Fermi, the mainstream and mobile sectors are well catered to with cutting-edge GPUs, and ION addresses net

  • But will it run DNF?
    • by Anonymous Coward
      I should hope so; DNF is running on Unreal Engine 3, and my 1GB GTX 260 mobile can run all those games at 1080p at 60fps. In fact, that card can play everything except Metro 2033 at full resolution with all of the pretty effects. I could be wrong, but any computer built in the last 5 years with a dedicated video card should be able to run any game that is also on a console. The most you would have to do with an nvidia laptop/mobile dedicated GPU is turn down anti-aliasing and shut off vsync. I did notice som
  • by pinkushun ( 1467193 ) * on Monday May 30, 2011 @07:22AM (#36285422) Journal

    "Intel does provide development drivers for Intel graphics to the open source community."

    +1 :D

    http://www.intel.com/support/graphics/sb/cs-010512.htm?wapkw=(linux) [intel.com]

    • What, exactly, does this have to do with the article? AMD/ATI also provides drivers for the open source community.

      The thing is, of the three big graphics vendors, only NVIDIA offers reasonably complete OpenGL support, which makes the other two's cards non-starters for us CAD users. I run SketchUp on Wine [blogspot.com]

      • I don't know, I can't begin to imagine what open source drivers for new video cards could possibly have to do with an article about those new video cards on a website full of open source users. Very mysterious.

        On a tangential note, my desktop machine is NVIDIA but my laptop GPU is ATI, and CUDA on the ATI is broken; it was just a waste of money, because AMD/ATI's website just sends me to the laptop manufacturer's website to supposedly get the drivers, but the drivers on HP's website that they point me to, ju

        • I can't tell if you're being sarcastic or not, but CUDA is a proprietary standard that only works on Nvidia GPUs.
          • Sorry, I meant OpenCL, I was typing in a hurry and momentarily confused. The card is a Mobility FireGL V5700. Most recently I was trying to get rpcminer-opencl working for bitcoin mining, to no avail. The AMD site just tells me I must download the drivers from HP. I've downloaded and installed the drivers from HP's site and they don't work.

            • I mean, they do work in the sense that my display card is working, but GPU OpenCL apps do not work.

            • What's your laptop's model name? Also, what OS are you running on that machine? Since the FireGL V5700 is based on the HD 3650M, it might be possible to mod the drivers to work on your computer.
            • That's funny. I have poclbm running on my Mobility Radeon 5830, and on both of my desktop's Radeon 6950s (unlocked to 6970s and overclocked).

              Perhaps you need to install the OpenCL dev kit or something? Because if the consumer level parts can run the miners, the pro level cards should have no problem. Try downloading the drivers from AMD and not HP: http://support.amd.com/us/gpudownload/windows/Pages/radeonmob_win7-64.aspx [amd.com]

              I don't know for sure, though.

              • Hmm, thanks, have tried installing both the Stream SDK and the AMD driver, still doesn't work. GPU caps viewer recognizes the GPU but says I have no OpenCL GPU devices.
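
                In case it helps anyone hitting the same wall, this little enumerator (just a sketch using the stock OpenCL headers, nothing specific to my machine; build with gcc file.c -lOpenCL) shows whether the ICD loader actually exposes a GPU device at all:

                /* Lists every OpenCL platform and any GPU devices it reports.
                   If the card shows up in Caps Viewer but not here, the vendor
                   ICD (the runtime from the Stream SDK / Catalyst package) is
                   probably not registered on the system. */
                #include <stdio.h>
                #include <CL/cl.h>

                int main(void)
                {
                    cl_platform_id plats[8];
                    cl_uint np = 0;
                    clGetPlatformIDs(8, plats, &np);
                    printf("%u OpenCL platform(s)\n", np);

                    for (cl_uint p = 0; p < np; ++p) {
                        char name[256];
                        cl_device_id devs[8];
                        cl_uint nd = 0;

                        clGetPlatformInfo(plats[p], CL_PLATFORM_NAME,
                                          sizeof(name), name, NULL);
                        cl_int err = clGetDeviceIDs(plats[p], CL_DEVICE_TYPE_GPU,
                                                    8, devs, &nd);
                        printf("  %s: %u GPU device(s)\n", name,
                               err == CL_SUCCESS ? nd : 0);

                        for (cl_uint d = 0; err == CL_SUCCESS && d < nd; ++d) {
                            clGetDeviceInfo(devs[d], CL_DEVICE_NAME,
                                            sizeof(name), name, NULL);
                            printf("    %s\n", name);
                        }
                    }
                    return 0;
                }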

            • by Nursie ( 632944 )

              OpenCL on nVidia seems broken too, at least under Linux.

              I can get pure OpenCL stuff working, but the moment I try to use the CL/GL interop stuff it just stops working, despite the card reporting all the right capabilities.

              Makes it less useful for me, as I wanted to use CL to draw GL textures. Maybe I should try CUDA instead; they're supposed to be similar.
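
              For reference, this is roughly the interop setup I mean: a trimmed sketch of the Linux/GLX path using the standard cl_khr_gl_sharing properties (the helper name is mine, error handling is stripped, and a current GL context must already be bound on the calling thread):

              #include <GL/glx.h>
              #include <CL/cl.h>
              #include <CL/cl_gl.h>

              /* Build an OpenCL context that shares objects with the current GL context. */
              cl_context create_shared_context(cl_platform_id platform, cl_device_id device)
              {
                  cl_context_properties props[] = {
                      CL_GL_CONTEXT_KHR,   (cl_context_properties) glXGetCurrentContext(),
                      CL_GLX_DISPLAY_KHR,  (cl_context_properties) glXGetCurrentDisplay(),
                      CL_CONTEXT_PLATFORM, (cl_context_properties) platform,
                      0
                  };
                  cl_int err = CL_SUCCESS;
                  cl_context ctx = clCreateContext(props, 1, &device, NULL, NULL, &err);
                  return err == CL_SUCCESS ? ctx : NULL;
              }

              Once that context exists, the GL texture gets wrapped with clCreateFromGLTexture2D, and every kernel that touches it has to be bracketed by clEnqueueAcquireGLObjects / clEnqueueReleaseGLObjects.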

      • by Yvanhoe ( 564877 )
        No, they provide drivers for Linux, but they don't provide open source drivers; only Intel does: http://intellinuxgraphics.org/license.html [intellinuxgraphics.org]

        For that, they deserve all the publicity they can get from the community.
    • ATI, Nvidia, and Intel all provide open source drivers for Linux. However, they don't all provide the same amount of support. For example, Intel and ATI support for VAAPI is not as strong as NVidia's [mythtv.org].
  • Is it guaranteed to fry my laptop like the last mobile NVidia chipset I bought? (140NVS)

    • I've just had a quick look at the nVidia GPU comparison page over at wikipedia, and the NVS 140M sports the infamous G86M chip, which was known to be faulty. You remember all the flak nVidia got due to the 8400M/8600M fiasco, don't you?
      • Of course I do; some of us were unfortunate enough to have that piece of shit hardwired into our laptops. Being a ThinkPad user, I wasn't even in the category of users who were offered the insulting replacement laptop.

        I think NVidia needs to be reminded of this for a long long time.

        • by rhook ( 943951 )

          No, you were in a category of users who were offered a free mainboard replacement, even if your warranty had expired. IIRC Lenovo is still repairing systems that had one of these defective chips, free of charge.

          • No, the program officially stopped a month before my laptop died, and even though it was extended, the extension ended the week before that.

      • Ah, still having that problem, are they? Every G92 from that generation that I've laid eyes on has died too, in every friend's machine that had one. My own 2007 MBP had the chip fail as well; luckily Apple honoured their fit-for-purpose recall and replaced it for free. Glad I swapped to ATI for the desktop after my G80 GTX started to breathe its last breaths last year, though I think that one is suffering memory death rather than chip failure and, unlike all the G92s I've seen, it still works (barely).
      • I've just had a quick look at the nVidia GPU comparison page over at wikipedia, and the NVS 140M sports the infamous G86M chip, which was known to be faulty. You remember all the flak nVidia got due to the 8400M/8600M fiasco, don't you?

        8400M/8600M fiasco [wikipedia.org]:
        "Some chips of the GeForce 8 series (concretely those from the G84 and G86 series) may suffer from an overheating problem. NVIDIA states this issue should not affect many chips,[37] whereas others assert that all of the chips in these series are potentially affected.[37] NVIDIA CEO Jen-Hsun Huang and CFO Marvin Burkett were involved in a lawsuit filed on September 9, 2008 alleging that their knowledge of the flaw, and their intent to hide it, resulted in NVIDIA losing 31% on the stock m

  • by WaroDaBeast ( 1211048 ) on Monday May 30, 2011 @07:38AM (#36285476)
    Bah, the GTX 560M is just a refresh of the GTX 460M. It sports the GF116 chip instead of the GF106 and has got higher clocks, but that's all. *shrugs*
    • Yeah, there really isn't much difference between the 400 and 500 series. I think this list shows it best - Link [wikipedia.org]

      E.g. the GT 530 is the GT 430; not just like it, it is the same card. Further down the list, in the mobile series, the 540M is the 435M just clocked a bit higher, etc. It's pretty much the same for the entire 500 series range: it's internally the same as the 400 series, with minor improvements if any.

      What's interesting is that often the same 500 series card has a higher sub-model number than the equivalent 400 s

  • How come every time /. reports something on new NVidia cards, it's about crappy mobile versions? I'm interested in true computing power!

    • by Hadlock ( 143607 )

      I wouldn't mind a direct comparison to their desktop counterparts, either. It's always so hard to equate a mobile version to a more often-benchmarked desktop part.

  • It seems to me that the sheer power of this mobile thing overshadows the performance of my entire Centrino laptop :/

  • How many Mhashes per second does it get?
  • Now that we have that out of the way, how about you guys fix the bugs in the nvidia-96 driver for Linux? You know, the one that calls for xorg wrong?
  • by Behemoth ( 4137 ) on Monday May 30, 2011 @10:30AM (#36286458) Homepage

    Be wary of the 'Optimus' technology on the Linux side. I didn't do due diligence and impetuously ordered a new laptop from Dell with an nVidia card (GT 525M). It turns out there was no way in the Dell laptop to turn it off, and Linux couldn't see the nVidia card, just the intermediating Intel card. The 'automatic graphics switching' is done in software and only under Win7. End result: no OpenGL under Linux. End-end result: I sent it back.

    There is a project to get Optimus working on Linux (https://github.com/MrMEEE/bumblebee), but I really don't have time, and the switching has to be done manually at the moment.

    • by Anonymous Coward

      Be wary of the 'Optimus' technology on the Linux side. I didn't do due diligence and impetuously ordered a new laptop from Dell with an nVidia card (GT 525M). It turns out there was no way in the Dell laptop to turn it off, and Linux couldn't see the nVidia card, just the intermediating Intel card. The 'automatic graphics switching' is done in software and only under Win7. End result: no OpenGL under Linux. End-end result: I sent it back.

      There is a project to get Optimus working on Linux (https://github.com/MrMEEE/bumblebee), but I really don't have time, and the switching has to be done manually at the moment.

      In our Latitude E6420 series of machines, you can disable Optimus via a BIOS switch, and it will forget the Intel card exists... not sure if you really had a hard-on to use Optimus (we certainly don't), but there is a switch to turn it off.

      • by Behemoth ( 4137 )

        I'll take a look at it. I have no use for Optimus - I just need a laptop with basic CUDA support. The model I unwittingly got was an Inspiron, and there didn't seem to be any way to disable Optimus. I'd love to be wrong (no RMA until after the holidays, so it's not out the door yet), but I saw no options in the BIOS. I'm used to not being able to use the bleeding-edge features in Linux; I'm just not used to being shut out completely, at least by nVidia products.

        Thanks m(__)m

  • Why would I upgrade when this is ~8900 less than my current 9400m?

    /averageuser
