AMD Trinity A10-4600M Processor Launched, Tested 182

MojoKid writes "AMD lifted the veil on their new Trinity A-Series mobile processor architecture today. Trinity has been reported as offering not only much-needed CPU performance enhancements in IPC (instructions per cycle) but also more of AMD's strength in gaming and multimedia horsepower, with an enhanced second-generation integrated Radeon HD graphics engine. AMD's A10-4600M quad-core chip is comprised of 1.3B transistors, with a CPU base core clock of 2.3GHz and Turbo Core speeds of up to 3.2GHz. The on-board Radeon HD 7660G graphics core is comprised of 384 Radeon Stream Processor cores clocked at 497MHz base and 686MHz Turbo. In the benchmarks, AMD's new Trinity A10 chip outpaces Intel's Ivy Bridge for gaming but can't hold a candle to it for standard compute workloads or video transcoding."
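As a back-of-the-envelope sketch (not an AMD figure), the clocks quoted in the summary imply roughly similar turbo headroom on the CPU and GPU sides:

```python
# Turbo headroom implied by the clocks in the summary above:
# CPU: 2.3 GHz base -> 3.2 GHz Turbo Core; GPU: 497 MHz base -> 686 MHz turbo.
def headroom_pct(base_mhz: float, turbo_mhz: float) -> float:
    """Percent clock increase from base to turbo."""
    return (turbo_mhz / base_mhz - 1) * 100

cpu = headroom_pct(2300, 3200)
gpu = headroom_pct(497, 686)
print(f"CPU turbo headroom: {cpu:.0f}%")  # ~39%
print(f"GPU turbo headroom: {gpu:.0f}%")  # ~38%
```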
This discussion has been archived. No new comments can be posted.
  • by sl4shd0rk ( 755837 ) on Tuesday May 15, 2012 @03:58PM (#40009175)

    That's really all that matters. I've always been an AMD fan, but if they can't pull out the same performance for less or equal price, they're done.

    • by confused one ( 671304 ) on Tuesday May 15, 2012 @04:14PM (#40009337)
      And they are, as long as you understand that they are not trying to compete at the level of a Core i7. If you need that kind of x86 performance you have one choice, Intel, and you will pay their premium-tier pricing to get it... AMD stumbled with the release of the FX series; hopefully they will remain competitive as they move forward.
      • Re: (Score:3, Interesting)

        by Lumpy ( 12016 )

        Sorry but the 8-core FX kicks the crud out of the quad-core i7 at the same clock speed. I actually USE a PC for video editing, rendering, and 3D rendering, and the new 8-core machine with one FX processor is kicking the arse of the i7 machine.

        Granted, I'm actually using multithreaded software unlike most people, but saying that the i7 is the be-all and end-all of computing performance is not true.

        • by Kjella ( 173770 )

          Sorry but the 8-core FX kicks the crud out of the quad-core i7 at the same clock speed. I actually USE a PC for video editing, rendering, and 3D rendering, and the new 8-core machine with one FX processor is kicking the arse of the i7 machine.

          Mind telling us what applications you use? Because the 3.6 GHz FX-8150 loses [anandtech.com] to the 3.5 GHz i7-3770K in all of these (or 4.2 GHz vs 3.9 GHz if you want to compare turbo speeds), sometimes massively:

          SYSMark 2012 - Video Creation
          SYSMark 2012 - 3D Modeling
          DivX encode
          x264 encode - first and second pass
          Windows Media Encoder 9
          3dsmax (7/7 benchmarks)
          CineBench R10 (single and multithreaded)
          POV-ray SMP benchmark
          Blender Character Render

          Of course if you take the slower FX-8120 and compare it to the same clockspeed i5-

    • by gl4ss ( 559668 )

      it beats intel(presumably more costly intel too) in gaming easily.

      thanks to intels shitty gpu.

      no surprises there, then.

      • What about when I use a more powerful, discrete graphics card?

        • Re: (Score:2, Insightful)

          by Anonymous Coward

          good luck cramming that into a tablet or 9" laptop.

          people under 30 don't use towers. tablets and notebooks. small notebooks.

          • by Nutria ( 679911 )

            people under 30 don't use towers.

            They do when their employer points at a cubicle and says, "Sit there. Use that PC."

            • An employer that provides a tower can go Intel. Most of the time, an Intel GMA (Graphics My Ass) is OK because the employer doesn't want the user playing 3D games on company time. In other cases, the employer provides a discrete card because it anticipates use for CAD, 3D graphic design, or video game development and testing.
              • An employer that provides a tower can go Intel.

                I disagree with this. The typical office user (non-engineer, non-programmer, non-graphics designer, non-audio/video designer... basically a web client operator, e-paper pusher, email reader, calendar checker, etc.) would be fine with a machine about as powerful as today's most powerful smartphones. I never understood, when it's time to replace aging hardware for the run-of-the-mill office worker, why companies always tend to go for middle/top-of-the-line boxes when it ends up being so much more power

          • by Lumpy ( 12016 ) on Tuesday May 15, 2012 @05:22PM (#40010261) Homepage

            "people under 30 who really don't do anything with their computers but websurf don't use towers. tablets and notebooks. small notebooks."

            Fixed that for you. Every person I know under 30 that actually uses a computer has a tower. They need to do things like render 3D GFX for static images or movies, high-end photography, video production. Even the CAD/CAM geeks have a tower.

            I know plenty of under-30 professionals who actually use a computer to the point that they need a tower. It seems you don't; you might want to hang around smarter people.

            • Web development, app development, graphics editing, audio editing. You have a circular argument -- or a No True Scotsman if you prefer that term.

              The list of computing tasks for which a powerful desktop machine is necessary is vastly smaller than the list of computing tasks, as evidenced by hardware sales. This trend is increasing. At some point large computers will be both expensive and rare, and I for one won't mind that if it means the end of fixing desktops. Users can send their tablets back to the manuf

            • I'm a professional software developer. I have an i5 laptop with built-in graphics, 8GB of memory, a couple of external displays, and a gigabit link to 2TB of NAS. Why would I need a tower?

              I don't game much anymore, and when I do most of it is on my tablet anyway. My laptop is perfectly respectable for doing office work, compiling large amounts of code, doing photography work, and hobbyist CAD work in sketchup. It decodes high def video mostly in hardware with minimal overhead.

              I have no desire for gaming

              • by cynyr ( 703126 )

                Swap that i5 out for a high-end 8-core AMD or quad-core i7, bump up to 16GB of RAM and an SSD, and see how much faster things compile... It does mean that you will have to have a highly parallel build system, but that's the price you pay.
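The "highly parallel build system" point above can be sketched in one line, assuming GNU make and coreutils (`nproc` reports the available cores):

```shell
#!/bin/sh
# Drive make with one job per available core (GNU make + coreutils assumed).
CORES=$(nproc)
echo "building with ${CORES} parallel jobs"
# Inside a real source tree you would then run:
#   make -j"${CORES}"
```

Whether this actually scales to all cores depends on how parallelizable the project's dependency graph is.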

              • by laffer1 ( 701823 )

                You value your time? A good desktop can blow away a laptop CPU. I think that will change as we get new AMD and Intel parts, since both seem to care less about performance than about targeting mobile devices.

                For software I build, an older desktop finishes 20 minutes earlier than the laptop. Both are AMD systems. I think you need to qualify what type of software development you do. It doesn't matter much if you write PHP code or small web apps in any language. Anything of real substance requires some oomph.

          • by cynyr ( 703126 )

            I'm under thirty, and have a desktop* as my main computer... I'm not ready to cut out the desktop for a laptop yet. I can't ever seem to get enough grunt in a laptop for even a 50% markup over a home built desktop.

            *as long as an AMD Phenom II 1055T X6 in an Asus mini-ITX board in a SilverStone SG05 counts as "desktop".

        • by Nutria ( 679911 )

          Or Linux, where ATI performance suffers compared to Nvidia

          I've been exclusively AMD+Nvidia since the K6-2 & Riva TNT2 days, but my next mobo will be Intel.

          I'm confused how this is true. All my XBMC boxes are Linux; some are AMD and some use NVidia for graphics (referring mostly to the Atoms, so all graphics is integrated). I have not seen any performance issues with the AMD drivers in Linux. These days both NVidia and AMD support seems to be pretty good unless you have a top-end, latest-model discrete GPU.
            • by Nutria ( 679911 )

              All I've ever read is how buggy the ati/amd drivers are and how support lags severely for kernels and cards and 3D is really slow. OTOH, the nvidia driver always supports the current stable kernel and cards.

              If I were making an xbmc box (which I wouldn't since the Linux-based Iomega 35045 is only $105) then I'd have an Atom CPU and a GeForce 210 and install the binary driver and vdpau libraries so that it will off-load video decoding.
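For reference, the VDPAU offload setup described above can be expressed as an mplayer config sketch (assuming the NVIDIA binary driver and libvdpau are installed; the codec list is illustrative and falls back to software decoding via the trailing comma):

```
# ~/.mplayer/config -- sketch for VDPAU decode offload on a GeForce 210
vo=vdpau                         # render through VDPAU
vc=ffh264vdpau,ffmpeg12vdpau,    # try VDPAU-accelerated H.264/MPEG-2 first
```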

              • by cynyr ( 703126 )

                so will the gforce 210 do DTS master audio over HDMI and does it have enough grunt for high bit rate 1080P h264 streams (35-40Mbps)?

                • by Nutria ( 679911 )

                  DTS master audio over HDMI

                  Dunno.

                  does it have enough grunt for high bit rate 1080P h264 streams

                  According to this [wikipedia.org] page, yes it can. However, I can't confirm it since I don't have any Blu-ray disks. (High-bitrate MP4s ripped from DVDs look perfectly fine on my 32" LCD, so I see no reason to buy BDs or a BD player.)

                  What I do know is that mplayer *never* breaks a sweat while playing "high RF" (maybe that's a term specific to Handbrake) MP4s.

              I have two PCs at home with discrete GPU cards. Both have a similar processor, the same amount and kind of memory, and very similar motherboards. One has an NVidia card, the other has an AMD. You can't tell the difference by using them. When running some physics simulation I found the AMD computer faster (although it has a slightly slower CPU), and when installing Debian the AMD GPU just worked (it still took some setup for the physics simulation), while I had to install the NVidia by hand (ok, just ch

              What happens when people are complaining about AMD's Linux support and praising nVidia is that they miss an important piece of the narrative. AMD releases their hardware specs/references (whatever, I forgot what they're called, and am too tired to google), while nVidia does not. Nouveau (the OSS nVidia driver) is a huge effort of reverse engineering that STILL does not support 3D acceleration, while the radeonhd drivers support legacy cards (and reasonably mature gear) with reasonable 3D performance (compared
                • by Nutria ( 679911 )

                  I wish someone would come up with a driver that would allow A-series APUs to run Linux reliably without graphical issues.

                  Which is why my next CPU will be Intel, since I don't want to pay for something I won't use. Not that I'll need one for quite a while. 4 CPUs and 8GB RAM is just... enough.

                  And to think that I was happy with a KayPro IV, Borland Pascal, Wordstar, an Anchor Signalman modem and a Star Gemini 10X printer...

                  "Stability and transparency are important", then radeonhd drivers are awesome.

                  Maybe I don't push the envelope, but the nvidia blob has been stable for me for ages.

                  Regarding transparency, sure I want it, but I accept that ideological purity is impossible to achieve, so I take 3/4 of

        • Go Intel in that case.

      • it beats intel(presumably more costly intel too) in gaming easily.

        No it doesn't. The summary says it does, then links to an article that says this:

        After the 3DMark results, you might be wondering if Intel has finally caught up to AMD in terms of integrated graphics performance. The answer is... yes and no. Depending on the game, there are times where a fast Ivy Bridge CPU with HD 4000 will actually beat out Trinity; there are also times where Intel's IGP really struggles to keep pace...

        • by mrjatsun ( 543322 ) on Tuesday May 15, 2012 @04:31PM (#40009569)

          > Ivy Bridge and Llano actually ended up 'tied

          Yes, but Llano is the *old* AMD processor ;-) Check the reviews for performance of a HD 4000 vs a Trinity.

          • by timeOday ( 582209 ) on Tuesday May 15, 2012 @05:14PM (#40010143)
            I shouldn't have quoted that second sentence about Llano, but the first sentence was specifically about Trinity. Here is the follow-on:

            This chart and the next chart will thus show a similar average increase in performance for Trinity, but the details in specific games are going to be different. Starting with Ivy Bridge and HD 4000, as with our earlier game charts we see there are some titles where Intel leads (Batman and Skyrim), a couple ties (DiRT 3 and Mass Effect 2), and the remainder of the games are faster on Trinity. Mafia II is close to our 10 percent "tie" range but comes in just above that mark, as do Left 4 Dead 2 and Metro 2033. The biggest gap is Civilization V, where Intel's various IGPs have never managed good performance; Trinity is nearly twice as fast as Ivy Bridge in that title. Overall, it's a 20% lead for Trinity vs. quad-core Ivy Bridge.

            So, AMD has the lead on average FPS, but it's now small enough that Intel wins in a few cases. AMD's integrated GPU is still a little better normally, but it's not a slam dunk any more.

        • by Anonymous Coward

          the Ivy Bridge (Asus N56VM) is $1200 (MSRP) to $1300 on some preorder site. pre launch marketing (heh) claimed that a Trinity lappy might be $600-700. who knows, tho.

        • by serviscope_minor ( 664417 ) on Tuesday May 15, 2012 @05:16PM (#40010185) Journal

          AMD's integrated GPU advantage is gone.

          That's also compared to the more expensive i7 part. There was no i5 or i3 comparison.

    Assuming we're not including a discrete graphics card: if you want gaming performance, AMD wins. If you want video encoding or photo editing performance, Intel wins. For most people who have PCs, it doesn't matter, because the CPU and graphics are already fast enough for anything they're going to do on it.

      Personally, I'm going with an Ivy Bridge, nVidia 680 GTX combo. If I was going for a single chip solution, I would probably go with AMD.

      • Personally, I'm going with an Ivy Bridge, nVidia 680 GTX combo. If I was going for a single chip solution, I would probably go with AMD.

        I bet if you were going for an energy-efficient solution, you'd probably also go with AMD... unless you didn't mind embarrassing performance and feeling like it's the year 2000 all over again.

      • by cynyr ( 703126 )

        Next rig is looking like intel (as soon as they start giving me sata3 and USB3 only), on mini-itx. AMD seems to not care about the form factor at all. Granted in either case it will have an nvidia GPU because I like working graphics in wine and linux.

    • by asliarun ( 636603 ) on Tuesday May 15, 2012 @04:23PM (#40009457)

      That's really all that matters. I've always been an AMD fan, but if they can't pull out the same performance for less or equal price, they're done.

      IMO, the Trinity is a truly compelling offering from AMD, after a long, long time. Yes, it trades lower CPU int/float performance for higher GPU performance when compared to Ivy Bridge, but this tradeoff makes it a very attractive choice for someone who wants a cheap to mid-priced laptop that gives you decent performance and decent battery life while still letting you play the latest bunch of games at low settings. It's hitting the sweet spot for laptops as far as I am concerned. I'm also fairly sure it will be priced about a hundred bucks cheaper than a comparable Ivy Bridge - that's how AMD has traditionally competed. Hats off to AMD for getting their CPU performance to somewhat competitive levels while still maintaining the lead against the massively improved GPU of the Ivy Bridge. All this while they're still at 32nm while Ivy Bridge is at 22nm.

      Having said that, what I am equally excited about is the hope that Intel will come up with Bay Trail, their 22nm Atom that I strongly suspect will feature a similar graphics core that is there in Ivy Bridge. Intel has always led with performance and stability, not with power efficiency and price, so they need to create something that genuinely beats the ARM design, at least in the tablet space if not in the cellphone space.

    • by Kjella ( 173770 ) on Tuesday May 15, 2012 @04:35PM (#40009625) Homepage

      Well, they'll sell them at the prices that they sell at, it's not like a CPU ever has a negative margin. The question is if that's good enough in the long run to keep making new designs and break even. Particularly as Intel is making a ton of money on processors that AMD can't compete against. Their Ivy Bridge processors should cost about 75% of a Sandy Bridge but sell for 98% of the price. Intel now has huge margins because AMD can't keep the pressure up, it's not really helping AMD to surrender the high end because it only gives Intel a bigger war chest.

      This launch is okay; it's all-around much better than Llano and keeps a fair pace with Intel, but it obviously tops out if you want CPU performance. What will be interesting to see is next year, when Intel will have both a completely new architecture for the Atom and be on their best process technology. Then I fear AMD may be seeing the two-front war again, both on the high and low end. Right now the Atom is a little too gimped to actually threaten AMD's offerings. I expect Intel just wants AMD crippled, not killed, though, to avoid antitrust regulations, so I think they'll be around while Intel makes all the money.

  • by Ancil ( 622971 ) on Tuesday May 15, 2012 @04:04PM (#40009239)
    They should do that the other way round.
  • by Anonymous Coward on Tuesday May 15, 2012 @04:13PM (#40009323)

    I've seen a lot of reviews of various laptops that have missed the most important metric in this competition - Price!

    What's been common in all reviews is that only the very top-end Intel "integrated" (no separate, discrete GPU) solutions have been competitive with the new Fusion products. We're talking mobile i7s. I don't know if you've priced laptops lately, but i7s are only found in expensive, high-end systems.

    The Fusion APUs are nowhere near that expensive. Price-wise, they should be compared to i3s or "Pentium" mobile CPUs, where they will win quite handily!

    It turns out that AMD's 'APU' solutions have been very popular with low-end device makers and AMD sells them by the boatload. What's impressed me, however, is how much Intel has improved their GPU in Ivy Bridge. It's always been garbage before, but now it's starting to be something you could call 'low end'.

  • I don't transcode and my Excel sheets aren't that complicated. I suspect that most people are like me: we do basic work and play a game or two. I play TF2 on my laptop; it's a 3-year-old laptop with a new SSD. Plays fine. I can't think of the last time that I was truly CPU limited. I've been GPU limited since Crysis. I can't play that beyond low detail level.

    • I got my wife an Acer 10.6-inch thing somewhere between a nettop and a laptop. She loves it. That little AMD 350 CPU pulls 9 watts of power, so this little thing has great battery life (about 7 hours for our usage). Plays video fine, since it has a decent video chip (not great) built into the CPU. No heat, no loud fans that kick on all the time. She really digs it. Not bad for a $350 laptop at Costco.

  • by chrysrobyn ( 106763 ) on Tuesday May 15, 2012 @04:16PM (#40009373)

    From what I've read, on CPU tasks it's between an i3 and an i5. An i3 is "fast enough" for most general use, so I think that's pretty good. On GPU tasks, it's significantly faster than Intel's integrated chipsets, knocking on the door of respectable gaming performance if not walking into the room.

    If you're doing CPU tasks, you really want the i7. If you're doing hard core gaming, you're also going to want the latest generation video card, even if it's an entry model. If your budget is less than $700 and you still want to play video games, Trinity is a good compromise. I think it's perfect for college students.

    • by Rinikusu ( 28164 )

      Honestly, since ditching my desktop, I've been loving my A-series A8 based laptop (upgraded it from an A4). I get respectable gaming performance, and it's perfectly fine for my music and media creation, although I will say that if I were a music and media pro I'd probably fork out the dollars for a real rig. It does everything I need it to do decent, the price was certainly right, and for anyone looking in the $500 laptop market that needs some graphics ability and isn't crunching a lot of numbers (i.e.,

    • I'd trade GPU for CPU any day. My 5 year old laptop's CPU is good enough most of the time.
  • What I find most impressive of AMD's APUs is that they made basic gaming on sub-$400 laptops possible.
  • So far I have seen no mention of it, but would this not make a great HTPC platform?
    Very low powered CPU but a tank of a GPU sounds great to me... Especially when your box is idling.

    Any thoughts from someone more knowledgeable? I'm still like 5 generations behind, running an AMD X2 5200+.

    • Re:HTPC (Score:5, Informative)

      by Rockoon ( 1252108 ) on Tuesday May 15, 2012 @07:08PM (#40011367)
      All of AMD's A-series processors make a great HTPC platform. It's been over a year now with Intel not offering any real competition at all in this segment once price is factored in. You can trivially get a full 65W A-series HTPC box up and running for under $150 with lots of headroom (that's the price I would quote to friends/coworkers and pocket the difference as labor costs). The higher-end A-series (100W) are only necessary if you are gaming.

      Some might say that Intel Atom solutions are price-competitive with the A-series, but the Atom solutions, just like AMD's low-powered E-series lineup, really only work well for HTPC as long as 100% of your needed video codecs use GPU acceleration. If the Atom is good enough, then an E-series of the same price will be a bit better as well. It's hard to guarantee that all the codecs you will be using will be GPU accelerated, especially so if you are on a Linux distro, so the E-series and Atoms are not really a solution that I recommend.
      • It's hard to guarantee that all the codecs you will be using will be GPU accelerated, especially so if you are on a Linux distro, so the E-series and Atoms are not really a solution that I recommend.

        I do use Linux. Gentoo, to be precise. I am not sure what you mean when referring to Linux there. Could you please clarify?

        Do you have any recommendations as I am looking at building an HTPC. No capture or anything fancy, I have a huge media Library on my main PC I will mount over NFS. I had an idea to build a "retro" system into an old VHS player I have shoved in the basement. Space is very adequate, I could probably pack a full ATX PSU in there with plenty of room to spare with an ITX board. Looking to sp

  • 246 mm² die size! That sucker's big! Ivy Bridge was big already at 166 mm²... Wonder what the pricing will be...
