AMD's Kaveri APU Debuts With GCN-based Radeon Graphics 123
crookedvulture writes "AMD's next-generation Kaveri APU is now available, and the first reviews have hit the web. The chip combines updated Steamroller CPU cores with integrated graphics based on the latest Radeon graphics cards. It's also infused with a dedicated TrueAudio DSP, a faster memory interface, and several features that fall under AMD's Heterogeneous System Architecture for mixed-mode computing. As expected, the APU's graphics performance is excellent; even the entry-level, $119 A8-7600 is capable of playing Battlefield 4 at 1080p with medium detail settings. But the powerful GPU doesn't always translate to superior performance in OpenCL-accelerated applications, where comparable Intel chips are very competitive. Intel still has an advantage in power efficiency and raw CPU performance, too. Kaveri's CPU cores are certainly an improvement over the previous generation of Richland chips, but they can't match the per-thread throughput of Intel's rival Haswell CPUs. In the end, Kaveri's appeal largely rests on whether the integrated graphics are fast enough for your needs. Serious gamers are better off with discrete GPUs, but more casual players can benefit from the extra Radeon horsepower. Eventually, HSA-enabled applications may benefit, as well."
How about competition on price? (Score:3)
Re: (Score:1)
Re: (Score:2, Insightful)
It's $200 cheaper than an i3 4330? That's pretty impressive given that the i3 is $130, are AMD going to refund me $70 for buying their CPU?
Re: (Score:2)
It's $200 cheaper than an i3 4330? That's pretty impressive given that the i3 is $130, are AMD going to refund me $70 for buying their CPU?
If so, I think I'm going to buy enough CPUs to retire early.
Re: (Score:2)
Only way to properly compare the pricing is to include mobo, cpu, ram and gpu.
No other way around that.
Re: (Score:2, Troll)
And the graphics card is free for you?
Re:How about competition on price? (Score:4, Insightful)
Comment removed (Score:5, Interesting)
Re: (Score:1)
My only gripe with AMD is their quest to hit that higher perf at the expense of power consumption. The TDP on some of their chips are nuts.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Indeed, at the end of the day what matters is the 80%+ of users - i.e. Average Joes.
Very few people purchase the very top of the line, it makes absolutely no sense to pay triple for marginal gains - even if comparing within the intel brand. For a very few people it does make sense however.
We use quite a few AMD products in our DC - they are very solid, and very nice performance to price ratio.
Because AMD is not as much used, we don't have the multitude of choices, but the highest end difference is:
Intel
Re: (Score:2, Insightful)
Re:How about competition on price? (Score:4, Insightful)
For a lot of applications, per-core performance is what matters. And for the last few years, Intel beats AMD hands-down on per-core performance. As in 30-50% faster. That i3 for $200 is going to run rings around the AMD for $200. For a lot of single-threaded programs (many games are CPU-bound by a single thread), that 30-50% faster speed matters.
However, if your application is multi-threaded and the problem you are trying to solve (media transcoding) is easily done in parallel, then the AMD chips are a better fit.
The "Bulldozer" architecture was a dud. Lots of cores for cheap, but low performance per core under a lot of workloads. The Piledriver architecture is better and AMD is at least somewhat competitive again.
I'm very curious to see how well the new Steamroller (Kaveri) series chips perform.
Re: (Score:2)
Re: (Score:2)
Damn right!
And this goes for servers as well.
The bulk of nodes we sell are *first gen* ATOM. Yeah, first gen. And they idle mostly, since our workload is I/O intensive, not CPU.
Even the high-end gear we purchase is 2 gens old - but it's still higher end than our last high-end gear.
A 1U dual quad-core Xeon L5520 with 72GB RAM and room for 4x3.5" drives for less than $500? Who could resist that, when we used to pay close to $200 per month for a Xeon W3560 with 32GB RAM and 2x2TB drives.
And the CPUs offer more power than th
Re: (Score:3, Insightful)
Benchmarks show that for pure CPU-intensive tasks, the A10 APUs are roughly comparable to Haswell Core i3s (the entry-level ones, at least). The i3-4150 costs $130-140, and the last-generation A10-6800K dropped to $130-140. The new A10-7850K is listed for $189 on Newegg. Considering this, the new A10-7850K is not very enticing at all. It's not even convincingly faster than the A10-6800K, with the current drivers at least. AMD hinted that the new A10-7850K's graphics performance will be on par with the Radeon HD 7730
Re: (Score:2)
How about the Intel Pentium G3220? It's Haswell, socket 1150, low power and nearly half the price of the i3.
Re: (Score:3)
Yes. To add insult to injury, the G3220 is priced at $69 on Newegg right now. It's basically a slightly lower clocked i3 without hyperthreading. If you don't play games or edit multimedia, then that's all you really need in an entry-level desktop. Add a $100 video card, and it will probably run games at a faster frame rate than AMD's $189 A10 Kaveri.
But also fully unlocked (Score:2, Interesting)
With the AMD chips you also get full virtualization, encryption, ECC RAM and overclocking.
Capable of Playing - worthless statement (Score:2)
You could play BF4 on a 300 MHz processor as long as you had enough memory; it would just look like a slide show.
Re:Capable of Playing - worthless statement (Score:5, Informative)
Most people, when they say "capable of playing", mean that it can actually be played on those settings, i.e. that the frame rate is high enough for the game to be considered playable. Generally, this means an average frame rate of ~30 fps and minimums of 20 or more (although that depends a bit on the reviewer; some people consider a frame rate of 30 totally unplayable, while personally I find anything above 20 still playable).
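The rule of thumb above can be sketched as a tiny function; the 30 fps average / 20 fps minimum floors come straight from the comment and are admittedly subjective:

```python
def playable(avg_fps, min_fps, avg_floor=30.0, min_floor=20.0):
    """Rough playability test using the thresholds from the comment:
    ~30 fps average with minimums of 20 or more. Both floors are
    subjective; some reviewers demand far more."""
    return avg_fps >= avg_floor and min_fps >= min_floor

# A run averaging 35 fps with 24 fps minimums passes; dropping the
# minimums to 15 fps fails even though the average is unchanged.
```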
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Cooling a mini-ITX board should be easy. It's just a matter of the right case and cooling solution. You can even use a full ATX case if you really wanted to (the holes will line up). Not much you can do about a laptop.
Re: (Score:2)
Except that in this case "X settings" is 1080p30. It may be low quality otherwise, but it meets your requirement.
Looked into it for a friend's build (Score:4, Interesting)
I'm helping a friend with a custom, low-cost gaming machine. We'd looked into using an APU, and I looked into it again today when I saw this. The gaming performance just isn't there yet. They're fine for regular desktop use, but even the top-of-the-line one can't handle gaming.
The two things that could still be useful are GPGPU, and dual graphics. Having an on-chip GPU just for compute purposes, especially with all the enhancements they've added, would be very useful if more things used GPU compute, but it just wasn't worth it for this build and this user. And they have talked a bit about using both the integrated GPU and a discrete graphics card in tandem, similar to using two GPUs in Crossfire, but they haven't released the drivers for it, nor listed which cards will work, and the card they chose to demo it with was their bottom-end graphics card. Given all that, and that a similar CPU without the integrated graphics was about half the price, I couldn't justify getting one.
I am pretty impressed with how tightly they've integrated them, though. Much better than Intel's offerings. If they made one that had the graphics horsepower for gaming, I'd have used one.
Re: (Score:2)
and the card they chose to demo it with was their bottom-end graphics card.
Probably because you wouldn't notice a difference if you paired a tiny integrated GPU with a powerful standalone one. The added overhead may even reduce performance.
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
Tech (Score:1)
The article also notes that a lot of the tech in these is new, so older games don't necessarily take advantage of it. It would be interesting to see how this looks a year from now.
Re: (Score:2)
The article also notes that a lot of the tech in these is new, so older games don't necessarily take advantage of it. It would be interesting to see how this looks a year from now.
Game developers optimize to run best on the fastest computers out there, not slow CPUs with slow integrated graphics. AMD would have to pay them to put effort into optimizing for these things.
Re: (Score:1)
Re: (Score:1)
Good game developers make games run at reasonable performance on as many machines as possible. It sells a lot more games that way...
Re: (Score:2)
Good game developers make games run at reasonable performance on as many machines as possible. It sells a lot more games that way...
But they won't go out of their way to implement special support for crappy hardware, because every game review and benchmark will be running on a high-end machine.
Re: (Score:1)
It sounds like you were actually trying to build a low-cost high-end gaming machine. That can't be done, it doesn't work like that. Look at what GPU and CPU performance AMD A-series gives you, assess if it's enough for you, and if it is then look at the price and pick up your jaw; this is the strength of the AMD A-series, good CPU and GPU performance at an amazing price in a single package.
If you want the best then you have to pay silly money for Intel and discrete graphics boards, that's just how it is.
Re: (Score:1)
I'm shocked GPUs, especially with all this integration, haven't taken over already. The whole reason Intel bought half the industry was that it became obvious a Pentium core could be tucked into a tiny corner of a 3D graphics chip, both speed- and transistor-count-wise, as GPU development, driven by effectively infinite consumer appetite for cooler and more complex virtual worlds, would ever-more outstrip a general-purpose CPU.
Frankly, by now I was expecting a merged GPU/monster FPGA-type design with dynamic programmin
Re: (Score:1)
GPU computation hasn't taken off because the APU still uses separate address spaces for the CPU and GPU bits. It is a real bitch to copy from CPU to GPU and then back to the CPU again. Yes, even on a single chip. Insanity!
A unified address space should fix the problem, but then memory protection needs to be handled properly. Apparently coming "Soon" from AMD, but not soon enough.
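A minimal sketch of the "copy dance" the parent describes, in plain Python (the lists merely stand in for host and device allocations; real code would go through OpenCL or CUDA buffer APIs):

```python
def run_discrete(host_buf, kernel):
    """Separate address spaces: every launch pays a host->device copy
    and a device->host copy, even when CPU and GPU share one die."""
    device_buf = list(host_buf)                   # copy host -> device
    device_buf = [kernel(x) for x in device_buf]  # "GPU" computes
    return list(device_buf)                       # copy device -> host

def run_unified(host_buf, kernel):
    """Unified address space: the "GPU" works on the host's buffer in
    place, so both copies (and their latency) disappear."""
    for i, x in enumerate(host_buf):
        host_buf[i] = kernel(x)
    return host_buf
```

Both produce the same result; the difference is purely in how many times the data crosses an (here imaginary) CPU/GPU boundary.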
Re: (Score:2)
Re:Looked into it for a friend's build (Score:4, Informative)
And they have spoken a bit of using both the integrated GPU and a discrete graphics card in tandem, similar to using two GPUs in Crossfire, but they haven't released the drivers for it, nor listed which cards will work, and the card they chose to demo it with was their bottom-end graphics card.
That's not very truthy:
http://www.amd.com/us/products/technologies/dual-graphics/pages/dual-graphics.aspx#3 [amd.com]
Re: (Score:2)
That link is not very truthy. Not only does it only list a single "recommended" card, rather than a list of any that are compatible, but it also has not been updated for these new GCN-based APUs. As noted in TFA (the Anandtech one, specifically), "AMD recommends testing dual graphics solutions with their 13.350 driver build, which is due out in February."
Re: (Score:2)
Looking forwards... (Score:3)
Really looking forwards to the HSA benchmarks.
Nothing out there will tax these chips. All GPGPU code is written assuming huge latency between CPU and GPU. With shared caches these things have nanosecond latency and should be able to bring the GPU to bear on a much wider class of algorithms.
Now it's always worth shipping work to the GPU, since if the data is in the L2 cache, it's there for the GPU as well.
It will take a while before people code to this, though.
Re: (Score:2)
So you mean kind of like what the Intel chips already do?
Re: (Score:1)
Re: (Score:2)
Sort of like "Intel InstantAccess", that allows the CPU to directly access GPU memory space?
It was a driver limitation, not a hardware one
Re: (Score:3, Insightful)
Re: (Score:1, Flamebait)
So you knew how it works and you just decided to spread lies in your previous post?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
The PS4 has high-speed RAM shared between video and CPU.
The Xbox has slower desktop DDR3 for video/CPU.
The PS4 seems good, and the high-speed RAM makes it better than other on-board chips that use slower desktop RAM.
Re: (Score:2)
I am a little bit skeptical about that. I am not really sure how much it will really change things. The use case actually seems very thin to me. You need a kernel which is compute intensive and where the data transfer from memory to the core is expensive. Because if there is little data to transfer, then the overhead is small. I read some benchmarks from AMD and only a few kernels seemed to be in the sweet spot. On top of that, because of the memory architecture, I feel like raw memory-to-core bandwidth will b
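The "sweet spot" argument can be made concrete with a back-of-the-envelope model. All the throughput numbers below are made-up illustrations, not measurements of any real chip:

```python
def offload_pays_off(bytes_moved, flops, cpu_gflops, gpu_gflops, link_gbs):
    """Crude model: offloading wins only when GPU compute time plus
    transfer time beats CPU compute time. Inputs are illustrative:
    throughputs in GFLOPS, link bandwidth in GB/s."""
    cpu_time = flops / (cpu_gflops * 1e9)
    gpu_time = flops / (gpu_gflops * 1e9) + bytes_moved / (link_gbs * 1e9)
    return gpu_time < cpu_time

# Compute-heavy kernel, little data: the offload wins.
# Data-heavy kernel, little compute: the transfer dominates and it loses.
```

This is exactly the thin use case the comment describes: only kernels with a high compute-to-data ratio land in the sweet spot, and a unified memory architecture helps mainly by shrinking the transfer term.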
Re: (Score:2)
You need a kernel which is compute intensive and where the data transfer from memory to the core is expensive.
You're missing the cpu-gpu latency.
So, the integrated GPU will already work as well as any other GPGPU of similar specs. Nothing has got worse there.
There is a problem with writing GPU code in that some things work far better on the GPU and other things work far worse than a CPU. At the moment writing code which has a mixture of those is extremely hard.
Basically, this architecture will allow one to
Re: (Score:2)
You're missing the cpu-gpu latency.
The CPU-GPU latency is already very small, smaller than a millisecond, and I assume that most of the CUDA startup latency comes from configuring the kernel launch and not from the interconnect. PCI Express has very low latency; InfiniBand cards get network communication with a latency of less than 2 microseconds. That's about 6000 CPU cycles (assuming a 3GHz CPU).
If you want to gain by removing latency, you need the computation to be VERY small and frequent.
Moreover, we are very good at overlapping communication
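The cycle count quoted above is just latency times clock rate; a one-liner reproduces the figure (2 µs on an assumed 3 GHz core):

```python
def latency_in_cycles(latency_seconds, clock_hz):
    """Convert an interconnect latency into CPU clock cycles."""
    return latency_seconds * clock_hz

# 2 microseconds at 3 GHz is the ~6000 cycles quoted above.
```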
Re: (Score:2)
That is going to be the kicker. Just like VLIW it will really depend on software tools and support. AMD is supporting a lot of FOSS projects that support OpenCL. Maybe AMD needs to throw some support to WebKit and Mozilla to support their GPU compute systems.
Disappointed (Score:2)
While the GPU is good, the Kaveri CPU is slightly slower than Richland in the benchmarks - after 4 years of waiting that's a big disappointment.
Re: (Score:2)
Re: (Score:2)
In other words, why weren't most of these improvements included back in 2011?
Developing hardware is a lot different from developing software. With software you can go "oh, that now works, let's add this" or "oh, that didn't work out - how about we take that out". With hardware you can't, without going back to the start of the manufacturing process.
With hardware, a large part of the exercise is risk management - adding one feature that you can't get production-ready will kill the entire product. So most projects pick just one or two key areas to develop, the ones that will make the big
The COST difference should be mentioned. (Score:4, Informative)
It's 1:2 AMD:Intel, at the kindest level.
It's 2:3 with Radeon:Nvidia.
APU Name (Score:2)
Re: (Score:2)
In Finnish, "kaveri" means buddy. Quite a fitting name :)
And "apu" means help or assistance, or auxiliary as a prefix. For example "apuprosessori" meaning co-processor.
Who fabs this? (Score:2)
Re: (Score:1)
GloFo
Re: (Score:2)
Embedded GPU Boom (Score:3)
Re: (Score:2)
Good luck getting a 100W CPU and 300W GPU into the same package.
Well, OK, stuffing them in there won't be too hard, but cooling it will be a bastard.
Re: (Score:2)
You may be the only person on Earth that sees high power consumption as a desirable feature.
You do realise that high-end CPUs and high-end GPUs use a lot of power, yes? You do realize that putting both in a single package would use even more power, yes?
Oh, obviously not, since you've completely mis-read the point of my post.
Re: (Score:2)
When you can't use all of your transistors at the same time, you need to make sure the transistors you are using are the most efficient at getting the current work done. Enter Heterogeneous computing. The GPU is much more efficient than the CPU at certain types of work, talking a
Re: (Score:2)
You may be the only person on Earth that sees high power consumption as a desirable feature.
Not the only one. I'm currently mining for litecoin just to keep the house warm ;)
Best choice for 4 out of 5 desktop users (Score:4, Insightful)
Most of the people who decide they still need a full-sized desktop computer will be completely covered with one of the AMD A-series APUs, at a bargain price. Only the remaining 1 out of 5 users are power-users who need the highest CPU and/or GPU performance, and have to resort to expensive Intel CPUs and discrete graphics boards.
Re: (Score:2, Insightful)
Expensive Intel CPUs? Intel's Core i3 is pretty much equivalent to the AMD A10 in general-purpose CPU power. Right now, the i3-4130 is $129 on Newegg while the A10-7850K is $189. The only thing that the A10 has on the Core i3 is integrated graphics, but throw a $100 Radeon card into either of these systems, and it will run much faster than the integrated graphics on the A10. And don't forget the dual-core Haswell Pentium chips sold for under $100. A Pentium G3220 costs $69 on Newegg right now. Add a $100 Radeon HD77
Re: (Score:2)
Except the point is not to need the $100 graphics card, even if that means sacrificing some CPU performance that that segment doesn't need.
Re: (Score:3, Insightful)
As we so often talk about the death of desktops (Score:3)
Already tested by Anandtech (Score:1, Troll)
It makes for pretty sad reading, IMHO. The Kaveri APU does not seem decidedly faster than the last-generation A10. The only bright spot is that the 65W TDP A8 APU is not that much slower than the 95W A10 APU.
Am I the only one who wants a *CPU*? (Score:3)
Re: (Score:2)
Well, at least Intel is not charging a huge premium for the integrated graphics. The Core i3-4150 is only $130 and the rest of Core line use the same basic GPU.
Re: (Score:2)
Re: (Score:2)
Laptops (Score:2)
The trend away from desktops to laptops continues. Both AMD and Intel design with this in mind. With Intel, the biggest improvement in Haswell was power consumption, which on a desktop is meaningless for the most part. Both are trying to greatly improve their integrated graphics, because making laptops with dedicated cards is expensive, and you can sell more of the cheaper ones. It is easier to make one design, more or less.
If you are buying a desktop for gaming the integrated graphics are almost useless
What GCN stood for before Graphics Core Next (Score:3)
Re: (Score:2)
I love that game. Basically the only reason I bought a Gamecube and 4 controllers.
Re: (Score:1)
The legendary (for its time) Radeon 9700 Pro, the card that kept ATi from going the way of 3dFX and the other failures, was GCN-based, if by GCN you mean the Gamecube.
ArtX, a startup created by SGI refugees, created the architecture for that chip and then got bought out by ATi partway through the console's development. The architecture then became the R300, which hilariously outperformed not only the contemporary nVidia GeForce 4 series, but also nVidia's followup "GeForce FX". ATi had both the performance
Nintendo Power used GCN (Score:2)
Re:Sadly, a near total disaster for AMD (Score:5, Insightful)
Yes, it is not a very good GPU when it comes to high-end graphics, because it has about 1/3rd the FLOPS of a discrete GPU and it is memory-bandwidth starved for those workloads, but for non-graphics workloads it's perfect. It is the first of something new. How many people pissed and moaned about FPUs when they came out? "Derp, there's no software that uses them, so they must be useless." You need to have the platform before you can have the developers. Once the next-gen consoles start taking off, expect games to be nearly directly ported and taking advantage of this new GPU paradigm.
Re: (Score:2)
I've been hearing for years that AMD's Linux drivers are just around the corner. Still waiting...
Maybe SteamOS will get them off their butts, but for the time being my money is still going to nVidia.
AMD performance in Linux is 3-10 times slower than Windows in most games I've tried on Llano. I love my Llano laptop outside of gaming, but it pains me to still dual boot, whereas my desktop has been Windows-free for 6 years.
Radeon Drivers are getting better. (Score:2)
Given that I don't play latest and greatest games (there are plenty of good 3 year old games), performance is
Re: (Score:2)
> PS I am sickened- SICKENED- that AMD refused to build motherboards with the RAM soldered on, because then Kaveri could have utilised the same 256-bit GDDR5 solution found in the PS4, for no more money (save the cost of the RAM, obviously). A GDDR5 Kaveri would EXTERMINATE every competing Intel part.
Actually I'm also surprised AMD isn't doing that. Might be that the existing architecture of separate RAM is "good enough" and they don't want to pursue a tiny market where only console devs care about perf
Re: (Score:2)