Intel Challenges ARM On Power Consumption... And Ties
GhostX9 writes "Tom's Hardware just published a detailed look at the Intel Atom Z2760 in the Acer Iconia W510 and compared it to the NVIDIA Tegra 3 in the Microsoft Surface. They demonstrate how the full Windows 8 tablet outperforms the Windows RT machine in power consumption, breaking consumption down by CPU, GPU, memory controller, and display. AnandTech is also reporting similar findings, but only covers CPU and GPU utilization."
Despite repeated claims that x86 is beating ARM here, they look neck and neck. Assuming you can make a meaningful comparison.
Re: (Score:1)
An understandable oversight since it's Christmas Eve.
Re: (Score:2)
My God, I'm an insensitive clod... sorry mate.
Neck AND Neck (Score:5, Informative)
Despite repeated claims that x86 is beating ARM here, they look neck in neck.
It's neck and neck [thefreedictionary.com].
Re: (Score:3)
How about "equal"? A nice short word that is far more informative than an analogy to horse races, an event that no slashdotter has ever attended. Horses haven't been in use in a hundred years, it's time to get rid of horsey verbiage.
Re: (Score:1)
Let's get rid of that stupid "horsepower" measurement too.
Re: (Score:1)
Aussies do not pronounce it neck ain neck.
it is much closer to say it's pronounced neck 'n' neck (like rock 'n' roll)
they look neck in neck (Score:4, Informative)
It's "neck and neck" as in a pair of horses very close together at the finish line.
Sigh
Re: (Score:3, Interesting)
Re: (Score:2)
Neck in Neck seems like a more internet appropriate version. As in a series of images tucked away in a dark corner of imgur, briefly referenced on reddit before being removed by admins. Neck in Neck - "A filthy, gritty internet version of Neck and Neck."
Yes, but this just begs the question as to who we make the escape goat here on /.
Re: (Score:3)
The author must have written the summary while standing online.
Re: (Score:2)
neck 'n' neck
'neck in neck'? (Score:4, Informative)
Oh for crying out loud: Neck and Neck.
Often used when describing two racers that are nearly even in position.
Neck and Neck is advantage Intel (Score:4, Interesting)
If two processors are neck and neck in power consumption and one of them is x86, then x86 is ahead. It's got better clock speeds and it's got more software going for it than ARM. Yes, we have a lot of Android apps, but I would rather have my Windows applications than those "apps" and their private internet. Unless the tie is for a processor Intel does not produce any more, it's clearly advantage Intel.
Re: (Score:3, Interesting)
Two processors are neck and neck. One costs $120 and the other costs $20.
Which one has a brighter future?
Especially now since people don't need to run all sorts of software. They just need android.
Re: (Score:3, Informative)
Re:Neck and Neck is advantage Intel (Score:5, Insightful)
If the execs and the sales guys want their Apple devices, or Android devices for that matter, what the IT organization thinks is 100% irrelevant. I've seen this happening already in quite a few large organizations that aren't particularly famous for being early adopters of new tech. The next thing to go is the standard Windows image - corporate images are normally of such poor quality that people complain about them constantly.
Re: (Score:3)
Those won't be buying ARM, that is for sure.
Because Windows RT does not support any of these things. Only the Intel version does.
That also means that in a Bring-Your-Own-Device situation, they'll be bringing the ARM version.
This is going to be fun to watch.
Re: (Score:2)
No, it doesn't.
Why doesn't it mean x86 is ahead? Because x86 has had years of development ahead of ARM. Also because x86 uses proprietary microcode.
So having them equal means ARM is a significant benefit.
Re:Neck and Neck is advantage Intel (Score:5, Insightful)
No, it doesn't.
Why doesn't it mean x86 is ahead? Because x86 has had years of development ahead of ARM. Also because x86 uses proprietary microcode.
So having them equal means ARM is a significant benefit.
The original x86 was introduced in 1978.
The original ARM was introduced in 1985.
That is just 7 years more than the ARM, with 27 years of development since the first implementation. Plus, all of the /. crowd and other self-described "experts" have been saying for years that a neck-and-neck tie between them for power consumption would never happen. And well, it did, so obviously this is a win for the x86.
Re: (Score:2)
x86 is a pyramid of kludges; ARM is alleged to be a clean design. That the designs are so close in effectiveness indicates that there really isn't a great difference in the system-level value of the designs, at least in the tests performed.
General-purpose processor design is a heavily studied and fairly well understood body of information, and comparing 40 years of development with 30, 20, or 10 is irrelevant.
Re: (Score:2)
The ARM is supposed to be a cleaner design and the x86 is a kludge with backwards compatibility going back well over 30 years, but a tie with a processor that's a process node behind isn't too hot. The Intel is 32nm, the ARM is 28nm. The ARM should be better. They're doing the same thing.
Re: (Score:2)
it's got more software going for it than arm
This product comparison [microsoft.com] from Microsoft leads me to believe that applications have to be rewritten to behave correctly on Windows 8 Pro. Notice the blurb about downloading apps from the Microsoft store. This does not say you can download any plain old exe file. The mention of Windows 7 applications could be those that have already been rewritten to be compatible with the tablet.
If that's the case, iOS and Android apps written for the ARM architecture greatly outnumber those for x86.
Re: (Score:3)
You would be incorrect. Windows 8 Pro runs any old executable that ran on Windows 7, you don't need to recompile or anything.
Re:Neck and Neck is advantage Intel (Score:4, Insightful)
Re: (Score:1)
You're a moron. I am running Windows 8 Pro and have all my old software operating just fine without any Windows Store purchases.
Re: (Score:3)
Intel actually had to write an ARM emulator for their Android stuff because ARM has a very definitive software advantage over x86 there. Sure, there's lots of x86 desktop applications, but how many of them are usable on a tablet? On a phone? For that matter, how many of them can be used without adding the substantial cost and system resource usage of a full Windows install?
Re: (Score:2)
For Windows 8 vs Windows RT, maybe the tie goes to Intel. On Android, the tie certainly goes to ARM. But keep in mind, you're comparing a quad core ARM to a dual core x86, and this isn't even the best ARM for comparison anymore... plus, the Surface doesn't even use the faster T33 version. The Atom in question doesn't have a CPU speed advantage, but it has a huge memory bus advantage over the Tegra 3: nVidia's single 32-bit DDR3 bus versus Intel's dual 64-bit DDR3 bus. A comparison of the Nexus 10 might be mor
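To put the bus comparison in rough numbers, here is a sketch of peak theoretical DDR3 bandwidth. The 1333 MT/s transfer rate is an assumed common DDR3 speed grade, not a figure from the article; only the bus widths and channel counts come from the comment above.

```python
# Peak theoretical bandwidth = transfers/sec * bytes per transfer * channels.
def peak_bandwidth_gb_s(mt_per_s: float, bus_bits: int, channels: int) -> float:
    return mt_per_s * 1e6 * (bus_bits / 8) * channels / 1e9

tegra3 = peak_bandwidth_gb_s(1333, bus_bits=32, channels=1)  # single 32-bit bus
atom = peak_bandwidth_gb_s(1333, bus_bits=64, channels=2)    # dual 64-bit bus

print(f"Tegra 3: {tegra3:.1f} GB/s, Atom: {atom:.1f} GB/s")  # Tegra 3: 5.3 GB/s, Atom: 21.3 GB/s
```

At the same memory speed, twice the width times twice the channels is a 4x peak-bandwidth gap, which is why the bus layout matters more here than raw clock speed.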
Doesn't mean a thing (Score:3, Interesting)
Even if true (watch out for cognitive dissonance with respect to Intel power efficiency claims), it does not mean a thing if Intel cannot match the price. Currently something like $1 goes to ARM Holdings per chip. Let's see a bloated old monopolist get by on that.
Re: (Score:2, Interesting)
Nope, they get MORE than $1 a chip, which means they have more to plow back into R&D. Truthfully though, it's an interesting question, but all told, unless the price is substantially different we're not talking a big deal. If you pay $5 more for your x86 tablet you won't really care, assuming it works at least as well and happens to have the features you wanted/be the brand you like/etc.
I think the question is whether Intel will be able to push the x86 design down to EXTREME low cycles/watt levels. x86 ha
Re: (Score:3)
all told unless the price is substantially different we're not talking a big deal. If you pay $5 more for your x86 tablet you won't really care
You're in outer space. Intel can't get by on $5/tablet, they need at least $50 or they will soon need to sell their head office. There is no way Intel can compete with ARM's royalty structure while continuing to live in the manner to which they have become accustomed.
Re: (Score:2)
There's more to it than that. Embedded chips are small and cheap, and sell in great numbers. Of course a shift in the market will bring changes in everyone's business, but I'm not at all sure Intel can't still bring in tons of money. It's a complex situation, moving to a new level of commoditization.
Re: (Score:2)
Correct me if I'm wrong, but ARM may only take $1 a chip, and they're only the designer. The manufacturer must be taking a cut too, including a cut big enough to cover the manufacturing costs.
Intel are vertically integrated, so their prices include the full cost of designing and making chips. To get a comparable cost, you'd need to add the costs together for ARM and, say, Qualcomm.
Not that I'm saying your point is wrong; I've no idea what the figures are.
i said it back in september (Score:5, Insightful)
Re: (Score:2, Insightful)
Samsung will be presenting at the ISSCC on their 28nm "big-little".
http://www.eetimes.com/electronics-news/4401645/Samsung-big-little--no-Haswell--Project-Denver-at-ISSCC
>Samsung will detail a 28-nm SoC with two quad-core clusters. One cluster runs at 1.8 GHz, has a 2 MByte L2 cache and is geared for high-performance apps; the other runs at 1.2 GHz and is tuned for energy efficiency.
Need to see how it matches up to Samsung's latest 14nm proto.
http://www.eetimes.com/electronics-news/4403838/Samsung-14nm-F
Re: (Score:2)
Re: (Score:2)
14nm in planar transistors? That will leak more than Betty White's knickers.
Welcome the finfet overlords into your life.
Re: (Score:2)
FinFETs are non-planar.
Re: (Score:2)
That is exactly the point. Read it again. Leaking is generally a bad thing.
Re: (Score:2)
That's actually called: big.LITTLE :-)
Would the results be the same under Android ? (Score:5, Interesting)
First, those articles are very interesting, thanks to Intel for making them happen.
Second, it's a good thing that Intel is catching up. I'm not a great Intel fan (rooting for the underdogs and all that), but still, I'm impressed.
Third, isn't the OS choice biasing the results a bit? Would ARM fare better under a more ARM-oriented OS such as Android? Or is the power consumption profile, in the end, fully OS-independent?
No (Score:1)
Windows RT still runs a Windows subsystem.
Android's apps are really fragments of apps, the gui is a different fragment from the service (the thing that does any grunt work if needed) etc. If you don't use a gui bit, then that gui bit never loads. If a service bit is running, it's gui bit can/usually is closed.
The broadcast intents mean apps that appear to be running, actually aren't always running or even in memory. The broadcast intent fires (e.g. a minute timer, particular network events, lots of other ev
That's a lot of words, for a simple thing (Score:3, Interesting)
ARM draws 10% of the power of Atom at idle, and Android runs mostly at idle even when you're using it to do stuff, because it's designed from day one that way. Windows uses a lot more processing power, and 'idle' on Windows literally means not using it at all; even when you're not using it, the Atom is still drawing > 1W.
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Android apps are as real as Windows 8 RT / Windows Phone apps - not fragments as such, really. This is done for permission isolation and other advantages, like not crashing the entire thing if something goes awry.
You are aware that Android apps run as their own user? How about you just go suck it in a ditch.
Happy Xmas! (RT and WP still blow more than Android, though!)
Re: (Score:2)
Android's apps are really fragments of apps, the gui is a different fragment from the service (the thing that does any grunt work if needed) etc. If you don't use a gui bit, then that gui bit never loads. If a service bit is running, it's gui bit can/usually is closed.
The broadcast intents mean apps that appear to be running, actually aren't always running or even in memory. The broadcast intent fires (e.g. a minute timer, particular network events, lots of other events...), wakes up the bit of code to handle it, executes, then returns, ending the fragment if necessary.
Apps can be killed at any time, and are designed that way. Hence code is already written to handle it.
Widgets on Android aren't anything, just bitmaps, if the widget changes, it can be because an intent fired, the tiny bit of code needed to redraw the fragment was loaded, executed then discarded. They're not code constantly running.
Apps are memory constrained on Android; on Windows they can grow beyond RAM, which unfortunately means paging to disk or flash. You can see why Android keeps the memory usage of apps down to a minimum given this limit, but paging is no longer a cheap fix when the backing store is flash, since writing to flash eats battery.
All of the above is also true for Windows Store apps - which, to remind you, are the only thing you can run on RT tablets, aside from Explorer, desktop IE, and Office. The whole point of that Metro thingy was to come up with not just a UI, but a whole application model that works well on mobile devices - meaning good battery life. To do that, it borrowed a lot of ideas and techniques from iOS and Android, including app lifecycle management.
And yes, it does actually work. My Asus VivoTab RT has battery life jus
Re: (Score:2)
Re: (Score:2)
What is the price difference?
Beyond that... I think the Surface Pro type devices will win the day, but only because Intel is moving to a super-efficient design and you might as well get a full Windows experience if you can... but again, only if the prices are close.
Chromebook vs Chromebook (Score:2)
Re: (Score:2)
Power consumption certainly does depend quite a bit on the OS, but I think even more on drivers. For example, MacBooks run longer on OS X than on Windows, but similar-spec laptops from other manufacturers running Windows match or outperform MacBooks running OS X. I doubt Apple puts much effort into their Windows drivers, whereas everyone else highly optimizes their systems for it.
Windows RT is very new, so this probably isn't a fair comparison at this point. Maybe a few years down the line, when it is more mature, a fairer comparison c
technology node (Score:5, Insightful)
So the real question is what do most tablets spend the majority of their time doing? Running a benchmark at full
Re:technology node (Score:5, Insightful)
Also rather hard to hate on Intel for it (Score:5, Informative)
Why are they a node ahead all the time? Because they spend billions on R&D. When the downturn hit, everyone in the fab business cut R&D except Intel. So now they have a 22nm fab that has been running for a while, another that just came fully online, and two 14nm fabs that'll be done soon (one on 450mm wafers).
They do precisely what geeks harp on companies to do: Invest money in R&D, invest in tech. They also don't outsource production, they own their own fabs and make their own chips. Most of them are even in the United States (8 of the 11).
The payoff is that they are ahead of people in terms of node size, and that their yields tend to be good (because the designers and fab people can work closely).
If other companies don't like it, well the only option is to throw in heavy on the R&D front. In ARM's case being not only fabless but actually chipless, just licensing cores to other companies, they can't do that. They are at the mercy of Samsung, TSMC, Global Foundries, and so on.
Re: (Score:2)
And even though they're way in front technology wise, they keep pissing everybody off with artificial market segmentation. Why?
Re: (Score:3)
They also don't outsource production, they own their own fabs and make their own chips
Outsourcing production is not necessarily a bad thing, as it allows specialisation. Intel can afford it because they are a big player, but for other companies it makes sense to share the fab R&D costs with others, including with their competitors. They then compete based on their strengths (chip design), and the manufacturers compete based on their process technology.
Re: (Score:2)
The second battle to watch is the upcoming server CPU/SOC
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Of course this wasn't an apples for apples comparison, there was no iPad ;-)
Re: (Score:2)
The comparisons that matter are dollars, watts, and performance benchmarks. If the processors are a similar price with similar power consumption, but one has much better performance, you have a winner. Same goes for the other variations.
That one of the competitors is made of magic pixie dust is neither here nor there to the consumer. If Intel have achieved a victory by using a more advanced technology, then more power to them; it's hardly "cheating" the comparison.
Reason: crappy NVidia GPU (Score:2, Insightful)
Example numbers: ARM CPU 0.0038 W vs. Atom 0.02 W.
NVidia GPU 0.21 W vs. Imagination 0.11 W.
The part that wins isn't from Intel, it is also available for ARM, and it is probably the part that would lose badly in any benchmark.
Yay for biased benchmarking.
So far Intel wins by undersizing the GPU.
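Taking the component figures quoted above at face value, the GPU dwarfs the CPU in both platforms, so the GPU choice decides the total. A quick sum of the two pairings:

```python
# Component figures quoted in the comment above, in watts.
arm_cpu, atom_cpu = 0.0038, 0.02
nvidia_gpu, imgtec_gpu = 0.21, 0.11

tegra_pairing = arm_cpu + nvidia_gpu    # ARM CPU + NVidia GPU
atom_pairing = atom_cpu + imgtec_gpu    # Atom CPU + Imagination GPU

print(f"ARM + NVidia:       {tegra_pairing:.4f} W")
print(f"Atom + Imagination: {atom_pairing:.4f} W")
```

Even though the ARM CPU draws roughly a fifth of the Atom's power, its pairing loses overall because the GPU term is 10 to 50 times larger than either CPU term.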
A tie means Intel loses (Score:5, Insightful)
I have said it before [slashdot.org]: with ARM, you can choose from multiple, competing chip vendors, or you can license the ARM technology yourself and make your own chips if you are big enough; with x86, you would be chaining yourself to Intel and hoping they treat you well. So, if low-power x86 is neck and neck with ARM, that's not good enough.
Intel is used to high margins on CPUs, much higher than ARM chip makers collect. Intel won't want to give up on collecting those high margins. If Intel can get the market hooked on their chips, they will then ratchet up the margins just as high as they think they can.
The companies making mobile products know this, and will not lightly tie themselves to Intel. So long as ARM is viable, Intel is fighting an uphill battle.
Re: (Score:2)
Re: (Score:2)
It shouldn't be an issue in this day and age.
Re: (Score:2)
Re:A tie means Intel loses (Score:5, Insightful)
Actually, all those iOS apps already run on Intel; the Xcode simulator runs Intel code, not ARM code. Android also runs on Intel, but I believe most apps are emulated during development, so they might need slightly more tweaking than an iOS app to get running on Intel.
Re: (Score:2)
Smart guy, too bad you can't read.
Re: (Score:2)
Re: (Score:2)
Android NDK already builds for Intel these days - it's been that way for about a year now, I think? Ever since Intel started its mobile push with Medfield, which found its way into a bunch of Android smartphones.
Re: (Score:1)
Native ARM apps can run on Intel Android thanks to libhoudini. Which is actually really good performance-wise - it is probably JITing ARM code to Intel. Unfortunately it is only legally usable on Medfield chips.
http://grokbase.com/p/gg/android-x86/12a35ssv8e/commercial-application-testing [grokbase.com]
Poor comparison (Score:5, Interesting)
Interesting that they are not comparing to a *modern* ARM chip (Cortex-A15), like the Exynos 5 (5250) or even a Qualcomm Krait S4 (perhaps MSM8960).
So the news is that Intel has mostly caught up to an old ARM based chip based on designs/specs years older still and only running under MS-Windows. Yawn....
Re: (Score:3)
A15 is much more power-efficient than A8 (and A9, which was the one actually being compared). It uses more power, but it provides higher performance per watt.
Comparing two CPUs and saying that one is more power efficient than the other because it uses less power is meaningless, otherwise the old 8086 kicks the new Atom's ass in "power efficiency".
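The efficiency point above can be made concrete with made-up numbers (the scores and wattages below are purely illustrative, not measurements of any real core): a chip that draws more power can still be the more efficient one if it finishes proportionally more work.

```python
# Illustrative figures only: benchmark score per watt of draw.
def perf_per_watt(score: float, watts: float) -> float:
    return score / watts

older_core = perf_per_watt(score=1000, watts=0.5)  # lower power, lower score
newer_core = perf_per_watt(score=2500, watts=1.0)  # higher power, higher score

print(older_core, newer_core)  # 2000.0 2500.0
```

The newer core draws twice the power yet comes out ahead on performance per watt, which is why comparing raw wattage alone is meaningless.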
Re: (Score:2)
Sorry to break this to you, but both chips I mentioned are shipping hardware.
I have had the Nexus 10 for several weeks and that is running the Exynos 5. http://en.wikipedia.org/wiki/Nexus_10 [wikipedia.org]
And I have had the Evo LTE for what, six months? And that has the Krait S4 (like over a dozen other major device models out there).
Check under the hood (Score:2)
Intel GPUs more open prospect than ARM (Score:5, Insightful)
One area in which Intel is significantly more open than any manufacturer in the ARM ecosystem is in graphics hardware. Although Intel hasn't opened all their GPUs fully yet (from what I've read), this seems to be mostly because providing all the documentation takes time, not because they are against making everything open.
This contrasts dramatically with every single ARM licensee in existence. ARM's own Mali GPU is tightly closed (probably because Mali was a licensed technology), so the Lima team is having to reverse engineer a Linux driver. All the ARM licensees who provide GPUs seem to be either unable to open their GPU information because their GPU core has been licensed from a third party, or simply uninterested in doing so, or vehemently opposed to it for alleged commercial reasons in at least a couple of cases. So the prospect of open documentation on SoC GPUs appearing from ARM manufacturers is vanishingly small.
This gives Intel at least one possible opening through which they can be fairly certain that the competition will not follow. Although that may be worth a lot to us in this community, the commercial payback from community support tends to be very slow in coming. Still, it's something that Intel might consider an advantage worth seizing in the mobile race where they're a rank outsider.
Re: (Score:2)
You either don't know what you're talking about or are just plainly trolling. The GPU specs [intellinuxgraphics.org] that Intel opened are for the Core HD Graphics series, which is Intel's own GPU technology and in no way related to ImgTec's PowerVR.
I am looking forward though to the real competition between ARM's latest and greatest with Intel's upcoming Haswell.
Re: (Score:2)
Intel don't use their own graphics tech in these Atoms. Instead they license it from ImgTec (PowerVR).
Re: (Score:1)
Intel don't use their own graphics tech in these Atoms. Instead they license it from ImgTec (PowerVR).
Not for long [phoronix.com]
Re: (Score:1)
I can't wait for an Atom that pairs the new Atari Dumbledore CPU core and the new Porkslope Turkeyhandle GPU.
Apples and Oranges sometimes (Score:3)
So I think this is a case where you really have to look at the significantly broken-down performance results to see if your use case fits one chip better than the other. A normal consumer example would be if your OS is encrypting your file system using these cool Intel instructions; I suspect it would then be a night-and-day difference in battery drain. But the drag is that you probably have to buy a device with each chip, set up your standard configuration, and then test it out. This is generally only something an IT person about to provision a department might be expected to do.
I guess that the overall benchmark is all we really have to go by which really doesn't tell the whole story.
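The "night and day" intuition above is essentially energy = power x time: a core with a dedicated crypto instruction finishes the work sooner, so it can spend less total energy even while drawing more power during the burst. The wattages and durations below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Energy consumed = average power draw * time spent busy.
def energy_joules(watts: float, seconds: float) -> float:
    return watts * seconds

software_crypto = energy_joules(watts=0.5, seconds=10.0)  # slow, lower power
hardware_crypto = energy_joules(watts=1.2, seconds=1.0)   # fast burst, higher power

print(software_crypto, hardware_crypto)  # 5.0 1.2
```

This "race to idle" effect is why a per-task energy comparison can favor the chip that looks worse on an instantaneous power meter.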
Re:Apples and Oranges sometimes (Score:4, Interesting)
One thing to keep in mind is that the ARM is much more general purpose while the Intel chips tend to have a more complex assembly instruction set. So for adding one number to another (x=y+z) I suspect the simpler ARM architecture is going to win on power consumption. But many Intel chips have assembly instructions specifically for crazy things like AES encryption. This is used as the basis of many encryption protocols, hashing, and random number generation. So if a machine is basically serving up all encrypted data then it is possible that an Intel chip will be much faster and consume much less power while performing these operations.
Not really important. The Intel chips convert assembly instructions into microcode - how they implement them internally (either dedicated hardware or reusing existing silicon) is up to them. You can't make a blanket statement like that unless Intel has specifically stated that hardware support is included. But in general, the Atom series trims as much off the CPU core as possible, so don't be surprised if hardware support for some of those exotic instructions is lacking. And many ARM cores include instructions that are just as interesting - mostly for the embedded DSP market. A manufacturer with the appropriate license can include whatever instructions and dedicated hardware they want.
What likely matters more than the instructions is the included memory and cache. Intel likely includes a larger cache, which will drive up the price. Cache is usually static and has a very low power draw when not in use. By including a large cache, Intel can minimize expensive requests to memory. Also note that DIMMs have a significant constant current draw. Low-power DIMMs are available but more expensive. You can bet that Intel used the latest and greatest for their demo while others might opt for the cheaper and slightly more power-hungry DIMMs.
This demo shows how having a process one step more advanced than the competition's can make a big difference w.r.t. power consumption. But newer ARMs will be available soon - I believe Samsung is scheduled to roll out 28nm in the near future. Intel still has a long way to go to convince manufacturers that they should pay more for what ARM can do for less.
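The cache-versus-memory energy argument can be illustrated with assumed per-access costs. The picojoule figures below are hypothetical placeholders, not measurements; the only claim is the qualitative one, that an on-die SRAM hit is far cheaper than a trip out to DRAM, so a larger cache (higher hit rate) cuts total memory-system energy.

```python
# Assumed per-access energy costs in picojoules - illustrative, not measured.
CACHE_PJ = 10    # on-die SRAM cache hit
DRAM_PJ = 640    # external DRAM access

def memory_energy_pj(accesses: int, hit_rate: float) -> float:
    hits = accesses * hit_rate
    misses = accesses - hits
    return hits * CACHE_PJ + misses * DRAM_PJ

small_cache = memory_energy_pj(1_000_000, hit_rate=0.90)
large_cache = memory_energy_pj(1_000_000, hit_rate=0.98)
print(small_cache, large_cache)  # the larger cache makes far fewer DRAM trips
```

With these numbers, nudging the hit rate from 90% to 98% cuts memory-system energy by roughly two thirds, because the expensive DRAM term dominates the total.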
DIMMs? (Score:2)
You must mean RAM chips, and even those are often on-chip in these SoC systems. The main thing here is price point, and since Intel is the only manufacturer and uses a very expensive fab at 32nm, their system is far more expensive to buy than a "generic" 40nm-fab ARM chip. You are right that the Intel is, under the hood, just as RISC as the ARM chip is. The point seems to be that by using a more expensive, smaller fab, Intel can sort-of offset the extra power required for the on-the-fly translation of x86 in
Re: (Score:2)
"You must mean RAM chips and even those are often on-chip on these SoC systems."
Nope. A DRAM of any significant capacity (256MB or better) has a similar die size to a SoC chip. An SoC will usually have some RAM on-board for buffers, cache, maybe low-end graphics support but the main memory in tablets, phones etc. resides on separate DRAM chips. A typical 2Gb DDR3 die is about 30 sq. mm whereas the Tegra 3 with 5 cores and over a MB of cache is 80 sq. mm.
Devices like the Raspberry Pi use package-on-pack
Re: (Score:2)
except Intel also will be releasing 14nm in 2013. sorry ARM, you lose.
Re: (Score:2)
But many Intel chips have assembly instructions specifically for crazy things like AES encryption.
You picked a pretty poor example, as ARMv8 includes instructions for AES. You should also look at the NEON instruction set on ARM, which has a number of fairly complex floating point operations. The advantage of the microcode on an x86 chip is greater instruction density, meaning less instruction cache usage, so you can have less instruction cache, which means less power consumption. The disadvantage is that you have a significantly more complex instruction decoder, which means more power consumption. T
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Intel chips are nothing more than dressed up RISC processors. The high level CISC instructions are converted into RISC micro-ops before execution. Similarly, no one in their right mind would call ARMv7/ARMv8 "reduced"
Comparing two Windows tablets (Score:2, Funny)
Re: (Score:2)
Yup. That same shootout with Android would be way more interesting.
Biased review ! (Score:1)
They are comparing old ARM vs. new Intel!
Duh? (Score:1)
Tom's Hardware was bought outright by Intel years ago, and has since written only glowing reviews of trending Intel products. What else did you expect to come out of that now-defunct propaganda machine?
Embraced, extended... (Score:2)
Re:Are either of these processor relevant? (Score:5, Informative)
http://www.tomshardware.com/reviews/snapdragon-s4-pro-apq8064-msm8960t,3291-4.html [tomshardware.com]
Atom isn't here, perhaps because it is too new, but it's clear from this graph that at least Tom's Hardware seems to agree that the Snapdragon eats Tegra's lunch.
I have a Nexus 4 (Snapdragon S4) and a Nexus 7 (Tegra 3), and the 4 is WAY, WAY faster than the 7 in almost every experience.
On the Nexus 4 I can leave a movie playing in the background and keep listening to it while I check an important email that just came in or make a move in a game of Words with my wife. Attempting the exact same thing on the Nexus 7 results in the movie skipping and the user experience slowing to a crawl.
Perhaps there are some significant architecture differences between the two, but at least from a real-world user experience standpoint, I would not characterize the OP's assertion as "random conjecture" at all.
Re: (Score:3)
That's probably a combination of the piss-poor GPU on the Tegra 3 (barely good enough to render one thing at a time, and you expect stutter-free multitasking?) and the pathetic memory bandwidth (DDR3, but only a 32-bit bus).
Snapdragon S4 has neither of these issues!
Re: (Score:2)
The Tegra GPU eats the S4 GPU. Don't make uninformed claims. The difference is that the Tegra, like its predecessor, needs GPU code to turn it on, because it's such a massive generator of heat that all cores except one are disabled during normal use (it has 16 unified shader cores, IIRC). No benchmark maker has yet paid Nvidia the fees for the Nvidia SDK needed to make a Tegra benchmark. I doubt it's cheap.
Now go find a Tegra HD game and gawp.
Re:I wish the would concentrate on giving more spe (Score:4, Interesting)
I don't give a flying crap how much juice it sucks, just give me a 75-gigahertz CPU and a damn drive that can keep up.
Oh, and make it AMD prices, not Intel.
I suspect you're in the minority here (in wanting performance regardless of power consumption). For me, desktop (and laptop) processors became fast enough about 5 years ago, probably more. The laptop I'm using now is about 5 years old and any performance problems it has aren't CPU related. A hard drive that can keep up with my 1.8GHz CPU would be nice - something that could keep your proposed 75GHz chip fed without waiting would be just a little awesome :)
Re: (Score:1)
- that die area is wasted on circuits translating x86 to internal RISC machine
If you look at a die photo most of the area is cache - the actual CPU is a small fraction of that. And the translation circuits are a small fraction of that.