
NVIDIA Unveils Lineup of GeForce 800M Series Mobile GPUs, Many With Maxwell

samzenpus posted about 8 months ago | from the brand-new dept.

Hardware 83

MojoKid writes "The power efficiency of NVIDIA's Maxwell architecture makes it ideal for mobile applications, so today's announcement by NVIDIA of a new top-to-bottom line-up of mobile GPUs—most of them featuring the Maxwell architecture—should come as no surprise. Though a couple of Kepler and even Fermi-based GPUs still exist in NVIDIA's new line-up, the heart of the product stack leverages Maxwell. The entry-level parts in the GeForce 800M series consist of the GeForce GT 820M, 830M, and 840M. The 820M is a Fermi-based GPU, but the 830M and 840M are new chips that leverage Maxwell. The meat of the GeForce GTX 800M series consists of Kepler-based GPUs, though Maxwell is employed in the more mainstream parts. NVIDIA is claiming the GeForce GTX 880M will be the fastest mobile GPU available, but the entire GTX line-up will offer significantly higher performance than any integrated graphics solution. The GeForce GTX 860M and 850M are essentially identical to the desktop GeForce GTX 750 Ti, save for different frequencies and memory configurations. There are a number of notebooks featuring NVIDIA's GeForce 800M series GPUs coming down the pipeline from companies like Alienware, Asus, Gigabyte, Lenovo, MSI and Razer, though others are sure to follow suit. Some of the machines will be available immediately."
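
For quick reference, here is the architecture split exactly as the summary above describes it, expressed as a small Python mapping. This is only a sketch of what the summary states; the GTX 880M entry is inferred from "the meat of the GTX 800M series is Kepler-based", and parts not named in the summary are omitted.

    # Architecture map for the GeForce 800M parts named in the summary above.
    GEFORCE_800M_ARCH = {
        "GT 820M":  "Fermi",
        "GT 830M":  "Maxwell",
        "GT 840M":  "Maxwell",
        "GTX 850M": "Maxwell",   # described as close to the desktop GTX 750 Ti
        "GTX 860M": "Maxwell",   # ditto, different clocks/memory configuration
        "GTX 880M": "Kepler",    # inferred: the upper GTX parts stay on Kepler
    }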

W0000000T1!!!!! (0)

Anonymous Coward | about 8 months ago | (#46469379)

Just in time for Titanfall

Re:W0000000T1!!!!! (0)

Anonymous Coward | about 8 months ago | (#46470475)

Aww man. Right after I buy a laptop with a 770M.

Altcoin mining (-1)

Anonymous Coward | about 8 months ago | (#46469417)

I've been using the GTX 750 Ti for scrypt-based mining. ~280KH/s out of the box at ~60watts is fantastic.
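
A quick back-of-the-envelope check of those numbers (a sketch only; the 280 KH/s and 60 W figures are the ones quoted above, not measured here):

    # Rough scrypt-mining efficiency from the figures quoted above.
    hash_rate_khs = 280.0   # ~KH/s reported for a stock GTX 750 Ti
    power_watts = 60.0      # ~board power reported

    efficiency = hash_rate_khs / power_watts
    print(f"~{efficiency:.1f} KH/s per watt")   # ~4.7 KH/s/W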

How to Falsify Evolution (-1)

Anonymous Coward | about 8 months ago | (#46469463)

Any theory that does not provide a method to falsify and validate its claims is a useless theory.

Example: if someone said a watermelon is blue on the inside, but turns red when you cut it open, how could you prove them wrong? How could they prove they're right?

You couldn't and they can't. There is no method available to confirm or disprove what was said about the watermelon. Therefore we can dismiss the theory of the blue interior of watermelons as being pure speculation and guess work, not science. You can not say something is true without demonstrating how it is not false, and you can not say something is not true without demonstrating how it is false. Any theory that can not explain how to both validate and falsify its claims in this manner can not be taken seriously. If one could demonstrate clearly that the watermelon appears to indeed be blue inside, without being able to demonstrate what colors it is not, we still have no absolute confirmation of its color. That is to say asserting something is the way it is, without being able to assert what it is not, is a useless claim. Therefore, in order for any theory to be confirmed to be true, it must be shown how to both validate and falsify its claims. It is circular reasoning to be able to validate something, without saying how to falsify it, or vice versa. This is the nature of verification and falsification. Both must be clearly demonstrated in order for a theory to be confirmed to be true or false. Something can not be proven to be true without showing that it is not false, and something can not be proven to be not true, unless it can be proven to be false.

Unfortunately, Darwin never properly demonstrated how to falsify his theory, which means evolution has not properly been proven, since it has never been demonstrated what the evidence does not suggest. In the event that evolution is not true, there should be a clear and defined method of reasoning to prove such by demonstrating through evidence that one could not possibly make any alternative conclusions based on said evidence. It is for this reason we must be extremely skeptical of how the evidence has been used to support evolution for lack of proper method of falsification, especially when the actual evidence directly contradicts the theory. If it can be demonstrated how to properly falsify evolution, regardless if evolution is true or not, only then can evolution ever be proven or disproved.

It will now be demonstrated that Darwin never told us how to properly falsify evolution, which will also show why no one can claim to have disproved or proven the theory, until now. It must be able to be demonstrated that if evolution were false, how to go about proving that, and while Darwin indeed made a few statements on this issue, his statements were not adequate or honest. In order to show Darwin's own falsification ideas are inadequate, rather than discussing them and disproving them individually, all that needs to be done is demonstrate a proper falsification argument for evolution theory. That is to say if the following falsification is valid, and can not show evolution to be false, then evolution theory would be proven true by way of deductive reasoning. That is the essence of falsification; if it can be shown that something is not false, it must therefore be true.

So the following falsification method must be the perfect counter to Darwin's validation method, and would therefore prove evolution to be true in the event this falsification method can not show evolution to be false. As said before; if something is not false, it must therefore be true. This would confirm the accuracy of this falsification method, which all theories must have, and show that Darwin did not properly show how evolution could be falsified, in the event that evolution was not true. In order to show evolution is not false (thereby proving it to be true), we must be able to show how it would be false, if it were. Without being able to falsify evolution in this manner, you can not validate it either. If something can not be shown to be false, yet it is said to be true, this is circular reasoning, since you have no way of confirming this conclusion. Example; If we told a blind person our car is red, and they agreed we were telling the truth, the blind person could not tell another blind person accurate information regarding the true color of the car. While he has evidence that the car is red by way of personal testimony, he has no way of confirming if this is true or false, since he might have been lied to, regardless if he was or not.

So one must demonstrate a method to prove beyond any doubt that in the event that evolution is not true, it can be shown to be such. To say evolution is true, without a way to show it is false, means evolution has never been proven to be true. If evolution be true, and this method of falsification be valid, then by demonstrating the falsification method to be unable to disprove evolution, we would confirm evolution to be right. Alternatively, if the falsification method is valid and demonstrates that Darwin's validation method does not prove evolution, then evolution is false indeed.

Firstly, the hypothesis. If evolution is incorrect, then it can be demonstrated to be so by using both living and dead plants and animals. The following is the way to do so and the logical alternative to the theory. The fossil record can be used as well, but not as evolution theory would have us believe. In order to properly falsify something, all biases must be removed, since assuming something is correct without knowing how to prove it's false is akin to the blind person who can not confirm the color of someone's car. Since evolution has not correctly been shown how to be falsified, as will be demonstrated, we must be open to other possibilities by way of logic, and ultimately reject evolution by way of evidence, should the evidence lead us in such a direction.

If evolution be not true, the only explanation for the appearance of varied life on the planet is intelligent design. This would predict that all life since the initial creation has been in a state of entropy since their initial creation, which is the opposite of evolution. If this be true, then animals and plants are not increasing in genetic complexity or new traits as evolution theory would have us believe, but are in fact losing information. This would explain why humans no longer have room for their wisdom teeth and why the human appendix is decreasing in functionality. The only objection to this claim that evolution theory would propose is that evolution does not always increase the genetic complexity and traits of an organism, but rather, sometimes decreases them as well. This objection is only made because we have only ever actually observed entropy in living creatures, which suits the creation model far better than evolution, which shall be demonstrated.

If the creation model is true, we can make verifiable predictions that disprove evolution. For example; the creation model states that life was created diversified to begin with, with distinct "kinds" of animals, by a supernatural Creator that did not evolve Himself, but rather always existed. Without going into the debate on how such a being is possible to exist, it must be said that either everything came from nothing, or something always existed. To those who say the universe always existed; the claim of this hypothesis is that the Creator always existed, which is equally as viable for the previous logic.

In order to demonstrate that the Creator is responsible for life and created life diversified to begin with, the word "kind" must be defined. A kind is the original prototype of any ancestral line; that is to say if God created two lions, and two cheetahs, these are distinct kinds. In this scenario, these two cats do not share a common ancestor, as they were created separately, and therefore are not the same kind despite similar appearance and design. If this is the case, evolution theory is guilty of using homogeneous structures as evidence of common ancestry, and then using homogeneous structures to prove common ancestry; this is circular reasoning!

The idea of kinds is in direct contrast to evolution theory which says all cats share a common ancestor, which the creation model does not hold to be true. If evolution theory is true, the word kind is a superficial label that does not exist, because beyond our classifications, there would be no clear identifiable division among animals or plants, since all plants and animals would therefore share a common ancestor. The word kind can only be applied in the context of the creation model, but can not be dismissed as impossible due to the evolutionary bias, simply because evolution has not been properly validated nor can it be held to be true until it can correctly be shown to be impossible to falsify.

One must look at the evidence without bias and conclude based on contemporary evidence (not speculation) if indeed evolution is the cause of the diversity of species, or not. It must also be demonstrated if the clear and distinct species do or do not share a common ancestor with each other, regardless that they may appear to be of the same family or design. In order to verify this, all that needs to be done is to demonstrate that a lion and cheetah do or do not have a common ancestor; if it can be demonstrated that any animal or plant within a family (cats in this case) do not share a common ancestor with each other, this would disprove evolution immediately and prove supernatural creation of kinds.

However, since lions and cheetahs are both clearly of the same family or design, and can potentially interbreed, we must be careful not to overlook the possibility of a very recent common ancestor. If such is the case, this does not exclude the possibility that the two are originally from two separate kinds that do not share a common ancestor previous to them having one. It is therefore necessary to build an ancestral history based on verifiable evidence (not homogeneous structures in the fossil record) that can clearly demonstrate where exactly the cheetah and the lion had a common ancestor. If no such common ancestor can be found and confirmed without bias, and this test is performed between two or more of any plant or animal life without ever finding anything to the contrary, we can confirm with certainty evolution did not happen, and that kinds do exist.

In the event that fossils are too elusive (compounded with the fact that they can not be used as evidence of common descent due to circular reasoning e.g. homogeneous structures), then there is a superior and far more effective way to falsify evolution. Evolution states by addition of new traits (new organs, new anatomy) that the first lifeforms increased in complexity and size by introduction of new traits, slowly increasing step by step to more complex life forms. Notice that the addition of such traits can not be attributed to the alteration of old ones, for obvious reasons, since detrimental or beneficial mutations are only alterations of already existing traits, and can not account for an increase in the number of traits any given life form possesses.

That means a bacteria becoming able to digest nylon is a mere mutation of already existing digestive capabilities, and can not be classified as an increase in traits. Evolution theory would predict that the process of gradual change and increase in traits is an ongoing process, and therefore should be observable in today's living animals and plants through new emerging traits that any given plant or animal did not possess in its ancestry. Those who say such changes take millions of years and can not be observed today only say so because no such trait has ever been observed to emerge or be in the process of emerging in contemporary history, which is what the creation model predicts. If evolution theory be true, we would expect that at least one animal or plant would contain a new trait or be in the process of growing such a trait over its known common ancestors (that is not simply a multiplication or alteration of a trait it already had).

At this point, the fossil record can not be used as evidence to prove that evolution can produce new traits due to the fact that two animals that appear to be of the same family (T-rex and Brontosaurus, dinosaurs), while they do indeed exhibit distinct trait differences, may not have a common ancestor, but rather were created differently with all their different traits. It is therefore of paramount importance to show a single instance of such an increase of traits exists within a provable ancestry (stress provable) in contemporary times, and not assume anything concerning where the traits in the fossil record owe their origin. If it can not be shown that any animal or plant living today (or very recently deceased) exhibits any trait variance that can clearly and thoroughly be proven to be a new addition over its (stress) provable ancestors, compounded with the reasoning that two similar animals (such as a penguin and a woodpecker) do not necessarily or provably share a common ancestor, then evolution is clearly absent entirely, and supernatural intelligent design and creation is thereby proven beyond all reasonable doubt.

In conclusion, should any two animals or plants within a family (a palm tree and a coconut tree) be proven to not share a common ancestor, or if no provable increase of traits can be demonstrated to be in its beginnings or actively present in the animals and plants living today over their provable ancestry, then The Bible is correct when it says God created all the animals and plants as distinct kinds with their traits to begin with. This is the only way to falsify evolution, and it is amazing (and convenient) that Darwin never encouraged people to attempt to falsify his theory in this manner.

Re:How to Falsify Evolution (1)

viperidaenz (2515578) | about 8 months ago | (#46469543)

I invented the colour Orange. Prove me wrong.

Re:How to Falsify Evolution (0)

Anonymous Coward | about 8 months ago | (#46469571)

I have the color (and colour) orange patented, so actually it's up to you to prove that you invented it first.

Re:How to Falsify Evolution (0)

Anonymous Coward | about 8 months ago | (#46469587)

I invented the colour Orange. Prove me wrong.

Can I blame you for the overpriced mobile phone service company by the same name?

Re:How to Falsify Evolution (1)

sexconker (1179573) | about 8 months ago | (#46469617)

I invented the colour Orange. Prove me wrong.

The color "orange" used to be called red-yellow. The color is named after the fruit.
The fact that you claim to have invented the color yet refer to it by its adopted name instead of the original proves that you're a liar who has nothing to do with the color's invention/naming/use.

Re:How to Falsify Evolution (1)

viperidaenz (2515578) | about 8 months ago | (#46469829)

I refer to it by the name most recognised in today's society.

Re: How to Falsify Evolution (1)

Zero__Kelvin (151819) | about 8 months ago | (#46472769)

The color Orange is defined by a frequency range. One can allocate and name a frequency range, but never invent it. Since said color is the vibration of reflected light within that range, and light existed well before you did, you could not possibly have invented it.
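
As a rough illustration of "defined by a frequency range" (a sketch only; the ~590-620 nm wavelength band used here for orange is a commonly cited approximation, not a formal standard):

    # Convert an approximate wavelength band for orange light to frequency.
    C = 299_792_458            # speed of light, m/s
    ORANGE_NM = (590, 620)     # assumed wavelength band for "orange", in nanometres

    for nm in ORANGE_NM:
        freq_thz = C / (nm * 1e-9) / 1e12
        print(f"{nm} nm  ->  {freq_thz:.0f} THz")
    # ~508 THz down to ~484 THz: a band of the spectrum, not something one invents.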

Re: How to Falsify Evolution (1)

viperidaenz (2515578) | about 8 months ago | (#46475491)

Light never existed in the specific wavelength of orange until I commanded it.
There is no evidence to suggest otherwise so it must be true.

Re: How to Falsify Evolution (1)

Zero__Kelvin (151819) | about 8 months ago | (#46475729)

Dear Moron,
You cannot invent that which pre-exists you ..

Re: How to Falsify Evolution (1)

viperidaenz (2515578) | about 8 months ago | (#46476649)

Where is your evidence that it existed before me?
You have no evidence that light of that wavelength existed yesterday.

Re: How to Falsify Evolution (0)

Zero__Kelvin (151819) | about 8 months ago | (#46484107)

Shut the fuck up you ignorant moron.

Re:How to Falsify Evolution (0, Troll)

benjfowler (239527) | about 8 months ago | (#46469555)

Slashdot needs a "-1 Stupid" moderation.

Re:How to Falsify Evolution (0)

Anonymous Coward | about 8 months ago | (#46470995)

Well, this time you only get "-1 Troll".

*cheerful Santa Claus belly laugh*

Re:How to Falsify Evolution (0)

MrL0G1C (867445) | about 8 months ago | (#46469615)

To prove evolution false, look for but never find any proof of genetic mutations.

Oops, we found them, many times. Antibiotic-resistant mutations of bacteria that weren't resistant before are probably the best example of evolution and Darwinism.

Re:How to Falsify Evolution (1)

noh8rz10 (2716597) | about 8 months ago | (#46469653)

tl, i actually skimmed it a bit. the only thing I could grab on to is this:

If evolution be not true, the only explanation for the appearance of varied life on the planet is intelligent design.

This is a logical leap, and creates a flaw in the remainder of the post. If the only two choices are evolution or ID, then an argument against evolution is an argument for ID (which is what the rest of the words are about I think). but why can't there be other potential theories? I'm sure the world has thought of hundreds.

whatevs, not a good use of time.

Re:How to Falsify Evolution (1)

HornWumpus (783565) | about 8 months ago | (#46469809)

Don't encourage the bastard.

Re:How to Falsify Evolution (0)

Anonymous Coward | about 8 months ago | (#46470119)

You, AC, are so fucking retarded. You are a disgrace to YHWH.

Re:How to Falsify Evolution (0)

Anonymous Coward | about 8 months ago | (#46470617)

But be this comment not true then retardedness as it is unproven must therefore be a lie, as be it a truthful statement a lie it cannot be. As be it not true therefore and thus it has been shown that retarded this AC probably is not, because it is not proven as it will be shown.

Thus we have shown that retarded AC that speaks in therefores and be it trues is retarded. And can't write for shit.

Worse than me even.

Re:How to Falsify Evolution (2)

ArcadeMan (2766669) | about 8 months ago | (#46470783)

TL;DR

Leverage? (0)

Anonymous Coward | about 8 months ago | (#46469491)

This marketing speak is poisoning our language.

WRONG- there is but ONE Maxwell chip (-1)

Anonymous Coward | about 8 months ago | (#46469497)

More dribble from Slashdot. Nvidia builds exactly ONE Maxwell chip, which is currently given FOUR different names for its 2 uses in mobile, and two uses in desktop graphics.

Nvidia has still FAILED to explain why, if Maxwell is so wonderful at the ONE thing it excels at (using little power compared to other solutions in the same performance range), it didn't sell the entire first batch of Maxwell parts into the far more lucrative mobile market. Launching Maxwell first on the desktop, where the part is a god-awful joke (vastly more expensive than AMD's 265, and massively slower in taxing AAA games), made no sense whatsoever.

Maxwell for the desktop needed to be at least 50% more powerful (bigger die, with more graphics capability). On mid-end gaming notebooks, the chip MAY prove quite nice if its power efficiency scales to mobile power constraints.

All the other parts are but re-badges- a commonplace FRAUD in the industry, sadly, designed to con less knowledgeable consumers when they go shopping for a new laptop. The ONLY good thing about these dishonest practices is that laptops built with the EXACT SAME chips, but labelled with the old number, tend to see rapid discounting.

Industry watchers are completely mystified by Nvidia (and AMD) because it now seems certain that the process shrink to 20nm long awaited for new GPU chips is not going to happen any time soon (and certainly NOT in 2014), meaning that both AMD and Nvidia need a new round of parts, top to bottom, in the current 28nm process. Nvidia currently only has this one Maxwell, poorly positioned. AMD has the 260 and 290 (both with the as yet unused Trueaudio DSP), BUT the AMD designs are now so old they are way too power inefficient.

One must assume new 28nm Maxwell chips are on their way, and equivalent competing parts from AMD, but the growing weakness of the PC market seems to make both companies reluctant to invest in new parts when the old ones are still selling OK.

Re:WRONG- there is but ONE Maxwell chip (1)

Smauler (915644) | about 8 months ago | (#46470317)

I don't care about the architecture, how much RAM they have, how many pipelines they have, with graphics cards. Seriously, I don't know enough for it to be relevant to me. All I want to know is how fast it is (playing games), and how much it costs. Those are the only 2 things that are relevant to me. I don't care what die it was shipped on, I don't care about anything but price/performance. Some might care about power usage... I don't (within reasonable limits).

Re:WRONG- there is but ONE Maxwell chip (1)

Blaskowicz (634489) | about 8 months ago | (#46472349)

Not that you seem to care, but nvidia is precisely launching a SECOND Maxwell chip with that laptop announcement. The first was GM107, in the desktop GTX 750 and 750 Ti and now in the 860M and 850M: it has five "SMM" units and a 128-bit bus. The second is GM108, in the GeForce 830M and 840M, a smaller GPU with fewer SMMs on a 64-bit bus, paired with DDR3 memory. That gives low performance, but it's clearly a low-power, low-budget part.
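
To see why the bus width and memory type matter, a rough bandwidth estimate (a sketch only; the per-pin data rates below are assumed illustrative values, not the actual clocks of these parts):

    # Theoretical memory bandwidth = (bus width in bytes) * (effective data rate per pin).
    def bandwidth_gbs(bus_bits, gtps):
        """bus_bits: memory bus width in bits; gtps: effective transfers per pin, in GT/s."""
        return (bus_bits / 8) * gtps

    # Illustrative, assumed data rates: ~5 GT/s for GDDR5, ~2 GT/s for DDR3.
    print(bandwidth_gbs(128, 5.0))  # GM107-style 128-bit GDDR5 -> 80.0 GB/s
    print(bandwidth_gbs(64, 2.0))   # GM108-style 64-bit DDR3   -> 16.0 GB/s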

Maxwell... (1)

jtownatpunk.net (245670) | about 8 months ago | (#46469577)

That's the second biggest GPU I've ever seen.

Re:Maxwell... (0)

Anonymous Coward | about 8 months ago | (#46470853)

That's what she said.

Re:Maxwell... (0)

Anonymous Coward | about 8 months ago | (#46471623)

I never heard her say anything about 'second'.

Re:Maxwell... (1)

PIBM (588930) | about 8 months ago | (#46473103)

You should stop playing with it then!

Re:Maxwell... (0)

Anonymous Coward | about 8 months ago | (#46471717)

Fuck Nvidia

------------ Linus Torvalds

Re:Maxwell... (0)

Anonymous Coward | about 8 months ago | (#46478879)

I hold ATI/AMD with much more contempt due to decades worth of shitty, artefacting, crashing drivers. At least Nvidia works.

seperate mobile GPU's is declining market (3, Interesting)

bloodhawk (813939) | about 8 months ago | (#46469585)

I thought I would always want discrete graphics. But nowadays the majority of laptops really have no need of it. The AMD and Intel integrated offerings while not amazing are more than adequate for the vast majority of purposes. My latest 2 laptops both use integrated Intel 4th gen and handle laptop needs completely for both my work and the limited gaming I do on a laptop. I would imagine Nvidia are very uncomfortable with the way their market has been contracting over the last couple of years.

Linux (-1)

Anonymous Coward | about 8 months ago | (#46469769)

Your statement is true for a Windows world only. AMD Linux support is a piece of shit; Intel is just "usable", leaving us with the nvidia option only.

Re:Linux (1)

loosescrews (1916996) | about 8 months ago | (#46470351)

Have you tried a recent kernel? Intel's Linux support has been greatly improving and is now quite good.

Re:Linux (1)

jones_supa (887896) | about 8 months ago | (#46471005)

Now? Intel GPU support has been excellent under Linux even back when the crusty GMA chips were all we had.

Re:Linux (1)

Jamie Lokier (104820) | about 8 months ago | (#46471703)

Now? Intel GPU support has been excellent under Linux even back when the crusty GMA chips were all we had.

Except for the bugs. I used Linux, including tracking the latest kernels, for over 6 years with my last laptop having an Intel 915GM.

Every version of the kernel during that time rendered occasional display glitches of one sort or another, such as a line or spray of random pixels every few weeks. Rare but not bug free.

And that's just using a terminal window. It couldn't even blit or render text with 100% reliability...

I investigated one of those bugs and it was a genuine bug in the kernel's tracking of cache flushes and command queuing.
In the process I found more bugs than I cared to count in the modesetting code.

Considering the number of people working on the Intel drivers and the time span (6 years) that was really surprising, but that's how it was.

Re:Linux (1)

Eunuchswear (210685) | about 8 months ago | (#46472201)


        Section "Device"
                Identifier "Intel"
                Driver "intel"
                Option "DebugWait" "true"
        EndSection

Re:Linux (1)

Jamie Lokier (104820) | about 8 months ago | (#46472639)

Thanks! But too late. That machine died this time last year, after 6 years of excellent service. I moved on to new hardware.

Hopefully the xorg.conf is useful to someone else.

I've just looked up what people are saying about DebugWait, and I see the font corruption - that's just one of the types of corruption I saw!
But perhaps that was the only kind left by the time my laptop died.

Just a note to others, that DebugWait doesn't fix the font corruption for everyone according to reports. But, it's reported as fixed by the time of the kernel in Ubuntu 13.04 according to https://bugs.launchpad.net/ubu... [launchpad.net]

I stand by my view that Intel GPU support never quite reached "excellent" because of various long term glitches, although I'd give it a "pretty good" and still recommend Intel GPUs (as long as you don't get the PowerVR ones - very annoying that was, that surprise wrecked a job I was on). Judging by the immense number of kernel patches consistently over years, it has received a lot of support, and in most ways worked well.

Getting slightly back on topic with nVidia: Another laptop I've used has an nVidia GPU, and that's been much, much worse under Ubuntu throughout its life, than the laptop with Intel GPU. Some people say nVidia's good for them with Linux, but not this laptop. Have tried all available drivers, Nouveau, nVidia, nVidia's newer versions etc. Nothing works well, Unity3d always renders ("chugs") about 2-3 frames per second when it animates anything, which is barely usable, the GPU temperature gets very hot when it does the slightest things, and visiting any WebGL page in Firefox instantly crashes X with a segmentation fault due to a bug in OpenGL somewhere, requiring a power cycle to recover properly. So I'd still rate nVidia poorer than Intel in my personal experience of Linux on laptops :)

Re:seperate mobile GPU's is declining market (0)

Anonymous Coward | about 8 months ago | (#46469791)

Surely you jest. The next step is retina-like monitors. Find discrete graphics to deliver performance on those beasts.

Re:seperate mobile GPU's is declining market (1)

epyT-R (613989) | about 8 months ago | (#46470001)

Even multi-GPU configurations with top-model GPUs don't do well at 3840x2400 resolutions.

Re:seperate mobile GPU's is declining market (0)

Anonymous Coward | about 8 months ago | (#46471191)

Surely you jest. There are dozens if not hundreds of laptops with retina-like or higher quality screens running on integrated graphics. Retina is only above average at best now. Even my now relatively old Samsung Series 9 with integrated graphics runs at a higher res than retina.

Re:seperate mobile GPU's is declining market (0)

Anonymous Coward | about 8 months ago | (#46469867)

I thought I would always want discrete graphics.

Never bet against integration.

I would imagine Nvidia are very uncomfortable

For the time being there is lots of money to make in GPUs and NVidia is doing fine. Ultimately, however, the whole company is one big bet against integration.

Re:seperate mobile GPU's is declining market (2)

epyT-R (613989) | about 8 months ago | (#46469981)

Those of us doing more with computers than editing text documents and refreshing facebook still need descrete GPUs.

Re:seperate mobile GPU's is declining market (1)

Kjella (173770) | about 8 months ago | (#46470233)

According to the latest market statistics 66% of PCs overall use embedded graphics. Even Steam has a 16% Intel share and probably some AMD APUs that aren't separated out. I don't know about you but anything "serious" I do like work doesn't push the GPU one bit, the only thing that does is gaming. And not everybody is a gamer or their idea of gaming is more like Candy Crush. On that note, I loved The Walking Dead, here's the system requirements:

Windows Operating system: Windows XP / Vista / Windows 7
Processor: 2.0 GHz Pentium 4 or equivalent
Memory: 3 GB RAM
Video Card: ATI or NVidia card w/ 512 MB RAM
Direct X 9.0c
Audio card required

Oh so that's like any CPU and graphics card made in the last 10 years or so. What about something like Civilization V (okay it's a bit old but there's no Civ6 yet)

Operating System: Windows XP SP3/ Windows Vista SP2/ Windows 7
Processor: Dual Core CPU
Memory: 2GB RAM
Hard Disk Space: 8 GB Free
DVD-ROM Drive: Required for disc-based installation
Video: 256 MB ATI HD2600 XT or better, 256 MB nVidia 7900 GS or better, or Core i3 or better integrated graphics
Sound: DirectX 9.0c-compatible sound card
DirectX: DirectX version 9.0c

Scary requirements yeah? What about World of Warcraft, that's some million gamers:

Windows XP / Windows Vista / Windows® 7 / Windows 8 / Windows 8.1 with the latest service pack
Intel Pentium D or AMD Athlon 64 X2
NVIDIA GeForce 6800 or
ATI Radeon X1600 Pro (256 MB)
2 GB RAM (1 GB Windows XP)

I could go on, but long story short unless you're into the latest and greatest 3D games no it's not really required. Sure I need a discrete graphics card, but I know I'm in the minority. And I just need it to run Skyrim and stuff like that, I don't need the worst SLI/CF setup for twitcher FPS games either.

Re:seperate mobile GPU's is declining market (0)

Anonymous Coward | about 8 months ago | (#46470559)

A lot of those requirements are somewhat BS, I just got out my old crappy Dell w/ Win7 on it and a 256MB 7900GS (4GB ram, Core2 duo) and installed Civ 5.

It crashed at anything WXGA or higher res on medium settings, lowest settings I got maybe 10FPS and at 800x600 (with horrible scaling and an incorrect aspect ratio, along with incorrect UI bugs associated with the incorrect aspect ratio) I managed maybe 20FPS - while absolutely destroying my GPU (you wouldn't want this thing on your lap, that's for sure).

If you consider gaming running on lowest settings with headache inducing stuttering, then sure - have fun. Most people want 30Hz and some sort of visual representation that aligns to what the devs intended you to see (s.t. you don't hit obscure bugs around UI not aligning to what you're seeing!)

Requirements might be for XP (0)

Anonymous Coward | about 8 months ago | (#46471481)

> Win7 on it and a 256MB 7900GS

I assume that means Aero is enabled. That will use up a huge percentage of video RAM.
It's likely that both XP (for which I assume the specs were originally written) and Win7 with Aero off will work much, much better.

Re:Requirements might be for XP (0)

Anonymous Coward | about 8 months ago | (#46471559)

Aero doesn't run when you play games full screen.

Re:Requirements might be for XP (0)

Anonymous Coward | about 8 months ago | (#46471629)

the desktop is not utilising GPU while a game is running so whether you are using aero or not is completely irrelevant to performance

Re:seperate mobile GPU's is declining market (0)

Anonymous Coward | about 8 months ago | (#46474401)

HD4000 handles civ 5 with no issues on an i7 notebook with 8GB RAM (I play it once a week at least)

Re:seperate mobile GPU's is declining market (0)

Anonymous Coward | about 8 months ago | (#46474389)

Even Skyrim runs a treat on intel's HD4000, tbh I haven't switched on the GPUs in my laptop for anything except FF14, Saints Row, and BF3/4 since I bought it! LoL, Diablo 3, Starcraft all the Source Engine games, Xcom, Naruto, and PS2 Emulation all run near or at 60 FPS @ native res on just the integrated i7 chip.

I have been blown away with the performance of intel's little chip, this advance is even more staggering when I think about the GMA chips not even being able to run WoW 9 years ago.

Re:seperate mobile GPU's is declining market (0)

Anonymous Coward | about 8 months ago | (#46479157)

Skyrim runs like shit on an Intel HD 4600, which is significantly faster than the HD 4000. The game has to be set to the lowest detail and still only manages about 30 FPS.

Re:seperate mobile GPU's is declining market (0)

Anonymous Coward | about 8 months ago | (#46470361)

Those of us doing more with computers than editing text documents and refreshing facebook still need descrete GPUs.

And spelling checks

Re:seperate mobile GPU's is declining market (1)

bloodhawk (813939) | about 8 months ago | (#46470469)

I only do software development, some gaming and photo editing on my laptop. Yes, I am more than aware there are a lot of areas that still need discrete graphics for laptops, but it is a rapidly shrinking market. Even on desktops, the integrated option is taking an increasingly large share.

What are you doing - pray tell... (1)

frnic (98517) | about 8 months ago | (#46470529)

Other than professionally modeling or doing video editing, or playing 3D games - what use does an average person have for discrete graphics today?

Re:What are you doing - pray tell... (1)

CadentOrange (2429626) | about 8 months ago | (#46471553)

Other than professionally modeling...

Look, this is Slashdot. You're not going to find professional models here.

Re:What are you doing - pray tell... (0)

Anonymous Coward | about 8 months ago | (#46471585)

none

Re:seperate mobile GPU's is declining market (0)

Anonymous Coward | about 8 months ago | (#46471061)

You are about a decade out of date. The integrated graphics now are more than adequate for anything except top-end gaming and some design/video editing work. The average person has zero need of discrete graphics in a laptop in the vast majority of use cases. Use cases that require discrete graphics are, I would say, down to sub-10% of the laptop market now, and I would not be surprised if it is well under 5%.

Re:seperate mobile GPU's is declining market (1)

WiPEOUT (20036) | about 8 months ago | (#46470193)

The AMD and Intel integrated offerings while not amazing are more than adequate for the vast majority of purposes

Not only that, but the discrete graphics cards consume substantial amounts of power and generate more heat than the rest of the device combined.

Intel = bad drivers (1)

AMDinator (996330) | about 8 months ago | (#46470711)

Intel's refusal to properly support HDMI/DisplayPort to a TV without clipping the black levels makes their integrated GPUs worthless to me. I'm selling my newer laptop in favor of keeping my old one that has ATI graphics for this reason alone.

Re:seperate mobile GPU's is declining market (3, Interesting)

guises (2423402) | about 8 months ago | (#46471459)

No personal experience with this, but according to Anandtech Intel's Iris Pro graphics are reasonably fast but don't provide any power consumption advantage over discrete offerings. In fact they're worse, and with the power benefits in the new chips mentioned above they should be a lot worse in the future. Seeing as power consumption and cost are the only compelling reasons to be using integrated graphics, discrete chips still seem to have a fair amount of life in them.

Re:seperate mobile GPU's is declining market (1)

Hal_Porter (817932) | about 8 months ago | (#46471955)

I would imagine Nvidia are very uncomfortable with the way their market has been contracting over the last couple of years.

At some point enough x86/x64 patents will expire that Nvidia will be able to license the remaining ones and do an x64 chip of their own.

Or alternatively they could sell Arm+GPU SOCs instead - arguably Arm+GPU is a better bet than x64+GPU because the sales of phones and tablets will exceed the sales of x64 PCs. Of course the margins are likely to be thinner because there's a lot of competition in the Arm SOC market - Apple and Samsung have their own in house designs and outside that it looks like Qualcomm have most of the rest of the market.

Still it's not like AMD is doing very well competing with Intel. And the reason Qualcomm do so well is because they design their own Arm microarchitectures - Scorpion and Krait were both designed in house and were higher performance than the best Arm designed microarchitecture. So I guess NVidia could be aimed to compete with Qualcomm since Denver is in house too.

Actually Apple A6 and A7 chips are like this too. Apple have an Arm license but the chips are designed in house. So it seems like of the Arm SOCs that actually sell well only Samsung is using Arm's designs and only in some markets

E.g.

http://en.wikipedia.org/wiki/S... [wikipedia.org]

Galaxy S4 models use one of two processors, depending on the region and network compatibility. The S4 version for North America, most of Europe, parts of Asia, and other countries contains Qualcomm's Snapdragon 600 system-on-chip, containing a quad-core 1.9 GHz Krait 300 CPU and an Adreno 320 GPU. The chip also contains a modem which supports LTE. Other models include Samsung's Exynos 5 Octa system-on-chip with a heterogeneous CPU. The octa-core CPU comprises a 1.6 GHz quad-core Cortex-A15 cluster and a 1.2 GHz quad-core Cortex-A7 cluster. The chip can dynamically switch between the two clusters of cores based on CPU usage; the chip switches to the A15 cores when more processing power is needed, and stays on the A7 cores to conserve energy on lighter loads.

So there are two versions. A Qualcomm Snapdragon one for the US and Europe and an Exynos one for Asia. The Exynos one uses Cortex-A15 and Cortex-A7 in a BIG.little configuration.

Unfortunately they fucked up the big.LITTLE configuration

http://www.anandtech.com/show/... [anandtech.com]

The Exynos 5410 saw limited use, appearing in some international versions of the Galaxy S 4 and nothing else. Part of the problem with the design was a broken implementation of the CCI-400 coherent bus interface that connect the two CPU islands to the rest of the SoC. In the case of the 5410, the bus was functional but coherency was broken and manually disabled on the Galaxy S 4. The implications are serious from a power consumption (and performance) standpoint. With all caches being flushed out to main memory upon a switch between CPU islands. Neither ARM nor Samsung LSI will talk about the bug publicly, and Samsung didn't fess up to the problem at first either - leaving end users to discover it on their own.

You can see the results here

http://www.gsmarena.com/samsun... [gsmarena.com]

The Qualcomm one has much better talk time - almost twice as much.

You have to wonder what the hell has happened to Arm to be honest. It seems like Apple (A6, A7) and Qualcomm (Scorpion, Krait) do a much better job at Arm core design than Arm/Samsung.

It'll be interesting to see battery life tests on the Snapdragon 801 and Exynos 5422 versions of the S5 to see if Samsung have got big.LITTLE working like it is supposed to. Actually I wonder whether big.LITTLE is even necessary - it seems like it would be much easier to just have the big core and scale the CPU frequency. The S5's CPUs are a 2.5 GHz quad-core for the Snapdragon variant, and a 2.1 GHz quad-core Cortex-A15 plus a 1.5 GHz quad-core Cortex-A7 for the Exynos variant.

That's quite a step up from the S4 so you could probably run them at a much lower clock frequency most of the time. I guess the problem is that a Cortex-A15 uses more juice when run at a low speed than a Krait, and big.LITTLE (aka 'use an A7 instead when performance isn't critical') was a sort of band-aid for this.
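
For context on that "more juice at low speed" point, the usual first-order model is that dynamic power scales with capacitance × voltage² × frequency, while leakage sets a floor. A tiny sketch with made-up, purely illustrative numbers (not measured A15/A7/Krait figures):

    # First-order CPU power model: P ≈ dynamic (C * V^2 * f) + static leakage.
    def core_power_watts(c_eff_nf, volts, freq_ghz, leakage_w):
        """c_eff_nf: effective switched capacitance in nF; freq_ghz: clock in GHz."""
        dynamic = c_eff_nf * 1e-9 * volts**2 * freq_ghz * 1e9
        return dynamic + leakage_w

    # Illustrative only: a "big" core has more switched capacitance and leakage than a
    # "little" one, so even down-clocked to the same frequency it can burn more power.
    print(core_power_watts(c_eff_nf=2.0, volts=0.9, freq_ghz=1.0, leakage_w=0.30))  # big core   ~1.9 W
    print(core_power_watts(c_eff_nf=0.6, volts=0.9, freq_ghz=1.0, leakage_w=0.05))  # little core ~0.5 W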

Re:seperate mobile GPU's is declining market (1)

Blaskowicz (634489) | about 8 months ago | (#46472429)

At some point enough x86/x64 patents will expire that Nvidia will be able to license the remaining ones and do an x64 chip of their own.

But after x64 there were SSE3, SSE 4.x, AVX, AVX2, and now AVX512 coming soon. Those are the wide SIMD instructions. This stuff isn't strictly needed yet, but SSE2 is already needed to run some 32-bit code, like some Flash versions and codecs, which annoys some current Athlon XP users. Maybe some other stuff like hardware encryption is "protected".

So the fullest x86/x64 support will be left to AMD and Intel only for the foreseeable future.
Nvidia is betting on ARMv8, with an ARMv8 + Kepler (of the GK208 variant) chip this year and probably ARMv8 + Maxwell next year.

Re:seperate mobile GPU's is declining market (1)

chrish (4714) | about 8 months ago | (#46472181)

Given the ridiculous prevalence of laptops with absolutely pathetic displays (1366x768 on a 15"? really?), "most" users aren't even going to need the integrated Intel 4th gen video. A dumb frame buffer would probably fit their needs.

It doesn't matter (0)

Anonymous Coward | about 8 months ago | (#46472911)

Mobile gpu drivers are crap, they are an afterthought and after about 6 months they seem to be forgotten. Cooling is often a problem also. Laptop solutions are just utter shite. (Unless you get a quadro, then at least your drivers will be usable).

Re:seperate mobile GPU's is declining market (1)

tlhIngan (30335) | about 8 months ago | (#46473755)

I thought I would always want discrete graphics. But nowadays the majority of laptops really have no need of it. The AMD and Intel integrated offerings while not amazing are more than adequate for the vast majority of purposes. My latest 2 laptops both use integrated Intel 4th gen and handle laptop needs completely for both my work and the limited gaming I do on a laptop. I would imagine Nvidia are very uncomfortable with the way their market has been contracting over the last couple of years.

Heck, we just got a bunch of new PCs in and we think the external discrete graphics was merely a checkbox item.

The computers had Intel Haswell 4770 processors, and also came with NVidia GeForce 620 cards (really low end cards). It's so far been a tossup - Windows finds the Haswell GPU far faster than the 620, benchmarks online seem to put the 620 only a tiny bit faster, etc.

Enough that we don't know whether to leave them in or take them out

Re:seperate mobile GPU's is declining market (1)

Bitmanhome (254112) | about 8 months ago | (#46479669)

By ordering low-end GPUs, you annoy everyone -- the users have to put up with crappy chips, IT has to support more complex systems, and budgeting has to pay for chips no one wants. So instead, order most of the laptops without a discrete GPU to save a few bucks. Then order a few with high-end GPUs for the few people who want them.

'Leverage' is a noun... (-1)

Anonymous Coward | about 8 months ago | (#46469621)

... how I hate businessmen...

It is NOT a verb. You can't 'leverage' something.

Re:'Leverage' is a noun... (0)

Anonymous Coward | about 8 months ago | (#46470295)

AWWW SHlllllllllll~T, S0ME0NE T00K A N0UN `N VERBED lT!!!!1

Re:'Leverage' is a noun... (1)

cryptizard (2629853) | about 8 months ago | (#46474039)

That statement is contrary to the OED so... I'm going with them.

Advert disguised as story (0, Troll)

digitaltraveller (167469) | about 8 months ago | (#46469629)

This takes the cake. I've never complained once about an obvious advert disguised as a story.

But to pimp this, this CRAP company that has been so incredibly hostile to the free and open source community is such bad judgement.
The new slashdot management seems determined to undermine the loyalty of their userbase. What a disgrace.

Re:Advert disguised as story (0)

Anonymous Coward | about 8 months ago | (#46469785)

I still get the awesome DV - Series laptops from HP in my repair shop. I hate them. It's great when you have to tell customers "You're fucked most likely." Just needed to vent... :]

Re:Advert disguised as story (0)

Anonymous Coward | about 8 months ago | (#46470141)

At least their drivers work most of the time. The lack of decent Radeon drivers makes them non-starters.

X265 decoding in hardware? (0)

Anonymous Coward | about 8 months ago | (#46469639)

Does the GeForce 800M series have hardware support for x265 (HEVC) decoding?

Can it do HDMI 2.0 and support 4k output to a UHDTV?

I don't need more powerful. I just need cooler! (1)

GuitarNeophyte (636993) | about 8 months ago | (#46469963)

I've toasted two laptop monitors because of trying to play too many high-needs video games on them. Both of the monitors theoretically were good enough for the games by specs, but both of them burnt out within two years of when I bought them (admittedly, they were both a couple years old when I purchased them). With the first laptop, I just thought it was an age thing and didn't think enough of it, but with the second one, I realized the sad pattern. Now, I play my games with an external fan running, blowing cool air under the laptop, with the laptop on a stand to increase airflow. (I'd stick to playing games on desktops, but I'm living abroad at the moment, and a desktop takes up too much room in the suitcases)

All this being said, it seems like each new generation of laptop video card is more powerful, but I would just like to know of a mid-range card that ran really cool. That would help me much more than power.

Re:I don't need more powerful. I just need cooler! (1)

jones_supa (887896) | about 8 months ago | (#46471299)

Monitors?

Re:I don't need more powerful. I just need cooler! (0)

Anonymous Coward | about 8 months ago | (#46471305)

It really depends on what your gaming needs are. Despite the bias against them by many, integrated graphics in the top-end chips are good enough for low to mid-range gaming, and they stay many times cooler than the heat monsters that are discrete graphics. If you are into the cutting edge or COD-type FPSs, then anything that does the job required in a laptop generates huge heat.

Re:I don't need more powerful. I just need cooler! (1)

Blaskowicz (634489) | about 8 months ago | (#46472567)

The new Maxwell stuff quite possibly has higher performance per watt than the Intel GPU. This may make the dedicated GPUs a bit more interesting again (and if the Intel GPU doesn't run, more watts can be spent on the CPU performance which can allow better framerate). Sure, stay modest enough on the wattage.

Re:I don't need more powerful. I just need cooler! (0)

Anonymous Coward | about 8 months ago | (#46471813)

There is no way this is true. You simply have no fucking clue what you are talking about.

Re:I don't need more powerful. I just need cooler! (1)

Blaskowicz (634489) | about 8 months ago | (#46472485)

I'm with the others not understanding what you're on about with monitors, but indeed additional cooling is useful. The thing is no matter how efficient the CPU and GPU are, it's a product of the watt budget and how the laptop is designed. Depends on the thickness/thinness, heatsinks and fans, build quality etc. so it's really on a laptop per laptop basis.

Modern stuff also throttles: it gets slower when needed so the laptop won't melt itself, and that cuts both ways. Less chance of failure, but additional cooling is now needed to get higher performance, and a cynical laptop vendor may exploit this by undersizing the cooling, especially with the model that has a bigger, faster GPU.

Re:I don't need more powerful. I just need cooler! (1)

GuitarNeophyte (636993) | about 8 months ago | (#46475307)

My apologies for not being very clear. I don't have my tools here, nor do I have random spare parts to change out and test which components have problems with my system at the moment. The two laptops could have had different problems, but the net result was that the screens on both were no longer functional. It was my presumption (as I said, I don't have my tools to verify) that excess heat was the cause of my computer issues. For that reason, I would prefer to have a cooler-running laptop, so I don't feel compelled to use additional cooling. My presumptions could be completely wrong, and I fully admit the deficiencies in my explanation.

Re:I don't need more powerful. I just need cooler! (1)

Blaskowicz (634489) | about 8 months ago | (#46477535)

No problem. Also my above reply was pessimistic, better to check some reviews after finding a nice model.
The new GTX 850M and GT840M feel nice (the latter being rather slow if you're into demanding games)

thanks for the report (0)

Anonymous Coward | about 8 months ago | (#46470195)

cool. was wondering when Nvidia was going to release the 800 series GPUs
