
$3000 GeForce GTX TITAN Z Tested, Less Performance Than $1500 R9 295X2

Soulskill posted about 4 months ago | from the spending-a-lot-of-green-for-team-green dept.


Vigile writes: NVIDIA announced its latest dual-GPU flagship card, the GeForce GTX Titan Z, at the GPU Technology Conference in late March with a staggering price point of $2999. Since that time, AMD announced and released the Radeon R9 295X2, its own dual-GPU card with a price tag of $1499. PC Perspective finally put the GTX Titan Z to the test and found that from a PC gamer's view, the card is way overpriced for the performance it offers. At both 2560x1440 and 3840x2160 (4K), the R9 295X2 offered higher and more consistent frame rates, sometimes by as much as 30%. The AMD card also only takes up two slots (though it does have a water cooling radiator to worry about) while the NVIDIA GTX Titan Z is a three-slot design. The Titan Z is quieter and uses much less power, but gamers considering a $1500 or $3000 graphics card are likely not overly concerned with power efficiency.


So glad it's over (4, Interesting)

TWX (665546) | about 4 months ago | (#47207531)

I'm so glad that I got the gaming bug out of my system when a ridiculously-priced video card was $300, and mainstream cards were in the $90-160 range...

This is ridiculous.

Re:So glad it's over (0)

Anonymous Coward | about 4 months ago | (#47207625)

It ain't for gaming but for game development; hence the floating point power of the card.

Re:So glad it's over (0)

Anonymous Coward | about 4 months ago | (#47208813)

Glorified dildo

Re: So glad it's over (0)

Anonymous Coward | about 4 months ago | (#47209223)

But they typically don't have CAD-quality drivers... These are generally WORSE than less expensive cards for apps that actually REQUIRE perfect 3D rendering.

Re:So glad it's over (0)

Anonymous Coward | about 4 months ago | (#47207627)

They still are in that range. These products are like Bugattis or Lambos... I really doubt I'll ever know someone personally who owns one.

Re:So glad it's over (4, Informative)

crioca (1394491) | about 4 months ago | (#47207687)

I'm so glad that I got the gaming bug out of my system when a ridiculously-priced video card was $300, and mainstream cards were in the $90-160 range... This is ridiculous.

That's still pretty much the case; the difference today is that some people make, or try to make, their living off playing & broadcasting their gameplay. This means they need to be able to run the latest games at the highest specs, record and livestream all at the same time without missing a beat.

Re:So glad it's over (1)

bemymonkey (1244086) | about 4 months ago | (#47209469)

Isn't the recording & encoding part mostly CPU-dependent? And even if the graphics card is used to encode the video, isn't there dedicated H264 encoder hardware on these cards (meaning a budget card from the same generation shouldn't be any slower in this aspect)?

Re:So glad it's over (1)

geekoid (135745) | about 4 months ago | (#47207703)

I have a 180 dollar gaming card that plays everything very well.
This is, frankly, stupid. There is no gain, and professional gamers want all the particle effects and distractions turned off.

Re:So glad it's over (2)

NemoinSpace (1118137) | about 4 months ago | (#47207901)

I have a built-in Nvidia 8300 GS that came with the hand-me-down Dell XPS-410 my wife brought home from work.
I put 6 GB of RAM in even though Crucial and Dell both tell you it won't work. Should have gone for 8 so I could have a bigger RAM drive. - I actually ran out of memory the other night running 64-bit Waterfox! (That was a first.) I put a ragged old OCZ SSD in it that I bought for $20 when OCZ put themselves out of business. Then I put Windows 8 on it for $30. It refuses to update to 8.1. (How bad are the H1B's over at Microsoft anyway?) I will probably upgrade to CentOS 7 soon because this computer will probably last till 2020.
Isn't this sad? I can squeeze crap out of a buffalo nickel. But the real challenge is assembling absolute garbage into a fairly usable system.
I don't knock people that buy $1,000 video cards; they usually pay well for doing things like adjusting their "tiny fonts". Really, get off my lawn.

Re:So glad it's over (1)

TWX (665546) | about 4 months ago | (#47208603)

I'm in the same boat actually. My desktop is a dual-Xeon box that's almost fourteen years old now, still uses AGP, and still plays the few games that I want to play quite well. I am planning on finally migrating to a new box (processors in the current one are only 32 bit so I'm capped at the ~4gb memory limit) but it's served me well for many, many years.

I'm typing this post on an old Dell Latitude D420, which still works fine for surfing the web, though I have to limit youtube-type video to lower resolutions to keep it smooth.

Re: So glad it's over (0)

Anonymous Coward | about 4 months ago | (#47209741)

Install Windows Server 2003. Or if that's not practical, I understand there's a hack available that lets you install its kernel in XP. It lets you use PAE, which gets you 16GB even on a 32-bit processor.

Re:So glad it's over (2)

TapeCutter (624760) | about 4 months ago | (#47208901)

Video cards are not just for games these days; my $150 GTX 750 maxes out at just over a teraflop, which is significantly faster than any multi-million-dollar pre-Y2K supercomputer ever built. I really can't see how vector processing can help anyone adjust their fonts, but it can solve all sorts of difficult engineering, logistics, AI, and design problems. The fact that you can do calculations on a commodity video card that (even with an unlimited military budget) were simply not practical in the 1990's is nothing short of a technological miracle.

But hey, if you want to install a private sub-station and a 1990's supercomputer in your shed because you're too tight to buy a new PC, who am I to judge?
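That teraflop figure checks out with back-of-the-envelope arithmetic (a sketch; the 512-core count is the GTX 750's published spec, the boost clock is an approximation):

```python
# Rough peak single-precision throughput in GFLOPS:
# shader cores * clock (GHz) * 2, since one fused multiply-add
# counts as two floating-point operations per cycle.
def peak_gflops(cores, clock_ghz, ops_per_cycle=2):
    return cores * clock_ghz * ops_per_cycle

# GTX 750 (Maxwell GM107): 512 CUDA cores, boost clock around 1.1 GHz.
gtx750 = peak_gflops(512, 1.1)
print(gtx750)  # a bit over 1000 GFLOPS, i.e. "just over a teraflop"
```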

Re: So glad it's over (2)

O('_')O_Bush (1162487) | about 4 months ago | (#47208511)

But they also want to play at very high resolutions, very high refresh rates (120hz-144hz), and are often recording as well...

It's not over until the Fat Lady sings. (0)

Anonymous Coward | about 4 months ago | (#47207719)

I am still waiting to buy a Quantum3D Alchemy rackmount system. Battlezone looks that good on this, the most efficient use of 3Dfx Voodoo2 technology; I can see the rough edges on Laura Croft that tipped me off that it's a transgendered sheman, and thus saved my life for such tasteless entertainment.

Re:So glad it's over (1)

Kjella (173770) | about 4 months ago | (#47207767)

I'm so glad that I got the gaming bug out of my system when a ridiculously-priced video card was $300, and mainstream cards were in the $90-160 range...

These cards exist because they make them for the compute/workstation/enterprise market, so why not rebrand and sell for some insane amount of money? Just like Intel's $999 processors wouldn't exist without the Xeon line. You get plenty of bang for the buck at $150-250, with the "normal" enthusiast cards topping out at $500-700, which I assume is not that much more after inflation. Of course, if you insist on playing Crysis in UltraHD with everything dialed up to max, nothing will be enough, but many games in the last few years have been console ports that'll run on any half-decent gaming PC.

Re:So glad it's over (1)

Luckyo (1726890) | about 4 months ago | (#47207781)

Actually a few 780s in a SLI will run that just fine.

Re:So glad it's over (1)

epyT-R (613989) | about 4 months ago | (#47208479)

That's what the quadro line is for.

Re:So glad it's over (1)

gl4ss (559668) | about 4 months ago | (#47208741)

Yeah, but if you've got a line of cards where you flipped a bit for the drivers to read and treat differently, then why not make another swipe at that: take the top of that line and flip a bit to say it's something else...

Now there are so many YouTube wannabe professionals that they can make good money from it, and so many review sites, that they'll sell 10,000 units for that shit alone, easily justifying a production run. Of course, for 3k you can get a fucking laptop that plays every game on the market... or 3 desktop systems that play every game.

Re:So glad it's over (1)

epyT-R (613989) | about 4 months ago | (#47208895)

I agree, it is dumb. There are suckers who'll pay it though.

Re:So glad it's over (3, Informative)

TapeCutter (624760) | about 4 months ago | (#47209029)

That's kind of what they do. Not sure about other cards, but Nvidia cards handle compatibility with something called compute capability [slashdot.org]. A developer then makes a trade-off that will land somewhere between:

Extreme compatibility -- works on all Nvidia cards and uses none of the new hardware features.
Extreme performance -- works on only the latest cards and uses all of the latest hardware features.

Nobody is buying $3K cards to play video games; they are using them to solve engineering problems. Video games are just a convenient way to benchmark performance that is easily understood by laymen.
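A minimal sketch of that trade-off in Python (the capability numbers are real CUDA compute capabilities, but the kernel names and the dispatch function are hypothetical stand-ins for a real CUDA runtime query):

```python
# Map minimum compute capability -> code path that needs it.
# 3.0 is baseline Kepler; 3.5 (GK110, the Titan's chip) adds
# dynamic parallelism; 5.0 is first-generation Maxwell.
KERNELS = {
    3.0: "baseline_kernel",
    3.5: "dynamic_parallelism_kernel",
    5.0: "maxwell_tuned_kernel",
}

def select_kernel(device_capability):
    """Pick the most demanding code path this device can run."""
    usable = [cc for cc in KERNELS if cc <= device_capability]
    return KERNELS[max(usable)]

print(select_kernel(3.5))  # a GTX Titan gets the dynamic-parallelism path
print(select_kernel(3.0))  # an older Kepler card falls back to the baseline
```

Targeting a lower capability buys compatibility at the cost of leaving newer hardware features unused, which is exactly the spread between the two extremes above.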

Re: So glad it's over (0)

Anonymous Coward | about 4 months ago | (#47207847)

Gtx 750ti - I have some fun with that.

People don't buy very high end video cards ... (2, Interesting)

perpenso (1613749) | about 4 months ago | (#47208063)

People don't buy the highest-performing video cards for gaming; they buy them for mining virtual currency.

Keep that in mind when you see that great price for a used high-end card. The card probably ran for an extended period of time overclocked to just under its "melting point" and just got replaced by an ASIC miner.

That was last year (2)

dutchwhizzman (817898) | about 4 months ago | (#47209497)

People that mine either mine scrypt-style currencies, which still run better on GPUs, or have been using ASIC miners for at least a year already. Used high-end cards are either Nvidia, sold because the gamer wants something new or is short on cash, or AMD, sold when the owner wants a faster GPU for either gaming or scrypt coin mining. For scrypt coin mining on AMD, overclocking the GPU doesn't work; in general you have to clock down a bit, unless you are lucky and can overclock the memory enough to maximize output with the GPU at standard clock rates.

The highest-performing video cards tend to cost so much more for their performance that miners get slightly less performing cards for half the price of the top of the range. You can get the R9 270 and R9 290 cards for much less money than the R9 290X and now the R9 295X2. The number of coins you can mine for the purchase price and power consumption is such that you don't want to buy those "highest performing video cards" if all you do is mine.

The reason you can buy relatively new video cards from miners is that prices fell after MtGox collapsed. The get-rich-quick thing didn't work out and they all need money to pay their power bills. These cards aren't burnt up technically; the miner's wallet is empty and he needs some way to recoup part of his loss. Sure, some of those are highest-end cards, because the miner didn't pay attention to price/performance when he bought them, but most will be high in the midrange, especially since we've seen some new high-end cards come out since the prices of crypto coins fell.

Re:So glad it's over (0)

Anonymous Coward | about 4 months ago | (#47208263)

I don't care so much about the gaming aspect but these cards have saved me thousands versus CPU solutions. I'll continue to buy them (or at least my employer will) as long as we're doing simulations.

Re:So glad it's over (0)

Anonymous Coward | about 4 months ago | (#47208277)

Is that so? Gaming GPUs are still in that price range, considering inflation. And the GPUs in question here don't target the "gaming" market.
Matrox always used to sell insanely high-priced cards. And those cards were very good at what they were doing, like CAD and 2D rendering, just not the latest 3D rendering for gaming.

Re:So glad it's over (2)

Sir_Sri (199544) | about 4 months ago | (#47208487)

Except that the Titan isn't really a gaming card. The big draw is the double-precision floating point performance. The GTX 780, which is the same part for gaming purposes, is about 700 dollars (they have almost identical single-precision performance, which is what gaming uses), so two 780s would be about 1500 dollars (to compare to the Titan Black dual-GPU monstrosity).

And you don't need top-end parts unless you're gaming on 4K (which is either a 3500-dollar monitor for a good one, or a ~500-dollar Seiki TV that is capped at 30fps with crappy colour).

There has *always* been more expensive hardware than most people need or want. But for people who have money, there's nothing particularly wrong with having super expensive stuff. If you made 500k a year, what would you spend it on? What about 5 million? What about 50 million? If nothing else, the power of one of these 3000- or 1500-dollar cards is going to be mainstream for 300 bucks in 3 or 4 years (or sooner if TSMC can get 20nm working); it doesn't do you any harm that someone else can buy it.

Re:So glad it's over (1)

timeOday (582209) | about 4 months ago | (#47209297)

Except that Titan isn't really a gaming card.

OK, tell that to NVidia [geforce.com] :

GeForce GTX TITAN Z is a gaming monster, built to power the most extreme gaming rigs on the planet. With a massive 5760 cores and 12 GB of 7 Gbps GDDR5 memory, TITAN Z gives you truly amazing performance, easily making it the fastest graphics card we've ever made.

This is a serious card built for serious gamers.

Hard to get more definitive than that.

OK, you can argue that NVidia is simply lying; that they engineer these for professional applications and then make a rebadged version to score an easy buck by conning ego-driven gamers. But what kind of defense is that?

Re:So glad it's over (0)

Anonymous Coward | about 4 months ago | (#47209427)

Still, this is hardly proof that gaming as a whole is ridiculous, as the thread-starter claims. The genre might have become ridiculous because of consumer developments that make modern games more and more like interactive movies instead of innovative games, but the conclusion that graphics cards like this are the reason is a non sequitur at best. Yes, there are products for 'enthusiasts'; there always have been and always will be. Just look at those overpriced Alienware PCs, bloated and unnecessary, but that's the free market and not an intrinsic attribute of gaming.

You wouldn't call expensive racing cars proof that driving as a whole has become a ridiculous thing. Sure, you call the buyers of those cars names, but not driving itself. You also wouldn't cite the cost of Tianhe-2 and then call it ridiculous what has become of computing as a whole.
And before the argument of "productivity" comes up, it was already mentioned that gaming has become a profession, although it is still in its infancy. It's as productive as any sport that is meant to entertain viewers, most of which are perfectly accepted because of tradition.

Re:So glad it's over (1)

eddy (18759) | about 4 months ago | (#47209341)

The Titan-Z was and is a PR product. It was conceived simply to create buzz around Nvidia. They had the misfortune that AMD put out a better card before they could get the darn thing to market, though. First they delayed it; then, as pressure mounted, they finally sneaked it out without much of the ado they were hoping for. I doubt there exists or will ever exist more than a couple of hundred Titan-Zs IN THE WORLD.

Anyone who tells you that this card "is for X" where X is something else than PR is wrong and/or lying. It doesn't make sense anywhere.

Re:So glad it's over (1)

RogueyWon (735973) | about 4 months ago | (#47209719)

I've owned two "top end" (as opposed to merely "high end") graphics cards, in the days before I had a mortgage and when the top end of the market was still only in the $1,000 range. The first was an Nvidia 7950 GX2 and the second was an Nvidia 590. Both of them, frankly, were cranky, unreliable and difficult. It was also rare that I took them anywhere near their performance limits. This latest trend towards super-priced cards is a combination of R&D and willy-waving.

This wouldn't be slashdot without a car analogy, so...

A Bugatti Veyron sells for around $1.7 million (according to my hasty Google search). Even compared to previous generations of supercars, that's pretty insane. But it doesn't mean that cars in general are getting more expensive. You can get something good enough for everyday tasks cheaper than ever. If you want something sportier, with a bit of performance, then adjusted for inflation, the price range is more or less what it always has been. Plus that "something sportier" will probably be a lot easier to manage and maintain than the Veyron, as well as a lot easier to drive to the shops in.

I'm on an Nvidia 680 now (the 590 crapped out after less than 2 years), paid a sensible price for it and have a card that can handle almost everything at 1080p with max or near-max detail (the exception being Watch Dogs, the PC port of which is a badly coded piece of shite).

Wrong premise (5, Insightful)

Anonymous Coward | about 4 months ago | (#47207541)

These cards should have been tested from the perspective of high performance computing or scientific application.

Re:Wrong premise (4, Insightful)

SpankiMonki (3493987) | about 4 months ago | (#47207611)

These cards should have been tested from the perspective of high performance computing or scientific application.

I don't think nVidia would want that.

Re:Wrong premise (4, Informative)

Nemyst (1383049) | about 4 months ago | (#47208265)

Um, no, if Nvidia didn't want that, they wouldn't give the Titans full double-precision performance in the first place. I'm thinking that aside from getting a few sales from overenthusiastic gamers, their main motivation for marketing this as a gaming card is so their compute customers don't stop buying Teslas.

Re:Wrong premise (2)

mlw4428 (1029576) | about 4 months ago | (#47209087)

You're absolutely right on that. They artificially lock out features that their higher-end non-gaming cards have (such as VT-d support, etc.). Nvidia doesn't want YOU to use GTXs for computing or scientific applications; they want you to use cards like Tesla or Quadro. In fact, I bet the biggest difference between the GTX Titan Z and Tesla K40 is less price and more specific features. When I looked, the K40 was a bit pricier but was outranked in sheer performance (CUDA cores, pipelines, etc.); but you can't virtualize a GTX, it doesn't work with GRID computing, and a few other features are missing.

Wrong premise (1)

Anonymous Coward | about 4 months ago | (#47207667)

Gaming graphics cards are optimised for high-end graphics rendering - scientific graphics cards are optimised for crunching numbers/running simulations.

That's like testing a car by trying to drive it underwater

Re:Wrong premise (0)

Anonymous Coward | about 4 months ago | (#47207713)

From TFA's introduction:

"NVIDIA is adamant though that the primary target of the Titan Z is not just gamers but the CUDA developer that needs the most performance possible in as small of a space as possible."

Re:Wrong premise (0)

Anonymous Coward | about 4 months ago | (#47208471)

Nope, they're the same chips.

The 'scientific' GPUs are re-badged gaming GPUs, sometimes with different RAM, sometimes with different clock-speeds etc., but the rules for binning are pretty arbitrary and changeable.

Re:Wrong premise (0)

Anonymous Coward | about 4 months ago | (#47207777)

Nvidia would lose a ton of money if they let people use gaming cards instead of cards like the Quadro K6000 @ over double the price. I'd bet the restrictions are BIOS & driver based more than an actual hardware limitation. Review samples probably often come with a restriction that it only be tested against certain cards.

Re:Wrong premise (0)

Anonymous Coward | about 4 months ago | (#47207891)

These work just the same as a K-series card. The difference is they do not have ECC memory, and they will not garner the same official support from nVidia as a compute card. These aren't the old Quadro/FireGL cards that would artificially prevent CAD acceleration.

Re:Wrong premise (2)

Blaskowicz (634489) | about 4 months ago | (#47207893)

Nvidia does let people use the full computing feature set and performance, barring ECC memory, on a GTX Titan. Memory capacity is high too (6GB), though now there's also a GTX 780 with that amount.

They have this hierarchy (based on virtually the same cards, but drivers and segmentation differ; in increasing price order):
GTX 780 and 780 Ti (3GB or 6GB) < GTX Titan (6GB) < Tesla (w/ 5GB, 6GB or 12GB) < Quadro K6000 (12GB)

That gives:
- gaming and GPGPU, double-precision FP artificially much slower
- gaming and GPGPU, double-precision FP at regular speed
- GPGPU only, double-precision FP at regular speed, ECC can be enabled (to number-crunch for weeks and months on end)
- all features of the inferior models, plus support for CAD and industrial/high-end software, plus the weird features (quad-buffer stereo and miscellaneous), though the driver is not really meant to be good in games.

Re:Wrong premise (0)

Anonymous Coward | about 4 months ago | (#47207915)

The TITAN series are entry-level scientific compute cards though. A single 780 Ti outperforms a single Titan or Titan Black (revision 2) in gaming applications, but has restricted FP64 performance. The Titans have slightly lower performance versus their gaming counterparts, but have the full FP64 performance available.

The upgrade to the Quadro cards from the Titan cards is a reliability upgrade (ECC memory) and an alternate driver set that prioritises visual quality over performance.

Re:Wrong premise (1)

msauve (701917) | about 4 months ago | (#47208045)

You're missing the point (and marketing).

Overpaying by 20X makes you much cooler than overpaying by 10X. The metric is bragging rights, not actual performance, and definitely not some cost/benefit analysis.

Re:Wrong premise (0)

Anonymous Coward | about 4 months ago | (#47208947)

2999 vs 3000 USD

Re:Wrong premise (1)

perpenso (1613749) | about 4 months ago | (#47208077)

These cards should have been tested from the perspective of high performance computing or scientific application.

Nah, virtual currency mining. :-)

Re:Wrong premise (0)

Anonymous Coward | about 4 months ago | (#47208209)

ATI still outperforms NVIDIA even at this level... not to mention that ASICs surpass using gfx cards (even for the home hobbyist) by an order of magnitude.

Re:Wrong premise (0)

Anonymous Coward | about 4 months ago | (#47209491)

ATI still outperforms NVIDIA even at this level... not to mention that ASICs surpass using gfx cards (even for the home hobbyist) by an order of magnitude.

Using an order of magnitude less power.

Quiet is important (4, Insightful)

i_ate_god (899684) | about 4 months ago | (#47207543)

Don't underestimate the beauty of a quiet, powerful computer.

I won't buy a $3000 GPU any more than I'll buy a $1500 one, but I did buy the GTX 780 over the cheaper but somewhat more powerful R9 250 solely on the basis of it being cooler.

Re:Quiet is important (2)

SpankiMonki (3493987) | about 4 months ago | (#47207593)

GTX 780 over the cheaper but somewhat more powerful R9 250

That's one heckuva typo. (I *hope* that's a typo)

Re:Quiet is important (0)

Anonymous Coward | about 4 months ago | (#47207875)

I imagine he meant the 290.... at least I hope so.

Re:Quiet is important (0, Troll)

Anonymous Coward | about 4 months ago | (#47208117)

Poor nerds, someone typed a number wrong, I can just feel your nerd pain.

Re:Quiet is important (2)

i_ate_god (899684) | about 4 months ago | (#47208503)

yes, it was the 290, not 250, sorry.

They were competing against each other; the AMD card had slightly better bang for the buck but was reportedly quite hot, and some boards were quite noisy.

Re:Quiet is important (1)

Balinares (316703) | about 4 months ago | (#47207619)

I really just wish desktops were capable of only turning on the discrete GPU when playing games, and relying on the CPU built-in one the rest of the time. (Or is it possible nowadays and I never found out?)

Re:Quiet is important (0)

Anonymous Coward | about 4 months ago | (#47207697)

They can do this with Lucid Virtu for intel-based CPUs.

Re:Quiet is important (1)

Qzukk (229616) | about 4 months ago | (#47207721)

It's a common laptop feature, but it works because both the awesome GPU and the cheap GPU are integrated.

You can do it on the desktop, you just have to buy a 3dfx Voodoo card :) (it had a passthrough cable so you would plug it into your regular video card then your monitor into the 3dfx card... without that you'd need to plug your monitor into your fancy gaming video card whenever you wanted to use it).

Re:Quiet is important (1)

Blaskowicz (634489) | about 4 months ago | (#47207975)

It might become possible in the future, or in select integrated desktops; for now, at least, the modern big GPUs have much better power management than before. Showing the desktop or even idling with the screen turned off was a huge power waste when you ran e.g. a Radeon 4870 or GTX 275, but with a GTX 780 or Radeon 7970 it's almost a gentle power bump next to not having the card in the first place. Of note is Radeon "ZeroCore Power", which does shut the card down, but only when the PC's display goes into standby.

Nvidia did have a real try at it (motherboards with the GeForce 8200 or 8300 chipset (integrated graphics), plus a GeForce 9800GT or 9800GTX). Very few desktop PCs ever had that combination of hardware, and then they exited the chipset market and had that stuff quietly forgotten.

Re:Quiet is important (1)

javy29sp (1723340) | about 4 months ago | (#47208407)

I use Lucid Virtu. My 3770K with Intel graphics runs my desktop and my HD7970 kicks in for games.

Re:Quiet is important (1)

Farmer Tim (530755) | about 4 months ago | (#47207859)

I did buy the GTX 780 over the cheaper but somewhat more powerful R9 250 solely on the basis of it being cooler.

Damn hipsters!

Good excuse... (1)

martiniturbide (1203660) | about 4 months ago | (#47207583)

...to raise the price of the R9 295X2. :)

crossfire/sli compatability (2)

rogoshen1 (2922505) | about 4 months ago | (#47207617)

Do games these days typically take full advantage of such setups? I haven't really paid too much attention to gaming/hardware in the past few years, but it seemed as if support for dual GPUs was less than stellar.

I.e., the only true advantage was an increase in the memory available to apps -- computationally, very few games took advantage of the additional GPU.

Has this changed, or (equally likely) am I completely off base on the state of affairs?

Re:crossfire/sli compatability (1)

arbiter1 (1204146) | about 4 months ago | (#47207701)

Most games support it, but they don't always take full advantage of SLI/CF. The Titan was, as they said, more focused on being a low-cost option for people that need the double precision. As you see in that test, 2x 780 Ti is $200 cheaper and has some OC to it.

$3,000?? (0, Interesting)

Anonymous Coward | about 4 months ago | (#47207651)

I'm lost. Why do people need a $3,000 video card to play games like World of Warcraft? I can play it fine on a $50 video card that takes one slot and a 15-inch monitor. The framerate is so fast that I had to turn on V-sync. I must be missing something.

Re:$3,000?? (1)

Anonymous Coward | about 4 months ago | (#47207693)

You are missing something. Every aspect of life that isn't World of Warcraft.

Re:$3,000?? (4, Insightful)

Luckyo (1726890) | about 4 months ago | (#47207789)

They don't. What they need this for is ghetto floating point development hardware. This is cheap by those standards and offers far more precision than consumer grade GPUs.

Re:$3,000?? (1)

msauve (701917) | about 4 months ago | (#47207841)

WTF is "ghetto floating point development hardware," and what is it used for that makes this cost effective?

Re:$3,000?? (0)

Anonymous Coward | about 4 months ago | (#47207913)

No ECC memory and a fraction of the price. It's used for the same thing that any compute card is used for, except that without ECC memory, you may not want to rely on it for your final simulations, so it's better suited as a low(er) cost development platform for such applications.

Re:$3,000?? (0)

Anonymous Coward | about 4 months ago | (#47207943)

Luckyo means a GP-GPU compute card (think CUDA or OpenCL) with good fp64 performance. Titans handle fp64 computations about 4 times faster than the gaming cards based on the same cores.

These would be used for things like fluid dynamics simulations or protein folding. They're also used by SETI@home and Einstein@Home for signal analysis.
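A CPU-side sketch of why fp64 matters for that kind of long-running number crunching (nothing GPU-specific here; single precision is emulated by rounding every intermediate result to 32 bits):

```python
import struct

def to_f32(x):
    # Round a Python float (which is fp64) to the nearest fp32 value.
    return struct.unpack('f', struct.pack('f', x))[0]

# Accumulate a small time step a million times, as a long-running
# simulation would. The exact total should be 1000.0.
total64 = 0.0
total32 = 0.0
for _ in range(1_000_000):
    total64 += 1e-3
    total32 = to_f32(total32 + to_f32(1e-3))

# total64 stays within rounding noise of 1000; total32 drifts,
# and the drift compounds the longer the simulation runs.
print(total64, total32)
```

That accumulated drift is why simulation workloads want native fp64 (and, for final runs, ECC memory) rather than gaming-grade single precision.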

Re:$3,000?? (0)

Anonymous Coward | about 4 months ago | (#47208185)

Titans handle fp64 computations about 4 times faster than the gaming cards based on the same cores.

That's suggesting it's merely a software lock in the driver, which it is not. Unlike AMD hardware that performs fp64 math on fp32 hardware at significantly reduced performance, nVidia actually uses native fp64 cores. Gaming dies get almost all fp32 cores. Compute and workstation dies, including the Titans, get a much higher percentage of fp64 cores. They are physically different hardware. These aren't the old days when you could reflash your Geforce card into Quadro.

Re:$3,000?? (0)

Anonymous Coward | about 4 months ago | (#47208697)

That's suggesting it's merely a software lock in the driver, which it is not.

Which is why you can't switch a Titan to 1/16 fp64 mode with a checkbox in the driver and get significantly increased clocks and gaming performance. Oh, wait.

Unlike AMD hardware that performs fp64 math on fp32 hardware at significantly reduced performance, nVidia actually uses native fp64 cores.

What. That's not even wrong.

Gaming dies get almost all fp32 cores.

Yes, by disabling 3/4 of the fp64 cores.

Compute and workstation dies, including the Titans, get a much higher percentage of fp64 cores.

Yes, because they don't get them disabled.

They are physically different hardware.

If you count "one chip has a specific config fuse blown, the other doesn't." as different, then... yes.

These aren't the old days when you could reflash your Geforce card into Quadro.

Yes. See above as to why.

Re:$3,000?? (2)

Luckyo (1726890) | about 4 months ago | (#47208135)

Double precision floating point hardware, designed to do things like physics simulation. This thing has no ECC and some other similar tradeoffs, so it's fairly cheap at only 3k.

Here's an example of a non-ghetto version: http://h30094.www3.hp.com/prod... [hp.com]

Re:$3,000?? (1)

msauve (701917) | about 4 months ago | (#47208457)

Oh, so not "development hardware," but a cheaper alternative to other FP computation options.

Re:$3,000?? (0)

Anonymous Coward | about 4 months ago | (#47209619)

Two Titan Blacks in SLI are faster and cheaper than this. Only idiots would buy the Titan Z.

Re:$3,000?? (1)

Anonymous Coward | about 4 months ago | (#47207793)

You buy it because you accidentally set Crysis to maximum quality, and now you can't change it back because on your cheap $400 card your mouse is moving about 1 pixel an hour.

Re:$3,000?? (1)

UnknownSoldier (67820) | about 4 months ago | (#47209229)

You're missing out on 120 fps along with strobing, especially in indie games like Path of Exile or Minecraft, where the games haven't been properly optimized.

An nVidia GTX 780 Ti offers the best performance at $700, without breaking the bank.

3000? (1, Informative)

geekoid (135745) | about 4 months ago | (#47207677)

Gamers spending 3000 on a video card aren't overly burdened with intelligence.

Re:3000? (2)

synapse7 (1075571) | about 4 months ago | (#47208289)

Man I wish I could be that dumb.

Re:3000? (0)

Anonymous Coward | about 4 months ago | (#47209525)

...Or very intelligent, so they've become wealthy and don't need to limit spending on their hobbies anymore.

Wrong tests (5, Insightful)

gman003 (1693318) | about 4 months ago | (#47207691)

The Titan shouldn't be considered a top-end gaming card. It should be treated as a budget Tesla card - even at $3k, it's the cheapest card in Nvidia's lineup with full double-precision floating point performance (which no game uses, but is common for scientific computing, Tesla's market). And on tests using that, the single-gpu Titan and Titan Black outperform the 295X2 by a large amount [anandtech.com] . AT hasn't gotten to test a Titan Z yet, but you can tell it's going to wipe the floor with the 295X2.

Yes, Nvidia advertised the original Titan as a super-gaming card, and to be fair it was their top-performing gaming card for a while. But once the 780 Ti came out, that was over, and since everyone expects a 790 dual-GPU gaming card to be announced soon, buying any Titan for gaming is a fool's choice.

Nvidia seems to still be advertising it as a top-end gaming card, presumably trying to prove the old adage about fools and their money. It just comes off as a scam to me, but anyone willing to spend over a grand without doing some proper research probably deserves to be ripped off.

Re:Wrong tests (3, Informative)

sshir (623215) | about 4 months ago | (#47207867)

The result of Nvidia crippling DP floating-point performance on mainstream graphics cards is that people started looking for ways around this bullshit.

Case in point: linear algebra libraries (around 80% of scientific computing). Basically, people are modifying algorithms so that the bulk of the computation is done in single precision and then cleaned up in double. Those mixed-mode algorithms often outperform pure DP ones even on non-crippled cards (for example, the MAGMA library).

People don't like to be screwed with...
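The trick being described is classic iterative refinement: factor and solve in single precision, then polish the answer with double-precision residuals. Here's a minimal NumPy sketch of the idea (illustrative only; MAGMA's real implementation reuses the LU factors across iterations and is far more sophisticated):

```python
import numpy as np

def mixed_precision_solve(A, b, iters=5):
    """Solve Ax = b: solve in float32, refine the residual in float64.

    Sketch of mixed-precision iterative refinement. A real library would
    factor A once and reuse the factors instead of re-solving each pass.
    """
    A32 = A.astype(np.float32)
    # Initial cheap solve entirely in single precision
    x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
    for _ in range(iters):
        r = b - A @ x                                   # residual in double
        d = np.linalg.solve(A32, r.astype(np.float32))  # correction in single
        x += d.astype(np.float64)                       # accumulate in double
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 200
    A = rng.standard_normal((n, n)) + n * np.eye(n)  # well-conditioned system
    b = rng.standard_normal(n)
    x = mixed_precision_solve(A, b)
    print(np.linalg.norm(A @ x - b))  # residual shrinks toward double precision
```

On hardware where fp32 throughput is several times fp64 throughput, the bulk of the work above (the solves) runs at the fast rate, while only the cheap residual updates pay the fp64 cost.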

Re:Wrong tests (0)

Anonymous Coward | about 4 months ago | (#47208139)

Result of Nvidia's crippling DP floating point performance on mainstream graphic cards

nVidia didn't cripple anything. nVidia decided that if they were going to support decent double precision performance on their compute platforms, they were going to need to do double precision natively. If they're doing double precision natively, there's no reason to also support it using the clumsy, multi-step, "long" method traditionally done using single precision units. Expensive compute dies that are actually going to use double precision math get a whole bunch of double precision cores. Cheap gaming cards that have no reason to do double precision math get a bunch of single precision cores, and only a handful of double precision ones, as the less capable cores keep cost, power consumption, and temperatures down.

You're just pissed off because their architecture choices mean they do not produce a product with the right performance and price point for you to use in your pointless bitcoin mining.

Re:Wrong tests (2)

Nemyst (1383049) | about 4 months ago | (#47208275)

Wait, in essence you're saying that by leveraging single-precision (which is still three times faster than double-precision even for Nvidia's compute cards) computations, libraries have been able to increase performance without compromising the quality of the results. How is that a bad thing, or people "getting screwed with"?

Re:Wrong tests (2)

EvilSS (557649) | about 4 months ago | (#47207967)

The Titan shouldn't be considered a top-end gaming card. It should be treated as a budget Tesla card - even at $3k, it's the cheapest card in Nvidia's lineup with full double-precision floating point performance (which no game uses, but is common for scientific computing, Tesla's market).

This. For gaming there is virtually no difference between a 780 Ti (~$700) and a Titan Black (~$1000). They look identical on gaming benchmarks. I imagine a pair of 780 Tis in SLI would outperform the Titan Z when it comes to gaming (the Titan Z is underclocked compared to the Titan Black), and for less than half the price.

The difference is the unlocked floating-point capability and added VRAM. The Titans are for number crunching; in compute workloads the Titan Z crushes the AMD R9 295X2. Well, that and gamers looking for epeen cred.

Re:Wrong tests (0)

Anonymous Coward | about 4 months ago | (#47208273)

The difference is the unlocked floating point capability

It's not "unlocked capability". You cannot mod a 780Ti for improved double precision performance. They're physically different hardware, with different counts of SP and DP cores.

Re:Wrong tests (0)

Anonymous Coward | about 4 months ago | (#47207997)

The problem is that if you were doing scientific computing, you are unlikely to pick any of the Titan cards. They actually come in at fairly similar pricing to the Teslas, so far from being budget cards, the Titans are really very poorly suited for such a job. They aren't made to be stacked in large clusters, they don't have ECC memory, they lack much of the interconnect support a Tesla would have, and they aren't made to run at very high utilization for long periods of time. These cards are definitely NOT meant for scientific computing. Buying a Titan for scientific computing would be the FOOL's choice. These are made for game development, or, if you have an unlimited budget, maybe a gaming rig for bragging rights.

Re:Wrong tests (0)

Anonymous Coward | about 4 months ago | (#47209123)

I think the reasoning for NVIDIA marketing this as a gaming card is the support they will put behind it. It's still a card built for computing rather than gaming, but it will receive the level of support associated with their gaming products, not their Tesla or Quadro lineups. This allows those who need the compute-focused cards but can't afford a Tesla or Quadro to get the power and features they need, just without the added support that comes with a Tesla or Quadro. Those who can afford it will most likely still buy Tesla and Quadro for the support, and NVIDIA can pick up a few more sales from those who can't afford the cards with the added support costs factored in.

64Bit floating point and compute mode (3, Interesting)

Bleek II (878455) | about 4 months ago | (#47207707)

The Titan line is not a purely gaming GPU! The higher price comes from leveraging its GPGPU CUDA technology. It's like buying server hardware and complaining it doesn't run your games as well as an i7 that costs less. Game enthusiasts always ruin hardware news with their one golden spec, the frames per second! "That said it’s clear from NVIDIA’s presentations and discussions with the company that they intend it to be a compute product first and foremost (a fate similar to GTX Titan Black), in which case this is going to be the single most powerful CUDA card NVIDIA has ever released. NVIDIA’s Kepler compute products have been received very well by buyers so far, including the previous Titan cards, so there’s ample evidence that this will continue with GTX Titan Z. At the end of the day the roughly 2.66 TFLOPS of double precision performance on a single card (more than some low-end supercomputers, we hear) is going to be a big deal, especially for users invested in NVIDIA’s CUDA ecosystem." - AnandTech
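For anyone wanting to sanity-check that 2.66 TFLOPS figure: theoretical peak DP throughput is just DP cores × 2 FLOPs per fused multiply-add × clock. The core count and clock below are the commonly reported Titan Z specs, assumed here rather than taken from TFA:

```python
# Back-of-the-envelope check of the ~2.66 TFLOPS double-precision figure.
# Assumed specs (commonly reported, not from the article): 5760 CUDA cores
# across two GK110 GPUs, FP64 at 1/3 the FP32 rate, base clock ~705 MHz.
fp32_cores = 5760
fp64_cores = fp32_cores // 3     # 1/3-rate double precision -> 1920 cores
base_clock_ghz = 0.705
flops_per_core_per_cycle = 2     # one fused multiply-add counts as 2 FLOPs

peak_dp_tflops = fp64_cores * flops_per_core_per_cycle * base_clock_ghz / 1000
print(f"{peak_dp_tflops:.2f} TFLOPS")  # → 2.71 TFLOPS, in the ballpark of 2.66
```

The small gap to the quoted 2.66 presumably comes from NVIDIA rating the figure at a slightly lower clock; either way the order of magnitude checks out.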

Re:64Bit floating point and compute mode (0)

Anonymous Coward | about 4 months ago | (#47207977)

Don't know if you'll read this, but FPS is no longer what gamers are looking for. We generally look for latency and consistency in the timing of the frames produced over the raw FPS figure.

Besides, if game engines start leveraging DirectCompute/CUDA/OpenCL capabilities more (e.g., using CUDA 5's unified memory capability to produce particles that are more than just eye candy, but actually feed back into gameplay), the compute performance will matter just as much.

Re:64Bit floating point and compute mode (0)

Anonymous Coward | about 4 months ago | (#47209053)

Which I noticed that review was lacking. In fact, I flipped through all the pages expecting to see at least a quality comparison of the images and got absolutely none.

Re:64Bit floating point and compute mode (0)

Anonymous Coward | about 4 months ago | (#47209405)

Yet somehow you're still a sad virgin.

$3000 Bitcoin card (0, Flamebait)

Anonymous Coward | about 4 months ago | (#47207749)

Bitcoin has caused cards to go through the roof for mediocre performance. The gov needs to eradicate bitcoin asap so I can go back to gaming at an affordable price.

Re:$3000 Bitcoin card (2)

Luckyo (1726890) | about 4 months ago | (#47207797)

ASIC makers have done it for the government. GPU mining is a thing of the past.

Prices still haven't come down though.

Re:$3000 Bitcoin card (1)

perpenso (1613749) | about 4 months ago | (#47208091)

So the private sector solves the problem itself? :-)

Re:$3000 Bitcoin card (1)

Luckyo (1726890) | about 4 months ago | (#47208149)

They solved their problem (speed) without solving the problem that it caused for all the bystanders (price).

So yes, it's a pretty typical private sector solution that doesn't fix any of the problems the original product caused for the wider audience, forcing customers to pick up the tab.

Re:$3000 Bitcoin card (0)

Anonymous Coward | about 4 months ago | (#47208293)

I don't think you can get much better performance than this [staples.ca] for only $52 CAD.

Re:$3000 Bitcoin card (1)

LordLimecat (1103839) | about 4 months ago | (#47208373)

TIL: Supply and demand is a "problem" that needs to be "solved".

Frame rate is the wrong metric (1)

mysidia (191772) | about 4 months ago | (#47208331)

It's scrypt hashes per second per watt of energy consumed, and scrypt hashes per second per dollar of GPU.
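For concreteness, here's what ranking cards by those two metrics looks like. The kH/s, wattage, and price figures below are made-up examples, not measurements of any real card:

```python
# Compare GPUs by the two mining metrics named above:
# hashes/s per watt (efficiency) and hashes/s per dollar (capital cost).
# All figures are hypothetical, purely for illustration.
cards = {
    # name: (scrypt kH/s, watts under load, price in USD)
    "card_a": (900.0, 500.0, 1499.0),
    "card_b": (800.0, 375.0, 2999.0),
}

for name, (khs, watts, price) in cards.items():
    per_watt = khs / watts      # kH/s per watt
    per_dollar = khs / price    # kH/s per dollar
    print(f"{name}: {per_watt:.2f} kH/s/W, {per_dollar:.3f} kH/s/$")
```

By these metrics a card with a lower raw hash rate can still win outright, which is exactly why frame rate is the wrong yardstick here.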

Jack of all tirades (0)

Anonymous Coward | about 4 months ago | (#47208509)

NVidia is spreading themselves too thin. They simply don't generate enough Tegra business to justify their SoC R&D. Unless they can convince Nintendo to go software-only and license their games to the NVidia Shield exclusively, they will never truly turn a profit in the SoC business. The fact that their compute GPUs are pulling double duty as flagship gaming parts, diminishing their GPGPU margins by competing with themselves, is yet another proof point that they are treading uphill on a slippery slope.

Should we start worrying about nVidia? (1)

sshir (623215) | about 4 months ago | (#47208571)

I know that this is not a purely gaming card bla bla bla. But here is another data point: in this month's graphics card review at Tom's Hardware, AMD totally dominated... in all categories. I mean a clean sweep! What's going on? Or is it just bad timing?

Re:Should we start worrying about nVidia? (1)

Barny (103770) | about 4 months ago | (#47209117)

Link? The only benchmark lists I could find there only tested FPS. And we all know that such tests are quite silly now as quality and to some extent latency are important these days.

Serious question (0)

Anonymous Coward | about 4 months ago | (#47208815)

Why are they comparing a GPGPU card to a rendering/gaming card?

Nvidia (0)

Anonymous Coward | about 4 months ago | (#47209271)

The only reason I stick with Nvidia is that their cards, for whatever it is they do, tend to age far, far more gracefully than ATI's. If you go all the way back to, say, an 8800 series, it definitely won't run the highest settings, but it will at least run the game; an ATI card from that generation may not even load.
