
NVIDIA GeForce GTX TITAN 6 GB

This card is just a joke...I can see the promotional campaigns already:

GTX Turd of the Titan...one year late, overpriced at $1000, does not perform like a $1000 GPU, does not use the quality materials of a $1000 GPU (it skips the magnesium alloy of the 690 because that was "too expensive") and cannot even run future console ports with 6GB of VRAM, since the PS4 will use 8GB of GDDR5. A gigantic fucking failure is what this card is. Judging by this release, I will most likely skip the Kepler GPUs entirely at this rate.
 
This card is just a joke...I can see the promotional campaigns already: GTX Turd of the Titan...one year late, overpriced at $1000... [snip]

That's just too good a rant. :laugh:

I think the card is really good, but it is very late and ridiculously overpriced. If I was in the market for a card in this price range, I'd buy the 690 as it handily outperforms it. SLI issues be damned.

As it is, I just got myself an overclocked GTX 590 for the same price as a 680, which outperforms it and isn't all that far behind a Titan.
 
This card is just a joke...I can see the promotional campaigns already: GTX Turd of the Titan...one year late, overpriced at $1000... [snip]

not sure if serious.....
 
not sure if serious.....


I'm dead serious. Last gen consoles had what, 256MB (PS3) and 512MB (360) of VRAM? Now show me a graphics card that came out before the 360 and the PS3 with less VRAM than the consoles that can RUN any of the recent console ports, let alone PLAY them at anywhere near the same level of visual quality...oh that's right, you can't, because console ports are insanely bloated by the time they hit the PC, and because those cards are physically incapable of it -- they don't have enough VRAM, and they lack an API on par with what the consoles have, DX10 (DX10 cards did not hit the market until long after the consoles were released). I know it's usually a silly idea to play the waiting game with hardware releases, but with Nvidia's current GPU pricing, it has never made more sense to wait than now.

P.S. W1zzard, you may want to change the review a bit in regards to the magnesium alloy, as this card doesn't use it (see here; even the guy from Nvidia confirmed it in this live launch video -- skip to 07:00).
 
inb4
please rewrite Your post in at least a bit more structured manner. Because I have a very hard time grasping the point You are trying to make due to Your post being a complete mess.
Thank You.
 
Hey how come the Titan and 690 don't have bumpers like the 480 did?

I'm dead serious. Last gen consoles had what, 256MB (PS3) and 512MB (360) of VRAM? Now show me a graphics card that came out before the 360 and the PS3 with less than 256MB of VRAM that can run any of the recent console ports, let alone at anywhere near the same level of visual quality...oh that's right, you can't, because console ports are insanely bloated by the time they hit the PC. And because they are physically incapable of doing so due to not having DX10 (as these cards did not hit the market until long after the consoles were released).

That RAM isn't dedicated to the GPU. It may even end up being hard partitioned like the PS3's. If the new wave of consoles were just game dedicated, they would have been in the 2-3 GB range. The RAM quantity is there to support heavy PC-like multitasking and bloat. Even the richest next gen games will fall well within the Titan's VRAM limits.
 
That RAM isn't dedicated to the GPU. It may even end up being hard partitioned like the PS3's. If the new wave of consoles were just game dedicated, they would have been in the 2-3 GB range. The RAM quantity is there to support heavy PC-like multitasking and bloat. Even the richest next gen games will fall well within the Titan's VRAM limits.

While that is debatable at this time, I can almost guarantee you that it will be pretty much entirely dedicated to the GPU, because the only reason system RAM existed in game consoles was to:

A. Run the operating system/graphical user interface, OR
B. Serve as cache/shared VRAM when the GPU ran out of dedicated VRAM, OR
C. Pass data from the hard/optical drive to the GPU's VRAM.

By using a tonne of fast VRAM and sharing it between the processor and the graphics chip, they've removed the need for separate system RAM. That eliminates options B and C above, which means the only thing it will serve is the graphical user interface and the OS; the rest will be entirely free for the graphics chip. The operating systems in consoles use about 1/20th of the memory resources of a desktop PC, if not much less (I read somewhere that the PS3 uses about 8MB of RAM for the OS when gaming -- that's about 1/100th of the memory usage of Windows 7 at idle). Therefore you will need a graphics card with at least the same amount of VRAM to even stand a chance, regardless of the Titan's GPU performance.
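The back-of-the-envelope math behind this argument can be sketched out. To be clear, every figure below is an assumption taken from this thread (the rumored 8GB unified pool, a small console OS reserve), not a confirmed spec, and `os_reserve_gb` is a made-up placeholder:

```python
# Rough memory-budget sketch for the unified-memory argument above.
# All figures are this thread's assumptions, NOT confirmed console specs.

def vram_available_to_games(total_unified_gb, os_reserve_gb):
    """Unified memory left for games once the console OS takes its cut."""
    return total_unified_gb - os_reserve_gb

# Assumption from the thread: 8 GB GDDR5 unified, tiny console OS footprint.
console_budget = vram_available_to_games(total_unified_gb=8.0, os_reserve_gb=0.5)
titan_vram_gb = 6.0

print(f"Console memory free for games: {console_budget} GB")   # 7.5 GB
print(f"Titan VRAM: {titan_vram_gb} GB")                        # 6.0 GB
print(f"Gap if a port treated it all as VRAM: {console_budget - titan_vram_gb} GB")
```

Whether a port would ever actually treat the whole pool as VRAM is exactly what the rest of this thread argues about.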
 
Games use a lot of memory for other assets -- sounds, AI, the physics engine, and plenty of other things that are NOT graphics related. Thus saying all the usage is from graphics is false.
So when speaking of console ports, You have to keep in mind that the memory a game uses is going to be distributed between the vRAM and system RAM. Definitely not a 1:1 split, but really, at least a gigabyte of a game's assets would have absolutely zero reason to be put in the vRAM.

So stop implying a GPU needs to have as much dedicated vRAM as a console has in total just to be able to run them.
 
Therefore you will need a graphics card with at least the same amount of VRAM to even stand a chance, regardless of the Titan's GPU performance.
Sounds unlikely in the extreme. A console's spec needs to carry it through its entire life cycle -- you either over-engineer or you don't create enough separation from the previous consoles.
If you are under the impression that 6GB of GDDR5 will fall short of the requirements for the next gen of games, then every single gaming card built thus far becomes obsolete overnight...and do you really think that either vendor, or the game developers, would welcome that particular user-base backlash?
 
I think people are forgetting that the PS4 has 8GB of shared GDDR5. I bet you that in most cases the PS4 will initially use significantly less of that memory for VRAM, and the split will shift as the platform ages and more games come out for it.
 
I think people are forgetting that the PS4 has 8GB of shared GDDR5. I bet you that in most cases the PS4 will initially use significantly less of that memory for VRAM, and the split will shift as the platform ages and more games come out for it.

That was exactly my point!
 
Games use a lot of memory for other assets - sounds, AI, physics engine, and a lot of other stuff in a game uses up a lot of memory which are NOT graphics related.

Nonsense. That "lot" of memory you're talking about is nothing compared to the amount taken up by textures, models, objects, etc.

Thus saying all the usage is from the graphics is false.

I never said it was going to be all of it; what I said was it is going to be the vast majority (90-95% of it). Even with your supposed 1GB dedicated to physics, sound and whatever else, that still leaves 7GB for graphics -- 1GB more than what you get in the Titan.

So stop implying a GPU needs to have as much dedicated vRAM as a console has in total just to be able to run them.

That's because it DOES -- maybe not to merely run the game, but definitely to get anywhere near the same image quality and performance; every previous console launch proves this. Stop trying to imply otherwise.


Sounds unlikely in the extreme. A console's spec needs to carry it through its entire life cycle -- you either over-engineer or you don't create enough separation from the previous consoles.
If you are under the impression that 6GB of GDDR5 will fall short of the requirements for the next gen of games, then every single gaming card built thus far becomes obsolete overnight...and do you really think that either vendor, or the game developers, would welcome that particular user-base backlash?

What user backlash? And what do you mean by "obsolete"? Yes, a console's spec needs to carry it through a life cycle of 5-10 years...and remember, a PC graphics card's specs only need to carry it for about a year or less until the next generation hits.

Remember last console generation around this time, when the 7800GTX dropped and Far Cry was the ultimate benchmark at the time, for which it was overkill? Think of Far Cry (or Doom 3, both of the games were THE benchmarks back then IIRC) as today's Crysis 3.

We're on the 600 series now, which is the equivalent of last console generation's 6000 series Nvidia GPUs, and the Titan is the equivalent of the 7800GTX. All it means is that your currently proclaimed "overkill" cards that are running current gen games across 1-3 massive monitors will change to being "good enough" graphics cards at running next gen ports at 1080p, with some graphical settings tweaked (up or down, depending on how demanding the game will initially be on the consoles). There will be no "backlash", the same way there wasn't one when they released a 512MB version of the 7800GTX a few months after the 256MB version hit (which I'm betting is what will happen with the GTX 780 or whatever they will call their fully functioning GK110 follow up to the Titan with more VRAM) and the same way the 8000 series eventually butchered every 7000 series GPU.
 
Nonsense. That "lot" of memory you're talking about is nothing compared to the amount taken up by the textures, models, objects etc.

Only recently have games started to go 64-bit. A lot of games are still 32-bit and carry the 32-bit memory limitation, so your "lot" is usually still constrained to 2GB.
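For reference, that 2GB figure is just 32-bit address-space arithmetic (Windows reserves half of the 4GB virtual address space for the kernel by default, unless a game is built large-address-aware); a quick sketch:

```python
# Address-space arithmetic behind the "2 GB limit" on 32-bit games.
pointer_bits = 32
total_address_space = 2 ** pointer_bits          # 4 GiB of virtual addresses
user_space_default = total_address_space // 2    # Windows default: half goes to the kernel

GIB = 1024 ** 3
print(total_address_space // GIB)  # 4
print(user_space_default // GIB)   # 2
```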
I never said it was going to be all of it; what I said was it is going to be the vast majority (90-95% of it). Even with your supposed 1GB dedicated to physics, sound and whatever else, that still leaves 7GB for graphics -- 1GB more than what you get in the Titan.

This statement is confusing. Could you elaborate on this?
That's because it DOES -- maybe not to merely run the game, but definitely to get anywhere near the same image quality and performance; every previous console launch proves this. Stop trying to imply otherwise.

No it doesn't. You actually have it backwards. If anything, the PC needs more resources because there is added overhead from everything else the computer is running. I'm willing to bet that your installation of Windows is many times larger than the PS4's OS will be, because of what each is designed for. VRAM also doesn't determine image quality. There are cases where you drop the quality to get a game to run better, but that's not strictly a function of memory, and implying that it is, quite frankly, is wrong; you're spreading false information.

What user backlash? And what do you mean by "obsolete"? Yes, a console's spec needs to carry it through a life cycle of 5-10 years...and remember, a PC graphics card's specs only need to carry it for about a year or less until the next generation hits.

More or less. I've had my first 6870 for several years now and only upgraded to CrossFire because I could. Most games (even Far Cry 3) run pretty well on it. So I wouldn't go so far as to say that video cards only last a year.
All it means is that your currently proclaimed "overkill" cards that are running current gen games across 1-3 massive monitors will change to being "good enough" graphics cards at running next gen ports at 1080p, with some graphical settings tweaked (up or down, depending on how demanding the game will initially be on the consoles).

That's not the problem. The problem is that, for the added shaders and components in the Titan, its performance is underwhelming for a price tag of $1000 USD. The Titan is not overkill because in reality it's not incredibly faster than the 7970 or the 680. The only thing that is overkill about the Titan is the price tag. Otherwise it looks like a damn fine piece of hardware.
 
Only recently have games started to go 64-bit. A lot of games are still 32-bit and carry the 32-bit memory limitation, so your "lot" is usually still constrained to 2GB.

Of which, about 1980MB is used for graphics. Vinska was trying to suggest that physics and sound engines have suddenly taken over our RAM, which couldn't be further from the truth. The progression of in-game physics and sound has been baby steps compared to the graphical leaps in games -- sound design has been at a standstill since around 2005, or whenever the last hurrah for EAX happened (which consoles don't use anyway), and physics have barely advanced, since almost all games use pre-calculated physics effects for the most impressive/demanding parts for easy porting between platforms.

This statement is confusing. Could you elaborate on this?

See above.

No it doesn't. You actually have it backwards. If anything, the PC needs more resources because there is added overhead from everything else the computer is running. I'm willing to bet that your installation of Windows is many times larger than the PS4's OS will be, because of what each is designed for. VRAM also doesn't determine image quality. There are cases where you drop the quality to get a game to run better, but that's not strictly a function of memory, and implying that it is, quite frankly, is wrong; you're spreading false information.

I have no idea what you're arguing about here. I specifically stated that current GPUs need at LEAST the same amount of VRAM to run next gen console ports to a good standard, or at least at a level comparable to the consoles. I think you read my comment backwards or something.

And YOU'RE the one seemingly spreading false information -- the fact that consoles, as you and every other PC enthusiast believe, provide a "good enough" baseline of performance compared to desktops (while running at inferior image quality), together with the driver overhead you mentioned earlier, only proves my comment 100% right.

More or less. I've had my first 6870 for several years now and only upgraded to CrossFire because I could. Most games (even Far Cry 3) run pretty well on it. So I wouldn't go so far as to say that video cards only last a year.

That has nothing to do with what I said. What I said was that a PC graphics card's performance lifespan is only relevant for the games released around its time or before its replacement arrives, unless the developer is specifically aiming at lower tier PC hardware. Normally, nobody rages because their top of the line GPU has been superseded by something faster and better -- it comes with the hobby.

That's not the problem. The problem is that, for the added shaders and components in the Titan, its performance is underwhelming for a price tag of $1000 USD. The Titan is not overkill because in reality it's not incredibly faster than the 7970 or the 680. The only thing that is overkill about the Titan is the price tag. Otherwise it looks like a damn fine piece of hardware.

I never disputed that. All I said was that even though it is quite a bit more powerful than what the consoles are packing, the 6GB of VRAM will limit its performance in next gen titles, judging by how badly most mainstream games are ported nowadays.
 
Of which, about 1980MB is used for graphics. Vinska was trying to suggest that physics and sound engines have suddenly taken over our RAM, which couldn't be further from the truth.

Should I show You a dump of a typical game's memory map, with annotations of what data goes where?
There is A LOT of data a game needs to use that is not directly related to graphics.
 