# NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"



## btarunr (Jan 21, 2013)

2013 started off on a rather dull note for the PC graphics industry. NVIDIA launched its game console platform "Project: Shield," while AMD rebranded its eons-old GPUs as the Radeon HD 8000M series. Apparently, that could all change in late February, with the arrival of a new high-end single-GPU graphics card based on NVIDIA's GK110 silicon, the same big chip that goes into the company's Tesla K20 compute accelerator. 

NVIDIA has drawn some flak for extending its "GTX" brand too far into the mainstream and entry-level segments, and wants its GK110-based card to stand out. It is reported that NVIDIA will carve out a new brand extension, GeForce Titan. Incidentally, the current fastest supercomputer in the world bears that name (Cray Titan, located at Oak Ridge National Laboratory). The GK110 silicon physically packs 15 SMX units totaling 2,880 CUDA cores, and features a 384-bit wide GDDR5 memory interface.
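As a sanity check, the SMX arithmetic behind those figures works out like this (a quick sketch; 192 FP32 cores per SMX is Kepler GK110's published configuration):

```python
# GK110: each Kepler SMX contains 192 FP32 CUDA cores.
CORES_PER_SMX = 192

def cuda_cores(smx_units):
    """Total CUDA cores for a GK110 part with this many active SMX units."""
    return smx_units * CORES_PER_SMX

print(cuda_cores(15))  # full die: 2880
print(cuda_cores(14))  # Tesla K20X config (one SMX disabled): 2688
```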





*View at TechPowerUp Main Site*


----------



## Naito (Jan 21, 2013)

A possible late-February launch? Very interesting indeed. 

Geforce Titan doesn't quite roll with me. Nvidia has lost the plot even further with their naming conventions.

The rumour of only 14 of 15 shader clusters being enabled is a little disappointing considering how long they have been producing Kepler chips, but I guess it is a much larger core. Is this going to remain branded as 600 series (not mentioned in the article)? If so, the core may be hitting a performance range that wouldn't slot in well with future 700-series performance targets, hence trimming an SMX unit off.

Edit: Maybe it will slot in near the GTX 690 or even supersede it? The GTX 690's 3,072 shaders vs. GK110's 2,688 shaders, without the possible SLI overhead? If not, it might slot in just under, so as not to annoy the people who bought the GTX 690.

Edit 2: Checked the original article. It claims 85% of the performance of the GTX 690. Possibility of a limited-edition card (considering the naming)? "Partner companies must comply with Nvidia's reference design to the letter and may not even put their stickers on graphics cards."


----------



## tastegw (Jan 21, 2013)

Cores: 1536 -> 2688
Interface: 256-bit -> 384-bit
RAM: 2/4 GB -> 6 GB (overkill, but nice)
I think I'll take two.

So this isn't the 780?


----------



## DarkOCean (Jan 21, 2013)

"The same sources claim that GeForce Titanium will be released in late February at a suggested retail price of *899 USD*." Holy Jesus.


----------



## sc (Jan 21, 2013)

Time to put my 690 up for sale. Damn consumerism...


----------



## Optimis0r (Jan 21, 2013)

Not an average consumer gaming graphics card then.


----------



## kniaugaudiskis (Jan 21, 2013)

Well, with such a price tag and a non-GTX branding this card may be a limited production enthusiasts' dream.


----------



## Fluffmeister (Jan 21, 2013)




----------



## Shihab (Jan 21, 2013)

Anyone interested in a cheap Kidney?


----------



## qubit (Jan 21, 2013)

I'll bet it will perform similarly to a GTX 690 and be priced extortionately, between £800 and £1,000.


----------



## 1c3d0g (Jan 21, 2013)

Awesome!  This fucker will be an incredible asset for us BOINC/Folding@Home enthusiasts, enabling us to achieve even better results for battling nasty diseases, researching black holes/entire galaxies, understanding vague physics processes and more!


----------



## Samskip (Jan 21, 2013)

If it really packs 6 GB of RAM, it would be an awesome card for driving 4K displays.


----------



## Filiprino (Jan 21, 2013)

Whoa, 6 GB of GDDR5 is great. A TDP of 235 W isn't.
At least make it SLI-friendly, taking up only one slot's worth of connectors instead of two. That way you can use a waterblock and keep one more PCIe slot on your motherboard available.


----------



## dj-electric (Jan 21, 2013)

Waiting for the SuperFunTime I'll have with this card.


----------



## the54thvoid (Jan 21, 2013)

sc said:


> Time to put my 690 up for sale. Damn consumerism...



Why?



Naito said:


> ...Checked original article. Claims 85% of the performance of the GTX 690



Lose performance....


----------



## Fluffmeister (Jan 21, 2013)

the54thvoid said:


> Why?
> 
> 
> 
> Lose performance....



A single GPU is always going to be preferable. Sure, the 690 is a wonderful piece of kit, but it still relies on decent SLI profiles to perform at its best, and even then not every title scales that well.

Throw in some extra functionality while still packing the same number of CUDA cores as TWO 670s, and I'd think you'd be onto a winner.


----------



## BigMack70 (Jan 21, 2013)

Hmmmmmmmmmm... depending on what actually happens with price/performance I could see myself selling off my 7970 Lightnings and going for one of these bad boys... If I can get anywhere near 7970 CF performance from a single GPU, I'm in.

Multi-GPU is a hassle comparatively.


----------



## _JP_ (Jan 21, 2013)

*I've got a biiig pair of Titans!! Wanna see them?*

But can it run Battlefield 4 @ 2560x1600 & 16xAF & 4xAA & Ultra settings?


----------



## blibba (Jan 21, 2013)

Given the increased memory bandwidth, this might only manage 85% of the throughput of the 690, but I bet it'll have lower 99th percentile frame times.
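For readers unfamiliar with the metric: the 99th-percentile frame time is the threshold under which 99% of frames complete, so a few long frames (stutter) hurt it even when the average FPS looks fine. A minimal sketch (nearest-rank method, made-up sample data):

```python
def percentile_99(frame_times_ms):
    """99th-percentile frame time (nearest-rank method)."""
    ordered = sorted(frame_times_ms)
    rank = max(0, round(0.99 * len(ordered)) - 1)  # index of the 99th-percentile sample
    return ordered[rank]

# Hypothetical capture: mostly smooth 16 ms frames with two 40 ms spikes.
capture = [16.0] * 98 + [40.0] * 2
print(percentile_99(capture))  # 40.0 -> the spikes dominate the percentile
```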


----------



## sc (Jan 21, 2013)

the54thvoid said:


> Why?
> Lose performance....



Because... there will be a dual GK110 board.


----------



## Kaynar (Jan 21, 2013)

What is the broad estimate of a release date for these cards? Easter? Summer? Autumn?


----------



## the54thvoid (Jan 21, 2013)

Google translated from Sweclockers:



> When NVIDIA released the GeForce GTX 680 in March 2012, it was clear the new flagship was not based on the full version of the Kepler GPU (GK110). Instead, a cut-down version (GK104) was used to hold down manufacturing costs. Now, almost a year later, the full-fledged Kepler is heading to a GeForce in the consumer class.
> Multiple independent sources tell SweClockers that GK110 appears in the GeForce Titan, an upcoming high-end graphics card. The name alludes to the world's fastest supercomputer, Cray Titan at Oak Ridge National Laboratory in the USA, which is built from 18,688 Nvidia Tesla K20X boards based on that same GK110.
> 
> The Tesla K20X compute card is based on GK110 with 2,688 CUDA cores, a 384-bit memory bus, and 6 GB of GDDR5 memory. The circuit actually contains 2,880 units, but NVIDIA has disabled one cluster (SMX), presumably for production reasons. The clock frequencies also stop at a relatively low 732 MHz for the GPU and 5.2 GHz for the GDDR5.
> ...


----------



## Fluffmeister (Jan 21, 2013)

Sounds like it's gonna be another sexy reference design card using magnesium alloys and the like.

Bring it on.


----------



## ThE_MaD_ShOt (Jan 21, 2013)

900 bones is a little steep for most to justify. Wow


----------



## Shihab (Jan 21, 2013)

Filiprino said:


> Whoa, 6 GB of GDDR5 is great. A TDP of 235W isn't.
> At least make it SLI friendly with only 1 slot of connectors instead of two. In that way you can use a waterblock and have 1 more PCIe connector of your motherboard available.



Don't get greedy now: the ol' GF110 was rated at 244 W when fitted to a GTX 580, the current 680 is rated at 190-ish W, and the GTX 690 is given a 300 W TDP. 

235 W isn't much, IMO, if it packs the processing power it promises.


----------



## tastegw (Jan 21, 2013)

ThE_MaD_ShOt said:


> 900 bones is a little steep for most to justify. Wow



Cheaper than two 680s, but many still went that route.


----------



## T4C Fantasy (Jan 21, 2013)

it may just be a 780 Ti


----------



## Crap Daddy (Jan 21, 2013)

This is the higher end of high end so shut up and pay if you want the most powerful single GPU in the universe. That's the message here. Premium product from a premium brand. They can launch whenever they like at whatever price, I don't see AMD having anything close to GK110.


----------



## the54thvoid (Jan 21, 2013)

Well, if this is what Kepler was 'meant to be', then it explains a lot. I can't see AMD coming close to this, but then again, if the Sweclockers info is correct, that price for a single GPU is terrible.
Remember, this will be out 14 months after the 7970's release, and it looks like the old Nvidia technique of massive die size. One monolithic GPU to rule them all, as it were.

I wish we knew the whole truth about this card. Is this to the GK100 what the 580 was to the 480? Is this the card they couldn't make a year ago? Remember the initial rumours that Nvidia was going to learn from the past and release the lower-end cards first? Maybe it has taken this long to get it right (and they didn't have to rush, due in part to the initially low clocks and immature driver performance of the 7970).

This looks like the card I wanted a year and a half ago, but not at that price.  

You beat me to it:


Crap Daddy said:


> I don't see AMD having anything close to GK110.


----------



## T4C Fantasy (Jan 21, 2013)

Crap Daddy said:


> This is the higher end of high end so shut up and pay if you want the most powerful single GPU in the universe. That's the message here. Premium product from a premium brand. They can launch whenever they like at whatever price, I don't see AMD having anything close to GK110.



well the 7970 ghz edition still has more tflops than the k20x lol


----------



## Easy Rhino (Jan 21, 2013)

awesome. so now instead of running the most demanding PC games on max settings at 100FPS we can run them at 200 FPS.


----------



## Bytales (Jan 21, 2013)

*Please stop folding*



1c3d0g said:


> Awesome!  This fucker will be an incredible asset for us BOINC/Folding@Home enthusiasts, enabling us to achieve even better results for battling nasty diseases, researching black holes/entire galaxies, understanding vague physics processes and more!



Please spare me the battling disesase, curing cancer Phrase.
There are allready known means to cure each and every disease one can think of. The fact they are kept hidden from public, and you have no clue about it, thats another thing my friend.
One of these means is 5000 years old. Lookup Amaroli on google, check Fasting too.

The one ruling the world, who purposely try to kill as much of us as possible, dont want these knowledge loose, thats why they are not generaly accepted, nor reasearched, but they do work let me tell yah that.

And nobody needed 2880 CUDA GPU to come up with these cures let me tell yah that.

Bassicaly, these FOLDING at home sh.it, is using your resource, your electricity, your money, your work, for god knows what, that only they, the elite few will have in the end acces to.
So you, the poor user, gets nothing in the end, your money are beeing taken, and dont kid yourself, you wont be getting any cancer cure anytime soon.

Poor fools ! Imagine all these folding team, burning electricity, and besides a place in a highscore list, get nothing in return.


----------



## Fluffmeister (Jan 21, 2013)

T4C Fantasy said:


> well the 7970 ghz edition still has more tflops than the k20x lol



AMD cards have had higher theoretical performance numbers for a while now, the key word being theoretical.

Take the 5870 @ 2.72 TFLOPS vs. the GTX 480's measly 1.35 TFLOPS, for example. That doesn't exactly paint an accurate picture of a card's performance.


----------



## NeoXF (Jan 21, 2013)

Nope, but thanks anyway. If this really will be ~$900, I can't imagine it getting anything but murdered on price/performance (at least) by a Radeon HD 8970, which should also bring a little more than just "beefier specs": AMD's new-found driver love for GCN (as opposed to launching barely tapped technology, as was the case with the HD 7000 series for a long time, and might still be; cue the Catalyst 13.2b "latency fix" drivers here).


Edit: Nah... honestly, they should probably just name it GTX 685 (Ti) or something...


----------



## cadaveca (Jan 21, 2013)

I am now waiting for the news headline "Crash of the Titans, Nvidia Hardware Failure Rampant!".


----------



## Crap Daddy (Jan 21, 2013)

T4C Fantasy said:


> well the 7970 ghz edition still has more tflops than the k20x lol



Who needs teraflops when this is supposed to bring megaframerates.


----------



## T4C Fantasy (Jan 21, 2013)

Fluffmeister said:


> AMD cards have hard higher theoretical performance numbers for a while now, the key word being theoretical.
> 
> Take the 5870 @ 2.72 tflops vs the GTX 480's measly 1.35 tflops for example. Doesn't exactly paint an accurate picture of a cards performance.



the 5870 could probably compute better in boinc milkyway@home  without a problem


----------



## erixx (Jan 21, 2013)

OK, so this is the generation that I will pass on. (As always, since the TNT.)


----------



## Fluffmeister (Jan 21, 2013)

T4C Fantasy said:


> the 5870 could probably compute better in boinc milkyway@home  without a problem



Well that's conclusive then.


----------



## the54thvoid (Jan 21, 2013)

Bytales said:


> Please spare me the battling disesase, curing cancer Phrase.
> There are allready known means to cure each and every disease one can think of. The fact they are kept hidden from public, and you have no clue about it, thats another thing my friend.
> One of these means is 5000 years old. Lookup Amaroli on google, check Fasting too.
> 
> ...



At massive risk of an infraction... F*ck off buddy - you're not welcome here.


----------



## Max Mojo (Jan 21, 2013)

Been waiting for GK110 since autumn already, when rumours quietly signalled a GTX 780. Hopefully it comes in March, in time to be fed GTA V. 
Time to leave GTX 680 SLI and pimp up my gaming rig? Not sure. On the other hand, the GTX 680 is a great card in single mode, but tends to be a bit warm in SLI; no need to heat my gaming chamber in winter. 
Yes, I think I will not be strong enough to resist this card, given that it doubles performance. Early speculation talked about only a 30% increase, which would be a no-go.


Just found this: 

GeForce Titan, GeForce Titanium, GeForce GTX 780 Ti Specifications:

    Kepler GPU with 2688 CUDA Cores
    6GB GDDR5 memory
    Core Clock: 732 MHz
    Memory Clock: 5200 MHz
    MSRP: 899 USD

http://videocardz.com/39143/geforce-titanium-the-allmighty-gk110-based-graphics-card


6 GB of video RAM would be great.


----------



## TheGuruStud (Jan 21, 2013)

$1k with tax? LOL. Not in a million years.

OC two 670s and spank this stupid card back into nonexistence.

And who is buying these as compute cards? They must be higher than a kite.


----------



## badtaylorx (Jan 21, 2013)

Why not let Asus or MSI put a better cooling option on it...

or better yet, let 'em redesign the PCB and power delivery...



the54thvoid said:


> At massive risk of an infraction... F*ck off buddy - you're not welcome here.




Speak for yourself... I'm all in for a free and open net...

Do I agree with all he says... no.

Is there SOME truth to what he says... yes.


I no more want him censored than your ability to tell him to fuck off...


----------



## badtaylorx (Jan 21, 2013)

Please delete, double post.


----------



## tastegw (Jan 21, 2013)

Hmm...


----------



## HumanSmoke (Jan 21, 2013)

the54thvoid said:


> Well, if this is what Kepler was 'meant to be', then it explains a lot. I can't see AMD coming close to this, but then again, if the Sweclockers info is correct, that price for a single GPU is terrible


But understandable.
What do you do when you want high profile PR (reviews) from consumer hardware, but would rather sell the die packaged as a Quadro or Tesla board for a whole lot more?

The pricing would likely cover any lack of actual availability, while ensuring (as a reference single GPU SKU) that it sits comfortably atop the benchmark charts even after the next round of releases. Kind of takes the pressure off the GK114 I would have thought. 

So yeah, crap price, but wasn't it always destined to be if the GK110 made it to GeForce branding?


cadaveca said:


> I am now waiting for the news headline "Crash of the Titans, Nvidia Hardware Failure Rampant!".


Nice flamebait.


----------



## T4C Fantasy (Jan 21, 2013)

tastegw said:


> http://videocardz.com/images/2013/01/GeForce-GTX-780-Ti.jpg
> 
> Hmm...



Speculation. I added the 780 Ti to the DB based on the naming, so either they copied me or they had the same thought.


----------



## cadaveca (Jan 21, 2013)

HumanSmoke said:


> Nice flamebait.



:shadedshu

It's not flamebait, that's what immediately came to mind when I read the article and saw the name Titan.



Anyway...


February sounds good.


----------



## TheoneandonlyMrK (Jan 21, 2013)

Max Mojo said:


> Been waiting for GK110 since autumn already, when rumours quietly signalled a GTX 780. Hopefully it comes in March, in time to be fed GTA V.
> Time to leave GTX 680 SLI and pimp up my gaming rig? Not sure. On the other hand, the GTX 680 is a great card in single mode, but tends to be a bit warm in SLI; no need to heat my gaming chamber in winter.
> Yes, I think I will not be strong enough to resist this card, given that it doubles performance. Early speculation talked about only a 30% increase, which would be a no-go.
> 
> ...



Great-sounding card indeed. A bit dear for me, though, and it won't double the gaming performance of a 680; that's just wishful thinking.
As for AMD, they are well aware of this chip, which IMHO should mean an appropriate reaction at some point.
The next few months will make good reading.

Also, what's the compute/pro version's GPU clocked at?


----------



## 20mmrain (Jan 21, 2013)

*1st comment:* Remember this is a very early report. We had reports about the GTX 680 costing 800 dollars early on too, so I wouldn't get too worried yet.

*2nd comment:* Also, AMD has not tipped their hand on how the HD 8900 series will perform. (And I don't mean the HD 8000M series, which is a refresh; I mean the real HD 8900 series.) So if AMD has something to compete with this card, Nvidia will not price it at 899 dollars.

*3rd comment:* Even if AMD cannot compete straight out of the gate, we have seen how well that works out for us end consumers. It could mean another GTX 280/285 vs. HD 4870 price battle, which would be great news for all of us. If you ask me, Nvidia can keep their overpriced card that performs 25% better than the HD 8970; I would get four HD 8970s for the price of two Titans and have much more fun.

All speculation until we hear some real proof; this article is not enough for me yet.


----------



## Novulux (Jan 21, 2013)

I suppose this would make my 600w PSU beg for mercy when overclocked?


----------



## erocker (Jan 21, 2013)

I'd love to have it! Not dishing out 900 bucks for console ports, though.


----------



## renz496 (Jan 21, 2013)

theoneandonlymrk said:


> Great-sounding card indeed. A bit dear for me, though, and it won't double the gaming performance of a 680; that's just wishful thinking.
> As for AMD, they are well aware of this chip, which IMHO should mean an appropriate reaction at some point.
> The next few months will make good reading.
> 
> Also, what's the compute/pro version's GPU clocked at?



That spec is exactly the same as the Tesla part. Details about the Tesla K20 and K20X here:

http://www.anandtech.com/show/6446/nvidia-launches-tesla-k20-k20x-gk110-arrives-at-last

We know the compute versions always clock lower than their GeForce counterparts, so this GeForce 'Titan' could clock higher in its final revision. But Nvidia might want to keep the clocks low, just like the Tesla cards, to hold the TDP at 235 W.


----------



## Hilux SSRG (Jan 21, 2013)

Max Mojo said:


> Hopefully it comes in March, in time to be fed GTA V.



Not sure if your GTX 680 SLI is up to snuff; I recommend you upgrade to the GTX/Titan 780.


----------



## Gadgety (Jan 21, 2013)

So I want this for rendering with Octane Render, or possibly V-Ray RT, but I have a sneaking feeling Nvidia is going to put in some sort of artificial hindrance; how else would they sell their absurdly expensive Maximus range? If there are no snags, I'll get three or four of these GeForce Titans.


----------



## MxPhenom 216 (Jan 21, 2013)

Easy Rhino said:


> awesome. so now instead of running the most demanding PC games on max settings at 100FPS we can run them at 200 FPS.



you sound frazzled.


----------



## dank1983man420 (Jan 21, 2013)

Bytales said:


> Please spare me the battling disesase, curing cancer Phrase.
> There are allready known means to cure each and every disease one can think of. The fact they are kept hidden from public, and you have no clue about it, thats another thing my friend.
> One of these means is 5000 years old. Lookup Amaroli on google, check Fasting too.
> 
> ...



I am so enlightened now.  Why am I folding when I can just drink my own pee and cure every aliment ?  


As for the graphics card: sell it at US $599 (not $800) or lower and this is an instant winner (for high-resolution gamers and folders alike...). I am only worried about heat, as usual...


----------



## qubit (Jan 21, 2013)

Easy Rhino said:


> awesome. so now instead of running the most demanding PC games on max settings at 100FPS we can run them at 200 FPS.



Trust me it'll look smoother. 



Bytales said:


> *Please spare me the battling disesase, curing cancer Phrase.
> There are allready known means to cure each and every disease one can think of. The fact they are kept hidden from public, and you have no clue about it, thats another thing my friend.
> One of these means is 5000 years old. Lookup Amaroli on google, check Fasting too.*
> 
> ...



Ah, yeah right. We've got "cures" for all these things, especially the godawful cancer. Sure we do.

Man never went to the moon either and everything is a conspiracy.


----------



## LAN_deRf_HA (Jan 21, 2013)

So the actual product name will be Titan? Then I guess we won't get a real 700-series chip until the end of the year. This product seems to have gone from wonder-chip game changer, polished for months and months, to a totally random one-off high-end part that changes nothing for anyone. The profits Nvidia is making from Kepler must be insane. Imagine if the 7000 series had been good: this thing would be only $500 and the 680 (really a 660 Ti) would be $300 or less. Talk about pulling your punches.


----------



## Lionheart (Jan 21, 2013)

Bytales said:


> Please spare me the battling disesase, curing cancer Phrase.
> There are allready known means to cure each and every disease one can think of. The fact they are kept hidden from public, and you have no clue about it, thats another thing my friend.
> One of these means is 5000 years old. Lookup Amaroli on google, check Fasting too.
> 
> ...



You forgot to mention Hemp oil 

Ignore the close minded sheeple, take you about a decade to get through to them


----------



## N3M3515 (Jan 21, 2013)

Max Mojo said:


> Early speculation talked about only a 30% increase, which would be a no-go.



Actually, according to TechPowerUp charts, if the GTX 780 is 85% of a GTX 690, then the GTX 780 vs. the GTX 680 would be:

45% faster at 2560x1600
29% faster at 1920x1080
21% faster at 1680x1050
6% faster at 1280x800

Avg. of 25% faster.

AMD needs something 25% faster on average than the 7970 GHz Ed. to keep the marginal lead they have.
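The arithmetic here chains two ratios: the rumoured Titan-to-690 figure (85%) times the 690-to-680 ratio at each resolution. A sketch of the same calculation; the 690-vs-680 ratios below are back-derived from the percentages in this post, not read off the charts:

```python
TITAN_VS_690 = 0.85  # rumoured: Titan delivers 85% of GTX 690 performance

def titan_vs_680(gtx690_over_680):
    """Chain the rumoured Titan/690 ratio with a 690/680 ratio."""
    return TITAN_VS_690 * gtx690_over_680

# Hypothetical GTX 690 performance relative to a GTX 680 (1.0 = equal).
gtx690_over_680 = {
    "2560x1600": 1.71,
    "1920x1080": 1.52,
    "1680x1050": 1.42,
    "1280x800": 1.25,
}

for res, ratio in gtx690_over_680.items():
    print(f"{res}: Titan {titan_vs_680(ratio) - 1:+.0%} vs GTX 680")
```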


----------



## sergionography (Jan 21, 2013)

Max Mojo said:


> Been waiting for GK110 since autumn already, when rumours quietly signalled a GTX 780. Hopefully it comes in March, in time to be fed GTA V.
> Time to leave GTX 680 SLI and pimp up my gaming rig? Not sure. On the other hand, the GTX 680 is a great card in single mode, but tends to be a bit warm in SLI; no need to heat my gaming chamber in winter.
> Yes, I think I will not be strong enough to resist this card, given that it doubles performance. Early speculation talked about only a 30% increase, which would be a no-go.
> 
> ...



If these specs end up being the case, it would be such a fail for NVIDIA. Remember, GK104 is clocked around 1100 MHz, and there is no way a huge chip like GK110 can clock that high.
The most I expect the clock to be is around 800-900 MHz if we are optimistic; 732 MHz is the Tesla version, which is clocked lower because Tesla parts have to be guaranteed for 24-hour operation, a warranty GeForce cards don't require.
But if 732 MHz is the clock speed, then NVIDIA is in for trouble, because if we do the calculations:
GTX 680: 1536 cores × ~1100 MHz = 1,689,600
GK110: 2688 cores × 732 MHz = 1,967,616
1,689,600 / 1,967,616 ≈ 0.86, so the GTX 680 has about 86% of GK110's theoretical shader throughput; call it 15% extra for GK110 over the GTX 680, plus maybe another 10% from the added memory bandwidth. Making a chip twice the size of the GTX 680 for 15-25% extra performance is very meh, which really makes me doubt it's a 732 MHz part; otherwise NVIDIA is much better off making a part closer to 2000 cores with a clock-speed advantage. That's definitely the smarter way to get 30% more performance and still be able to sell it at good prices. 

And most likely this is what AMD is doing next round: refining GCN for better efficiency, to pack more cores into the same power envelope while maintaining their clock-speed advantage.
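The core × clock products above are proportional to theoretical FP32 throughput (each CUDA core retires one fused multiply-add, i.e. two FLOPs, per cycle). A back-of-envelope sketch; the 900 MHz entry is a hypothetical clock, not a leak:

```python
def gflops(cores, clock_mhz):
    """Theoretical FP32 GFLOPS: cores x clock x 2 ops (one FMA) per cycle."""
    return cores * clock_mhz * 2 / 1000.0

gtx_680 = gflops(1536, 1100)   # GTX 680 near its boost clock
gk110_732 = gflops(2688, 732)  # GK110 at Tesla K20X clocks
gk110_900 = gflops(2688, 900)  # GK110 at a hypothetical 900 MHz

for name, value in [("GTX 680", gtx_680),
                    ("GK110 @ 732 MHz", gk110_732),
                    ("GK110 @ 900 MHz", gk110_900)]:
    print(f"{name}: {value:.0f} GFLOPS ({value / gtx_680:.2f}x GTX 680)")
```

At Tesla clocks the on-paper gain over a GTX 680 is only ~16%, which is the post's point; at 900 MHz it would be ~43%.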


----------



## qubit (Jan 21, 2013)

@sergionography

Agreed, the clock speed sounds way too low. I think this chip has around 7 billion transistors, so I can imagine that getting a GHz out of it will be challenging. No doubt it would be more comfortable on a smaller process node.


----------



## acerace (Jan 21, 2013)

Bytales said:


> Please spare me the battling disesase, curing cancer Phrase.
> There are allready known means to cure each and every disease one can think of. The fact they are kept hidden from public, and you have no clue about it, thats another thing my friend.
> One of these means is 5000 years old. Lookup Amaroli on google, check Fasting too.
> 
> ...



I love you. 



the54thvoid said:


> At massive risk of an infraction... F*ck off buddy - you're not welcome here.



I thought TPU is a friendly site. Guess it changed now when other people has different opinions than you. Well.


----------



## the54thvoid (Jan 21, 2013)

acerace said:


> I thought TPU is a friendly site. Guess it changed now when other people has different opinions than you. Well.



All I can say is read the initial post by Bytales.  It is confrontational, conspiratorial and quite insulting to crunchers (who do what they do to help scientific research for a good cause).

I love the blatant hypocrisy people exhibit online.  If the guy has fans here, fair enough.  You can all live in the conspiracy theory bunker and think the world is out to get you.  But we don't all have to hold hands and hug. 

FTR, what does the Titan supercomputer do? Crunch numbers for science, using the HPC variant of the very card we are talking about. 

EDIT:

Oh yeah, why don't more people come help out?

http://www.worldcommunitygrid.org/about_us/viewGridComputingBasics.do

http://www.techpowerup.com/forums/forumdisplay.php?f=68

http://www.techpowerup.com/forums/forumdisplay.php?f=67


----------



## [H]@RD5TUFF (Jan 22, 2013)

Kind of a lame name, IMO.


----------



## sergionography (Jan 22, 2013)

qubit said:


> @sergionography
> 
> Agreed, the clock speed sounds way too low. I think this chip has around 7 billion transistors, so I can imagine that getting a GHz out of it will be challenging. No doubt it would be more comfortable on a smaller process node.



Well, who knows; maybe for the 700 series they will do something similar to the 200 series and use two process nodes. GK110 would be the lower-clocked version; then, when 20nm comes out with low capacity, they shrink only the top-end part and market it as a GTX 785 with higher clock speeds and lower TDP, selling it for an arm and a leg like the GTX 285 did back in the day. That would be a decent strategy to buy them time for Maxwell, and a shrink alone is always a better way to go than a shrink plus a new architecture from an engineering standpoint, especially for Nvidia, who aren't the best at moving to new nodes.


----------



## HammerON (Jan 22, 2013)

This isn't the thread to discuss whether or not one should crunch or fold. Please take it to PMs.
Carry on.

Edit: On topic: I will enjoy watching some of our extreme overclockers have fun with these cards while I sit this one out. Too expensive for me.


----------



## KainXS (Jan 22, 2013)

I will wait for more information on the correct specifications, since it seems they based the clocks on the K20X, which isn't right: this card will not be a passive Tesla card. Nvidia has recently set the clocks on their Tesla cards back by about 20-25%, so I would guess the core clock on the GeForce version to be about 900-950 MHz.

But still, at 900 dollars... not a chance I'm gonna buy it.


----------



## ViperXTR (Jan 22, 2013)

ASUS GeForce _Titan_ 780 6GB GDDR5 _MARS IV_ 

'__'


----------



## Easy Rhino (Jan 22, 2013)

MxPhenom 216 said:


> you sound frazzled.



cool story.


----------



## NeoXF (Jan 22, 2013)

The only way I see this card living up to such a name is if it's 4K-proof in AT LEAST a handful of modern titles, since I hate multi-monitor gaming setups with a passion, and even the puny-ish 2560x1440/1600 resolutions are nowhere near as common as they should be. Nor is the hardware, or the programming skill of most game studios, up to snuff.


----------



## alwayssts (Jan 22, 2013)

N3M3515 said:


> Actually, according to TechPowerUp charts, if the GTX 780 is 85% of a GTX 690, then the GTX 780 vs. the GTX 680 would be:
> 
> 45% faster at 2560x1600
> 29% faster at 1920x1080
> ...



More-or-less agree.

Also, I usually work these things backwards.

I personally think 7 GHz/384-bit is a safe bet for 'a' GK110 card. There might not be a released card at exactly that clock, but it's a starting point for what to expect within 300 W.  

Working backwards from the 680, a 14-SMX card with a 384-bit bus at the same clocks as the 680 would require exactly that amount of bandwidth:

(2688/1536 = 1.75; 1.75/1.5 = 1.167; 1.167 × 6000 ≈ 7000)
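That parenthetical can be spelled out: scale the 680's 6 GHz effective memory clock by the core-count ratio and divide by the bus-width ratio. A sketch of the same back-of-envelope step:

```python
# GTX 680 baseline: 1536 cores, 256-bit bus, 6000 MHz effective GDDR5.
GTX680_CORES, GTX680_BUS_BITS, GTX680_MEM_MHZ = 1536, 256, 6000

def required_mem_clock(cores, bus_bits):
    """Effective memory clock that keeps bandwidth-per-core equal to a GTX 680's."""
    core_ratio = cores / GTX680_CORES        # 2688/1536 = 1.75
    bus_ratio = bus_bits / GTX680_BUS_BITS   # 384/256 = 1.5
    return GTX680_MEM_MHZ * core_ratio / bus_ratio

print(required_mem_clock(2688, 384))  # 7000.0 MHz effective
```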

Now, the optimal (most efficient) number of units (including SFUs) for gaming with 48 ROPs is somewhere above 2800 and below 2900. This would have 3136; AMD may have 2560, so that's kind of a wash.

Then it's obviously about voltage and clock potential/efficiency within a TDP (probably 300 W). AMD may clock their stock product closer to Nvidia's max clock within the TDP, choosing for their max clock at 300 W to be closer to the potential of the 28nm process (1200-1300 MHz). Nvidia's clock choice may be more power-efficient, say if they scale from ~975 to 1100+ MHz; AMD's choice may be more die-size/cost-efficient.

Point is, at the end of the day, if one part has ~10% more usable units but more bloat, and the other clocks ~10% higher, which is the better part? Does it really matter? They should be relatively close.


----------



## HumanSmoke (Jan 22, 2013)

alwayssts said:


> I personally think 7 GHz/384-bit is a safe bet for 'a' GK110 card.  ...Working backwards from the 680, a 14-SMX card


If you're making the assumption that the cut-and-paste K20X memory specs are wrong, then might the SMX count be wrong as well?
It isn't beyond the realm of possibility that the GeForce version has the full 15 SMX. Tesla and Quadro generally have more functionality fused off than GeForce, presumably to fuse off out-of-spec logic blocks and to reduce power requirements; a GeForce card probably won't be under the same constraint. It's also not unheard of for TSMC's process to have improved, and/or for a revision over the first tranche of wafers to have taken place. If the original ~20,000 GPUs going to HPC deployment are 87-93% functional, then I'd assume there must be a percentage of fully functional chips.
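For reference, the 87-93% figure is just the active-SMX fraction on the 15-SMX die:

```python
TOTAL_SMX = 15  # physical SMX count on the GK110 die

def active_fraction(active_smx):
    """Fraction of the die's shader hardware that is enabled."""
    return active_smx / TOTAL_SMX

for active in (13, 14, 15):
    print(f"{active}/{TOTAL_SMX} SMX = {active_fraction(active):.0%} functional")
```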


----------



## Melvis (Jan 22, 2013)

Bla bla bla, who cares. What I have now will last the next 7 years anyway; not like we can't all play console ports anyway, right? lol


----------



## HumanSmoke (Jan 22, 2013)

Melvis said:


> Bla bla bla, who cares. What I have now will last the next 7 years anyway; not like we can't all play console ports anyway, right? lol


You mean the games that will accompany the DX 11 consoles due out ?


----------



## Melvis (Jan 22, 2013)

HumanSmoke said:


> You mean the games that will accompany the DX 11 consoles due out ?



Correct!!


----------



## alwayssts (Jan 22, 2013)

NeoXF said:


> Only way I see this card having relevancy to such a name is... if it's 4K-proof for AT LEAST a handful of modern titles... since I hate multi-monitor gaming setups with a passion, and even the puny-ish 2560x1440/1600 are nowhere near as common as they should be... Nor is the hardware or programming skill of most game studios up to snuff.



I'm with you on all that, but be realistic.

It's conceivable at the very edge this gen, but the big push will be on 20nm, both because of the timing of the displays reaching a more tangible market and because of the process abilities of TSMC (not to mention the potential need for more bandwidth and/or denser buffers, without building the monstrosities that will likely come between now and then).

Figure 4k is 4x 1080p.  

I figure 20nm will bring designs similar to GK110/8900 aimed at the sweet-spot market, with its shiny new 4K displays, in late 2014 to 2015.  That is to say efficient and with 48 ROPs, obviously on more realistically sized/yielding silicon and in consumer-friendly power envelopes.  If that were roughly 2688 units (12 NVIDIA SMX = 2688 with SFUs; AMD 42 CU = 2688) at ~1300MHz, it would be ~4x something like a 7850 (1024 units x 860MHz), the baseline to play most titles at 1080p, and that baseline is likely not changing much given the rumored new console specs.  

Considering the process shrink should bring roughly 2x density and ~20-30% clock hikes at similar voltage, and GDDR6 (and/or 4Gb GRAM) or some other tech may rear its head by that time, it seems a realistic trajectory.  See the clock/power skew of 28nm in my previous post, but note TSMC will lower the voltage aim on 20nm; 1.264V isn't going to fly anymore, certainly to AMD's disappointment.  The process will likely whimper out around where most designs hover because of efficiency, 1.15-1.175V (blame a unified process with an eye focused on mobile SoCs).  That means potentially ~1400-1500MHz, minus ~10% for stock SKUs, or around 1300MHz give or take.

Speculative maths, to be sure.  But realistic.
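The speculative maths above can be laid out explicitly. Every number here is this post's own guess (2688 units at ~1300MHz on 20nm, a 7850-class 1080p baseline), not a known spec:

```python
# 4K pushes exactly 4x the pixels of 1080p
px_ratio = (3840 * 2160) / (1920 * 1080)
print(px_ratio)  # 4.0

# Speculated 20nm sweet-spot part vs the post's 1080p baseline (a 7850-class
# card: 1024 units at 860 MHz). units * clock is a crude throughput proxy.
baseline = 1024 * 860    # "plays most titles at 1080p"
future = 2688 * 1300     # guessed 20nm midrange part
print(round(future / baseline, 2))  # 3.97, i.e. roughly the 4x needed for 4K
```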


----------



## The Von Matrices (Jan 22, 2013)

I'm confused.  I always thought that the "Ti" suffix was short for "Titan."  If "Ti" isn't short for "Titan", then what is its meaning?


----------



## Optimis0r (Jan 22, 2013)

The Von Matrices said:


> I'm confused.  I always thought that the "Ti" suffix was short for "Titan."  If "Ti" isn't short for "Titan", then what is its meaning?



Titanium, according to Nvidia.


----------



## Samskip (Jan 22, 2013)

20mmrain said:


> If you ask me.... Nvidia can keep their over priced card that can perform 25% better then the HD8970. I would get 4 HD8970's for the price of two Titans and have much more fun.
> 
> All speculation until we hear some real proof.... this article is not enough for me yet.




I don't think quad CrossFire would be that much more fun because of all the driver issues in games, and four HD 8970s would use more power and produce more heat.
But for pure benchmarking, quad CrossFire could give better results, I guess.


----------



## Fluffmeister (Jan 22, 2013)

Samskip said:


> I don't think quad CrossFire would be that much more fun because of all the driver issues in games, and four HD 8970s would use more power and produce more heat.
> But for pure benchmarking, quad CrossFire could give better results, I guess.



AMD haven't exactly been generous with prices either; the 7970 was massively overpriced when it launched too.


----------



## Prima.Vera (Jan 22, 2013)

900 bucks for console ports...! Cool story bro!


----------



## Fluffmeister (Jan 22, 2013)

Prima.Vera said:


> 900 bucks for console ports...! Cool story bro!



That's a pretty standard response here, it's only natural for people to get defensive when their card drops down a notch.


----------



## Zubasa (Jan 22, 2013)

The Von Matrices said:


> I'm confused.  I always thought that the "Ti" suffix was short for "Titan."  If "Ti" isn't short for "Titan", then what is its meaning?


Ti = Titanium
Although titanium is named after the Titans.


----------



## blibba (Jan 22, 2013)

N3M3515 said:


> Actually, acording to techpowerup charts, if GTX 780 is 85% of GTX 690, then GTX 780 would be to GTX 680:
> 
> 45% faster at 2560x1600
> 29% faster at 1920x1080
> ...




Those percentages are an average including games that have CPU bottlenecks, and (if Wizz calculates overall performance using average or total FPS) are also weighted towards the games with the highest FPS. You cannot use them to compare FPS in the way that you are doing. In addition, throughput-based metrics give cards like the 690 an artificially inflated result, which would not necessarily be replicated in latency based testing.


----------



## N3M3515 (Jan 22, 2013)

blibba said:


> Those percentages are an average including games that have CPU bottlenecks, and (if Wizz calculates overall performance using average or total FPS) are also weighted towards the games with the highest FPS. You cannot use them to compare FPS in the way that you are doing. In addition, throughput-based metrics give cards like the 690 an artificially inflated result, which would not necessarily be replicated in latency based testing.



Your point being?
BTW, my calculations are pretty logical; explain to me otherwise.


----------



## bogami (Jan 22, 2013)

Finally. The bastards have eaten their profit from the GK104 and decided to offer, with a significant delay, a product whose price is marked up by 100%. Now Kepler's design life has been stretched out until GPU production can move to a smaller process. Of course we will also be paying for the same patents this year, and I hope that pig bursts from obesity and gluttony. We know that the Quadro driver bypasses DirectX's efficiency problems and thereby gains anywhere from 30% to 100% over the inefficiency of the consumer path, and we will have to wait for the Maxwell GPU, which will have that advantage built in. I hope they drool over their licensing and contractual profits just as we do over good hardware :P


----------



## blibba (Jan 22, 2013)

N3M3515 said:


> Your point being?
> BTW, my calculations are pretty logical, or explain me otherwise.



My previous post was an explanation of the illogicality of yours. But if you can't be bothered with it, the first sentence and the very fact that there are different differences at different resolutions should suffice. It is not so much that the 690 is a different amount faster at different resolutions as it is that the 690 is to a different extent held up by other bottlenecks at different resolutions. In the absence of extra-GPU bottlenecks the 780 (Titan) will be in a worst-case scenario 50% faster than the 680. If memory bandwidth doesn't hold things up, you can call that figure 70%. Of course in reality we'll see a much, much smaller difference, because CPU bottlenecks will hold things up to at least some degree in nearly every game.


----------



## Crap Daddy (Jan 22, 2013)

N3M3515 said:


> Your point being?
> BTW, my calculations are pretty logical, or explain me otherwise.



Here's a better comparison for you, GTX690 being 53% faster than the GE at 1200p







Now, math is not my strong point but I reckon going by the presumed 85% performance of the Titan (ium) compared to GTX690, this GK110 based card would be 45% faster than the 7970GE.


----------



## tokyoduong (Jan 22, 2013)

crap daddy said:


> here's a better comparison for you, gtx690 being 53% faster than the ge at 1200p
> 
> http://tpucdn.com/reviews/amd/hd_7970_ghz_edition/images/perfrel_1920.gif
> 
> now, math is not my strong point but i reckon going by the presumed 85% performance of the titan (ium) compared to gtx690, this gk110 based card would be 45% faster than the 7970ge.



1.53 × 0.85 ≈ 1.30, so about 30% faster than the 7970 GE, not 45%.


----------



## blibba (Jan 22, 2013)

Crap Daddy said:


> Here's a better comparison for you, GTX690 being 53% faster than the GE at 1200p



On average, where the average includes many games where they perform very closely due to bottlenecks.



Crap Daddy said:


> Now, math is not my strong point but I reckon going by the presumed 85% performance of the Titan (ium) compared to GTX690, this GK110 based card would be 45% faster than the 7970GE.



So no.


----------



## DarkOCean (Jan 22, 2013)

And if 8970 is at least 20% faster than 7970ge then titan would be  ~8% faster than 8970.


----------



## erocker (Jan 22, 2013)

I've decided to save the $900 this card will cost and get an awesome new television and a PS4 or Xbox 720 instead. How can that not be a better choice?

I hope that this price point doesn't become the standard for high end single GPU cards.


----------



## N3M3515 (Jan 22, 2013)

blibba said:


> My previous post was an explanation of the illogicality of yours. But if you can't be bothered with it, the first sentence and the very fact that there are different differences at different resolutions should suffice. It is not so much that the 690 is a different amount faster at different resolutions as it is that the 690 is to a different extent held up by other bottlenecks at different resolutions. In the absence of extra-GPU bottlenecks the 780 (Titan) will be in a worst-case scenario 50% faster than the 680. If memory bandwidth doesn't hold things up, you can call that figure 70%. Of course in reality we'll see a much, much smaller difference, because CPU bottlenecks will hold things up to at least some degree in nearly every game.



Well, 85% of a GTX 690 is nowhere near +50% diff man...


----------



## N3M3515 (Jan 22, 2013)

Crap Daddy said:


> Here's a better comparison for you, GTX690 being 53% faster than the GE at 1200p
> 
> http://tpucdn.com/reviews/AMD/HD_7970_GHz_Edition/images/perfrel_1920.gif
> 
> Now, math is not my strong point but I reckon going by the presumed 85% performance of the Titan (ium) compared to GTX690, this GK110 based card would be 45% faster than the 7970GE.



I'm sorry man, but 153*85% = 130, 30% faster than 7970 GE, at 1920x1080


----------



## Easy Rhino (Jan 22, 2013)

erocker said:


> I've decided to save the $900 bucks to buy this card and get a new awesome television and PS4 or Xbox 720 instead. How can that not be a better choice?
> 
> I hope that this price point doesn't become the standard for high end single GPU cards.



I have an even better option for you. Don't buy a TV and instead spend $300 on a GPU and $600 on games.


----------



## blibba (Jan 22, 2013)

N3M3515 said:


> Well, 85% of a GTX 690 is nowhere near +50% diff man...



You've clearly missed my point, but what you've written here is wrong anyway.

A 680 is 50% of a 690. A 780 is 85% of a 690. 85% is 170% of 50%, a 70% difference.
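For clarity, that scaling argument can be written out numerically. The 85% figure is the rumoured Titan-vs-GTX-690 ratio from the article; the 50% figure is blibba's idealised worst case for a single 680 against a perfectly scaling 690:

```python
titan_vs_690 = 0.85    # rumoured figure from the article
gtx680_vs_690 = 0.50   # worst-case assumption: ideal dual-GPU scaling

print(round(titan_vs_690 / gtx680_vs_690, 2))  # 1.7 -> up to 70% faster than a 680

# Using TPU's measured averages instead (GTX 680 = 98%, GTX 690 = 153% on the
# chart cited earlier in the thread), the gap shrinks:
print(round(0.85 * 153 / 98, 2))  # 1.33 -> ~33% faster on that chart
```

Both camps in the thread are computing against a different 680-vs-690 baseline, which is why the answers differ.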


----------



## N3M3515 (Jan 22, 2013)

blibba said:


> You've clearly missed my point, but what you've written here is wrong anyway.
> 
> A 680 is 50% of a 690. A 780 is 85% of a 690. 85% is 170% of 50%, a 70% difference.



GTX 680 IS NOT 50% of a GTX 690 omg....


----------



## Crap Daddy (Jan 22, 2013)

erocker said:


> I've decided to save the $900 bucks to buy this card and get a new awesome television and PS4 or Xbox 720 instead. How can that not be a better choice?
> 
> I hope that this price point doesn't become the standard for high end single GPU cards.



I kinda feel this year is a turning point in the GPU business and, in the broader picture, the PC business. The low end will disappear, replaced by integrated graphics; mid-range cards will be hard to sell over $200; and there will be an enthusiast niche covered by the likes of Titanium and whatever AMD comes up with. The rest of us will game on consoles, tablets, laptops and smartphones. It's the same as with the music industry: there are a handful of people who still buy vinyl records, stare at covers and own expensive hi-fi equipment; they like to clean their records and upgrade the equipment, while the rest enjoy lo-fi MP3s played through portable devices which they will gladly throw away when a new gadget comes along. Mind you, these are priced in the vicinity of a few pizzas or a good night out. 

There will be a market for Titanium, same as there's a market for outrageously priced DACs and headphones. Crysis 3 (if one's interested) can be played on a console. No big deal: single player, a couple of days of fun, then forget about it. Life goes on.


----------



## TheoneandonlyMrK (Jan 22, 2013)

blibba said:


> A 680 is 50% of a 690



What? Nooo, they don't scale that well, mate, and definitely not in every title. Refer back to your own argument on average framerates for the fail in that quote.

And anyway, you're arguing against one guy's opinion of what a guesstimated average performance chart might be (I'd guess similar to him, imho). He may be right, you may be; the arguing's pointless either way.


----------



## N3M3515 (Jan 22, 2013)

N3M3515 said:


> GTX 680 IS NOT 50% of a GTX 690 omg....



98% vs 153%: that's 64%, not 50%.


----------



## blibba (Jan 22, 2013)

N3M3515 said:


> GTX 680 IS NOT 50% of a GTX 690 omg....



It is in the sense that a 780 will be 85% of a 690. I feel like you have understood none of what I have posted here.


----------



## TheoneandonlyMrK (Jan 22, 2013)

blibba said:


> It is in the sense that a 780 will be 85% of a 690. I feel like you have understood none of what I have posted here.



Yeah, but you're wrong, and some of us do understand.

See Crap Daddy's chart earlier in this thread: 85% of a 690's performance would hit 130% on that chart, or 30% faster than the 7970. Simples.

Also, I may have been misled, but I thought Nvidia had put more double-precision units in the GK110. Are these counted among the 2880 shaders despite being specialist units? Not trying to spark a row, I'm just interested and asking.


----------



## blibba (Jan 22, 2013)

theoneandonlymrk said:


> Yeah, but you're wrong, and some of us do understand.
> 
> See Crap Daddy's chart earlier in this thread: 85% of a 690's performance would hit 130% on that chart, or 30% faster than the 7970. Simples.




The chart is utilised entirely inappropriately. The 85% stat is from Nvidia. If you asked them to put a 680 on the same scale, it'd be 55% tops.

In actual FPS (not that that's what we should care about), the 780 will vary from 100% to 170% of the performance of a 7970 depending on other bottlenecks and the demands of the game. It may exceed 170% in titles that favour its architecture (i.e. TWIMTBP titles).


----------



## TheoneandonlyMrK (Jan 22, 2013)

blibba said:


> The chart is utilised entirely inappropriately. The 85% stat is from Nvidia. If you asked them to put a 680 on the same scale, it'd be 55% tops.
> 
> In actual FPS (not that that's what we should care about), the 780 will vary from 100% to 170% of the performance of a 7970 depending on other bottlenecks and the demands of the game. It may exceed 170% in titles that favour its architecture (i.e. TWIMTBP titles).



If I asked AMD to do that chart, they would have beaten the 690 with a 7970, but those two charts would be useless to me, an unbiased customer.

And all your waffling can be restated from an AMD-biased stance, i.e. some games favour AMD graphics, soooo... that's why we read TPU reviews. That chart's sound in my eyes, bro; Wiz did it.

I'm out anyway, dude, your opinion's all good. I am an optimist too.


----------



## N3M3515 (Jan 22, 2013)

blibba said:


> The chart is utilised entirely inappropriately. The 85% stat is from Nvidia. If you asked them to put a 680 on the same scale, it'd be 55% tops.
> 
> In actual FPS (not that that's what we should care about), the 780 will vary from 100% to 170% of the performance of a 7970 depending on other bottlenecks and the demands of the game. It may exceed 170% in titles that favour its architecture (i.e. TWIMTBP titles).









That's 66% right there.


----------



## blibba (Jan 22, 2013)

N3M3515 said:


> http://img651.imageshack.us/img651/5480/techpowerup.png
> 
> That's 66% right there.



TPU FPS charts are not even remotely relevant to the discussion we are having. I have explained why this is the case in multiple previous posts in this thread and will not do so again.


----------



## N3M3515 (Jan 22, 2013)

blibba said:


> TPU FPS charts are not even remotely relevant to the discussion we are having. I have explained why this is the case in multiple previous posts in this thread and will not do so again.



I'm talking about this:


Naito said:


> Edit 2: Checked original article. Claims 85% of the performance of the GTX 690. Possibility of limited edition card? (when considering naming): "Partner companies must comply with Nvidias reference design to the letter and may not even put their stickers on graphics cards."


----------



## HumanSmoke (Jan 22, 2013)

theoneandonlymrk said:


> also I may have been mislead but I thought Nvidia had put more double precision units in the gk110 ,are these counted amongst the 2880 shaders despite them being specialist units?


The architecture is somewhat different to the GK104 where the 8 FP64 units are distinct (and operate at 1:1 FP64:FP32) from the 1536 shaders. The GK110 has 1920 FP32-only shaders, and 960 FP32/64 capable (GK110 operates on 1/3 double precision rate), so 2880 is the maximum shader/CUDA core/stream processor count of the die.
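The unit arithmetic in that breakdown checks out. Per-SMX figures below follow the post (192 shader lanes per SMX, of which the post counts 64 as FP64-capable, matching NVIDIA's stated 1/3 double-precision rate for GK110):

```python
smx = 15                  # full GK110 die
lanes_per_smx = 192       # shader lanes ("CUDA cores") per SMX
fp64_per_smx = 64         # lanes counted as FP32/FP64-capable in the post

total = smx * lanes_per_smx
fp64_capable = smx * fp64_per_smx

print(total)                 # 2880 - the headline CUDA core count
print(total - fp64_capable)  # 1920 FP32-only lanes
print(fp64_capable)          # 960 FP64-capable lanes
print(fp64_capable / total)  # 0.333... -> the 1/3 double-precision rate
```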


----------



## blibba (Jan 22, 2013)

N3M3515 said:


> I'm talking about this:



So am I. TPU average FPS graphs are not remotely relevant to this. I have repeatedly explained this. I will now unsubscribe from this thread, as I feel it is going nowhere. I advise that you read Tech Report's review when the card is launched.


----------



## N3M3515 (Jan 23, 2013)

Well, we'll have to agree to disagree


----------



## jihadjoe (Jan 23, 2013)

On the bright side, the last time Nvidia had an $800 card was the 8800 GTX.
A year later they followed up with the 8800 GT, which offered 90% of the performance for half the price.

Just praying that lightning can, and does strike twice.


----------



## TSX420J (Jan 23, 2013)

Been waiting since last year for this. I was really let down by the dumbed-down 6xx series. I've just finished putting together 90% of my PC, and this is the final piece of the puzzle. Can't wait. Hopefully it won't let me down.


----------



## TAZ007 (Jan 23, 2013)

*Just my humble opinion*

Having read this thread start to finish, I have to wonder. I live and die by one rule: if it ain't broke, don't fix it. I won't buy a 780 Ti or equivalent unless a game comes out that my PC can't play. Some of you spend all that money on having the latest and greatest, yet all you seem to do is debate on forums and benchmark all day. A tad overkill, don't you think? All that money just for bragging rights; more money than sense, I think. I can play Far Cry 3 on my 560 Ti 448 on ultra settings, perfectly playable too at 30 to 50 FPS, and I can play BF3 on high settings and get 50 to 80 FPS, though this game (and only this game in my collection) pushes my 1.25 GB of memory to its limit, hence why I play on high settings and not ultra. No, this don't apply to everyone, just the die-hards: the ones with 680 SLI setups getting all excited because they can't wait to get their hands on a new 780 even though they don't really need one at this moment in time. Well, don't moan about the price, because it's people like you that make the price so high in the first place. Nvidia and AMD know this, and if I was them I'd be screwing you for all the money I could, because there is a market for it. And really I shouldn't moan, because as a result I get your old 680 for a lot less than what you paid for it. That's the only numbers I'm interested in: £££, ching ching. 

Now for the rest of us that have more sense than money, that don't feel sad when our FPS drops below 100 and are quite happy playing games rather than worrying about how much faster an unreleased GPU is going to be or how high our 3DMark score is going to be: hello to you all. I'm Taz, and if you'd like to shoot the shit outta me on BF3 then look out for T7BSY. But if you're a sad fk that uses an aimbot and other hacks then GET A LIFE!

Anyway, in short, I've got better things to do than argue the toss over something that is yet to be released. *Bytales*: interesting read, if nothing else. *QUBIT*: if you think that 200 FPS will look smoother than 100, you're just kidding yourself to justify your outlay. Most users only have 60 Hz screens and the eye can't see beyond that anyway, and that's a fact, because if it could, we would only ever see half the picture when watching normal telly. If you have a 120 Hz screen you might have a point, though even then most would not be able to tell the difference.


----------



## qubit (Jan 23, 2013)

@TAZ007

Judging by your _very first post_ on TPU, it's clear that you just like a good flamebait rant, rather than presenting a coherent argument.

Therefore, I won't waste my time explaining to you where you've got it all so wrong about frame rates. I can't believe you replied all that to what was just a humorous remark to another member, lol.


----------



## ALMOSTunseen (Jan 23, 2013)

TAZ007 said:


> *QUBIT*: if you think that 200 FPS will look smoother than 100, you're just kidding yourself to justify your outlay. Most users only have 60 Hz screens and the eye can't see beyond that anyway, and that's a fact, because if it could, we would only ever see half the picture when watching normal telly. If you have a 120 Hz screen you might have a point, though even then most would not be able to tell the difference.


Ok, let me just put in my 5 cents here. Most would rather have 200 FPS over 100 FPS because it gives you "security" with frame drops and recording. With frame drops: imagine you're at 100 FPS and you drop 50 FPS; you're going to notice it. If you're at 200 FPS and you drop 50 FPS, you're not going to notice it. And if you're recording, say with Fraps, you lock it at 60 FPS; the higher your maximum FPS is, the smoother the recording AND your gameplay. I have issues where my recording is smooth but my gameplay gets quite laggy.


----------



## tokyoduong (Jan 23, 2013)

blibba said:


> You've clearly missed my point, but what you've written here is wrong anyway.
> 
> A 680 is 50% of a 690. A 780 is 85% of a 690. 85% is 170% of 50%, a 70% difference.



This is as misleading as nvidia's slides


----------



## TAZ007 (Jan 23, 2013)

qubit said:


> @TAZ007
> 
> Judging by your _very first post_ on TPU, it's clear that you just like a good flamebait rant, rather than presenting a coherent argument.
> 
> Therefore, I won't waste my time explaining to you where you've got it all so wrong about frame rates. I can't believe you replied all that to what was just a humorous remark to another member, lol.



Not at all, but I find arguing the toss about numbers and percentages on something that nobody can possibly know yet pretty pointless. As far as frame rates go, don't take it personal; I just don't agree with you, simple as. I had a 670, I played with and without vsync and saw no difference at all, hence I returned the 670 and kept the card I've got now. Anything from 60 FPS to whatever number you choose is wasted on my eyes, if you like. I would love to set up six screens and have someone like you play, high or lower, with me.



ALMOSTunseen said:


> Ok, let me just put in my 5 cents here. Most would rather have 200 FPS over 100 FPS because it gives you "security" with frame drops and recording. With frame drops: imagine you're at 100 FPS and you drop 50 FPS; you're going to notice it. If you're at 200 FPS and you drop 50 FPS, you're not going to notice it. And if you're recording, say with Fraps, you lock it at 60 FPS; the higher your maximum FPS is, the smoother the recording AND your gameplay. I have issues where my recording is smooth but my gameplay gets quite laggy.



Security? lol. Have you read all the posts on here then? Not 100% sure what you mean, but I'm guessing you mean the GPU you have will last you longer in terms of new games coming out that might in time kill the FPS, meaning it's then time for a new card?

So are you saying that all those with 680 SLI setups are now insecure because the 780 Ti is due out? I think not. I find it hard to believe there is a game out there that would see off two 680s in SLI; even with a three-monitor setup a single 680/7970 can play all the games out there.

And last but not least, you reckon I will notice a drop from 200 FPS to 50, but not 100 FPS to 50? lol. I would not notice either; the brain can't count what the eye can't see. That's not to say you might not feel it, but that won't apply to many. Ask any console user: they don't worry about FPS and they still enjoy the games they play just as much as you or I.

Only thing I would notice is stuttering when frames drop to 5 to 30 FPS. Fraps has nothing to do with what we are talking about; anything above 60 FPS is the sweet spot. But again, it's just my opinion; not saying either of you are right or wrong, just saying it does not apply to most of us.

Edit: sorry, I did misread what you were trying to say, so to a point I agree. But here is an example: I can play FAR CRY 3 on ultra settings, no AA and SSAO, at 1920x1200, getting between 25 and 50 FPS, and this is smooth as; VRAM has not gone over 1 GB. Now on BF3, even on high settings I get 50 to 80 FPS, but still in multiplayer I can run out of VRAM, which can cause me lag or a drop in frames that causes stuttering. That's not down to GPU power, that's due to lack of VRAM, which hits FPS and causes it to become not so smooth, and that's the only reason you would notice. To put it another way: two 560 Ti GPUs overclocked to score 9500 in 3DMark11, so as good as a single 680 or 7970, but even though FPS is in the hundreds playing BF3 it's still not smooth; I get lag and frame drops due to the lack of VRAM. So IMO it's not all about FPS equalling smooth gameplay; it's all about getting the right balance to limit any bottlenecks that you will have at some point!


----------



## Prima.Vera (Jan 23, 2013)

Just to add a note here: anything above 30 FPS is perfect for me in games. And from experience I can tell you that anything above 60 FPS makes no difference; I cannot feel the image being any smoother. The ones who claim they can are just lying to themselves...


----------



## Calin Banc (Jan 23, 2013)

TAZ007 said:


> And last but not least, you reckon i will notice a drop from 200 FPS to 50, but not 100FPS to 50 lol,


He was talking about a drop of 50 FPS: from 100 (to 50) or from 200 (to 150, in this case) is not the same. 



Prima.Vera said:


> Just to add a note here: anything above 30 FPS is perfect for me in games. And from experience I can tell you that anything above 60 FPS makes no difference; I cannot feel the image being any smoother. The ones who claim they can are just lying to themselves...



The FPS can move around a lot during a game session. If you want the best possible experience, then you cap it at ~60 FPS for a 60 Hz monitor, or above that for 120 Hz. For that, you'll also need a minimum of 60 FPS, which is much harder to get than an average of 60 FPS.


----------



## Slizzo (Jan 23, 2013)

jihadjoe said:


> On the bright side, the last time Nvidia had a $800 card was the 8800GTX.
> A year later they followed up with the 8800GT that offered 90% of the performance for 1/2 of the price.
> 
> Just praying that lightning can, and does strike twice.



LESS than half the price. I bought an 8800GT on launch day for $300 from CompUSA.


----------



## TheHunter (Jan 23, 2013)

I don't buy that 6GB VRAM figure; whoever started spreading this GK110 rumour pulled all that info from the Tesla K20X, with its 6GB of RAM, and with the same clocks too, which doesn't make any real sense.

I think something like 850-950MHz with 3GB seems more plausible, and I hope it's a real GK110 with all the bells and whistles, not just another GK104 with more cores on it. 

Also, $800 is a bit too much for me, but then again it's some random number xD

I would go for a GTX 770, I mean Titan. Hopefully for ~$500 max.


----------



## TAZ007 (Jan 23, 2013)

Calin Banc said:


> He was talking about a drop of 50 FPS: from 100 (to 50) or from 200 (to 150, in this case) is not the same.
> 
> 
> 
> The FPS can move around a lot during a game session. If you want the best possible experience, then you cap it at ~60 FPS for a 60 Hz monitor, or above that for 120 Hz. For that, you'll also need a minimum of 60 FPS, which is much harder to get than an average of 60 FPS.



This is what I do when I have a card that has more power than I need. But still, if I don't run Fraps whilst gaming, there is no way on earth I can tell when it's in the hundreds or when it drops down to 50 FPS; I can notice changes when FPS is between 60 and 20. That said, I'm playing FC3 at the mo on ultra, getting 30 to 50, and that's smooth enough, though in BF3 that would be not so good, but I put that down to lack of VRAM on my card, so I play that game on high. I've got a Palit GTX 680 4GB JetStream that I just got for £265 plus my card, and I will hold my hands up if I find it better with the FPS in the 100-plus, but I can't see that being the case. I will report back if I ever get it; it's stuck in the post, damn snow. 

P.S. Yes, I did misread that about the frame drops from 200, sorry, my bad.


----------



## Calin Banc (Jan 23, 2013)

I can notice frame drops below 57, 58FPS or so. FC 3 at that FPS (30-50FPS), for me, it's a stuttering mess. That shows once more how relative this can be from one person to another.


----------



## TAZ007 (Jan 23, 2013)

Calin Banc said:


> I can notice frame drops below 57, 58FPS or so. FC 3 at that FPS (30-50FPS), for me, it's a stuttering mess. That shows once more how relative this can be from one person to another.



Well, again, I can crank the AA up along with the other settings and get around 20 to 40, and it's still smooth as on FC3; I'd be more than happy to try and make a vid and run around. The only reason I turn the AA off and select SSAO is because of heat issues, and that's the only reason I'm changing my card, tbh; £265 for a Palit GTX 680 4GB JetStream was too good to turn down. Otherwise I'd be just as happy with a 7950 or 670, and I'd like the 660 Ti too if it had a little more bandwidth. But again, it's not just the hardware in play here, it's the games you play too. If you have a card with 2 gigs of RAM you should be fine, and a good fast SSD helps too; without knowing what your hardware is, it's hard to say. Just remember Blu-ray is only 24 FPS and for the best part it plays smooth; it tends to judder when the camera is panning around, but for the best part it's smooth, though a few more FPS would help out there, and I'm talking 10 FPS more.


----------



## Calin Banc (Jan 23, 2013)

Don't compare movies with games, they are using different techniques for "render". If you feel it smooth at 20-40FPS good for you, for me it isn't and there is a serious gap between 30FPS and 60FPS, even capped.


----------



## TAZ007 (Jan 23, 2013)

Calin Banc said:


> Don't compare movies with games, they are using different techniques for "render". If you feel it smooth at 20-40FPS good for you, for me it isn't and there is a serious gap between 30FPS and 60FPS, even capped.



Just had another go and maxed it all out: low of 17, high of 27, mostly 20 to 22 FPS. It was just playable but definitely not smooth, plus VRAM almost maxed out too. It's smooth with no AA and set to SSAO; it just don't look as nice, but that I can live with.


----------



## WhiteLotus (Jan 23, 2013)

Calin Banc said:


> Don't compare movies with games, they are using different techniques for "render". If you feel it smooth at 20-40FPS good for you, for me it isn't and there is a serious gap between 30FPS and 60FPS, even capped.



Question:

When you watch a film, does it feel "life like" or "suitable to be watched without any issue"?

Because films are played back at 24fps as standard; only recently did The Hobbit come out in 48fps - a format you can only watch at select theatres that have the ability to play 48fps films.

And to me, movies look just fine as they are. That's why I always "lol" at people claiming they "need" 50+ fps for good gameplay.


----------



## Finners (Jan 23, 2013)

I'm firmly in the 60 camp; BF3 and Far Cry 3 to me are not smooth below 60. Each to their own, I guess.

I don't think there is a need to be abusive towards other members, and TAZ, you mock people buying high-end cards yet have just bought a second-hand 680 for £265 + a 560 Ti 448; you conveniently missed that second part out.


----------



## TAZ007 (Jan 23, 2013)

WhiteLotus said:


> Question:
> 
> When you watch a film, does it feel "life like" or "suitable to be watched without any issue"?
> 
> ...



Tbh with you, when at the cinema I can't recall noticing any juddering or flaws; it's only in certain scenes on Blu-ray, when the camera pans left to right fast, that I see it. Could be down to the hardware or the TV. But to be fair, 3D gaming is totally different, because it's much harder to make a game look fluid - hence the reason you have motion blurring, to try and make it look more natural. I never look at how high frame rates go; I look at what the lows are, because it's minimum frame rates you should worry about.


----------



## TAZ007 (Jan 23, 2013)

Finners said:


> im firmly in the 60 camp, bf3 and far cry 3 to me are not smooth below 60. each to the own i guess.
> 
> I dont think there is a need to be abusive towards other members and TAZ you mock people buying high end cards and have just brought a second hand 680? for £265 + a 560Ti 448. you conveniently missed that second part out



Abusive? For not sharing the opinion that 200 FPS equals smooth gameplay?

Mocking? I was going to get a new AMD 7950 for £250, mostly because I need more than 1.25GB of RAM, not because I want or need more FPS, and because I intend to buy Crysis 3. So what was I mocking, as you say? Making a point is what I'd call it, and the point is that if I had one 680 now, never mind two, I'd not feel the need to sell up to buy the 780. Now if that's being abusive or mocking, then best ban me now. Otherwise, if thinking you're slightly mad for parting with £800, or whatever they will be priced at, when you already had two 680s makes me guilty, then I'm guilty as charged. As for qubit, I pointed out it was just my opinion and not to take it personally, but it sounds like you can't handle that; yet it's OK for some to argue the toss over something that is yet to be released. Maybe it's just me that's mad.

As for missing out the £265 + a 560 Ti 448, maybe you should read the post where I did not forget to mention that. And if you'd be so kind as to read every post start to finish, you would see that this is not about noticing a difference below 60 FPS, but about the difference between 100 and 200 FPS being noticeable.


----------



## Calin Banc (Jan 23, 2013)

WhiteLotus said:


> Question:
> 
> When you watch a film, does it feel "life like" or "suitable to be watched without any issue"?
> 
> ...



Like I've said, cameras use a different approach to "render" a movie compared to a video game engine. I don't know for sure and you'll have to look it up if you want to, but in a movie, the way frames are laid out within a second creates a sort of more natural motion blur (they sort of overlap), which gives the illusion of movement. In a game, those 24FPS are rendered straight in a row without overlapping. Of course, some advanced motion blur technologies in games may help, but they don't really recreate the same illusion - that's why some games like Crysis are smoother at 30FPS while others are not.

A game session filmed and then put on youtube, usually hides stutter, low FPS and frame drops pretty well. As for the 30 vs. 60 fps, take a look at this - http://boallen.com/fps-compare.html

I'm sure that if I make a guy play games for a whole day (even a few hours would be enough) at a rock-solid 60 FPS and then make him play at 25 to 30FPS, he will notice the gap in a second. A friend of mine already did exactly that while we were playing some racing games. The higher you go in FPS, the more lifelike the experience you get.


----------



## TAZ007 (Jan 23, 2013)

Calin Banc said:


> A game session filmed and then put on youtube, usually hides stutter, low FPS and frame drops pretty well. As for the 30 vs. 60 fps, take a look at this - http://boallen.com/fps-compare.html
> 
> I'm sure if I can make a guy play for whole day (even a few hours would be enough) games at a rock solid 60 FPS and then make him play at 25 to 30FPS, it will observe the gap in a second. A friend of mine already did that while we were playing some racing games. The higher you go in FPS, the more life like experience you get.



Good example. I think that's why in FC3 I can get away with lower FPS compared to BF3: most of the time the action is slow, whereas in BF3 there is a lot more going on. But as I have said above, 60 FPS is the sweet spot; past that I really don't see anything as obvious as in the link you posted here.


----------



## Calin Banc (Jan 23, 2013)

You'll probably need a 120hz monitor for that.


----------



## Slizzo (Jan 23, 2013)

WhiteLotus said:


> Question:
> 
> When you watch a film, does it feel "life like" or "suitable to be watched without any issue"?
> 
> ...





TAZ007 said:


> just had another go and maxed it all out, got low 17 high 27 and mostly 20 to 22 FPS, it was just playable but defo not smooth plus Vram almost maxed out too, smooth with no AA and set at SSAO just dont look as nice, but thats i can live with



FFS, it's been mentioned here before, but film's 24fps CANNOT be directly compared to video (and PC) games' framerates. Films use technologies such as motion blur and other tricks to make 24fps seem smooth. I hate motion blur in games so I turn it off; that may be a reason why you think 20-30FPS is smooth in a game, if you have it on.

If my game is not running at 60FPS or more, I notice a very negative impact on my gaming experience. It jolts you out of the world that you're trying to be immersed in.




Also, realize that low framerates have an impact not only on the visual experience of a game; they can also have negative impacts on the control you have in the game. Mouse and keyboard inputs can be slow/not smooth when your framerate dips.


EDIT: I also want to add, if you watch a film before they add all the post processing to make the film appear smooth, you'll see what I'm speaking of. It will look like you're watching a colorful flipbook, for lack of a better term.


----------



## qubit (Jan 23, 2013)

Slizzo said:


> FFS, it's been mentioned here before but film 24fps CANNOT be directly compared to video (and PC) games' framerates.  Films use technologies such as motion blur and other tricks to make the 24fps seem smooth. I hate motion blur in games so I turn it off; that may be a reason why you think that 20-30FPS is smooth in a game if you have it on.
> 
> If my game is not running at 60FPS or more, I notice a very negative impact on my gaming experience. It jolts you out of the world that you're trying to be immersed in.
> 
> Also, realize that low framerates have an impact on not only the visual experience of a game, it can also have negative impacts on the control you have in the game. Mouse inputs and keyboard inputs can be slow/not smooth when your framerate dips.



Very well said.

I'm considering writing a forum post or editorial about frame rate (low to the very high) judder, motion blur etc as there's a lot of misconceptions about this and I want people to understand the subject.


----------



## TAZ007 (Jan 23, 2013)

Slizzo said:


> If my game is not running at 60FPS or more, I notice a very negative impact on my gaming experience. It jolts you out of the world that you're trying to be immersed in
> 
> Also, realize that low frame rates have an impact on not only the visual experience of a game, it can also have negative impacts on the control you have in the game. Mouse inputs and keyboard inputs can be slow/not smooth when your framerate dips.



I agree with you, and I'm not saying any different. What I do not agree with is that you can see a difference between 100 FPS and 200 FPS. Below 60 FPS, yes, but in some games you can get away with it: in FC3 I'm getting 40 to 60 and it's perfectly playable; at 17 to 40, though, it is not so playable - like you said, it's much harder to control, though not so bad visually.


----------



## TSX420J (Jan 23, 2013)

In my opinion, I'd like to have the best possible setup, that way I don't have to upgrade for a few years. It would also be cool to see games so realistic that they actually utilize the power of these new cards - kind of like when Crysis came out, except I don't want a game to shit on my ultra-powerful card and then piss on it too, like Crysis did.


----------



## TAZ007 (Jan 23, 2013)

qubit said:


> Very well said.
> 
> I'm considering writing a forum post or editorial about frame rate (low to the very high) judder, motion blur etc as there's a lot of misconceptions about this and I want people to understand the subject.



Well, that might save you some time, but seeing is believing, and I trust my eyes. The thing is, without having something side by side it's going to be very hard to convince people with words, never mind prove it, and even then they might not see what you see. Same with music: some can't tell the difference between an MP3 file and FLAC, even played through an iPod vs a Cowon, with iPod headphones vs Sennheisers. It's just the way of the world; some can't see the wood for the trees, and that's me when it comes to 60 fps or more.


----------



## TAZ007 (Jan 23, 2013)

TSX420J said:


> In my opinion; I'd like to have the best possible setup, that way I don't have to upgrade for a few years. Although it would be cool to see games that are so realistic that they would utilize the power of these new cards. Kind of like when Crysis came out, except I don't want a game to shit on my ultra powerful card and then piss on it too like Crysis did.



I know what you're saying mate; I think Crysis 3 is going to be like that. BF3 was what forced me to change from the 775 platform to 1155, but I did not do my research on the 1GB vs 2GB cards, so I had to dial down the eye candy to get decent FPS.

I think NVIDIA, AMD and the game makers are all in cahoots with each other in order to get our money from us.


----------



## TSX420J (Jan 23, 2013)

TAZ007 said:


> BF3 was what force me to change from 775 platform to 1155



Me too! That and my mobo finally crapping out on me. Ended up parting out my computer about a year ago; been without a desktop for a year (waiting to see what happens with hardware). Just put together my new desktop a few weeks ago and now I'm just waiting on the video card to come out. Can't wait to play BF3 again (I was playing on the old setup, but it couldn't handle BF3 too well).



TAZ007 said:


> I think that Nvida and AMD and game makers are all in cahoots with each other in order to get our money from us



Lol, it's a conspiracy. I swear man, these companies love taking our money over and over. Look at 4K TVs: there isn't any media to support them, and people are already anxious to pay $$,$$$.$$ to watch 30-second 4K clips. LOL, we're being brainwashed.


----------



## TAZ007 (Jan 23, 2013)

TSX420J said:


> Me too! That and my mobo finally crapping out on me. Ended up parting out my computer about a year ago. Been without a desktop for a year (waiting to see what happens with hardware). Just put together my new desktop a few weeks ago and now I'm just waiting on the video card to come out. Cant wait to play BF3 again (was playing on the old setup but it couldn't handle BF3 too good).



Well, I had time out too; after BF2 all the hackers kinda got me down tbh, and I lost touch with what was going on in the PC world. So I got back into it a few weeks after BF3 came out. I bought it, and my dual core and 4850 could just about play it with everything on low, but it was pants tbh. So I got the 560 Ti and that made no difference at all, then changed to a quad core and that made a really big difference - more so than changing from the quad to the i5 2500K, to be honest. But all in all I'm happy now. Like you, I buy with the hope that it will last years rather than months; I'll be quite gutted if Crysis 3 is too much for the 680 4GB. I don't play that many games tbh, plus I suck at them lol. I should have stuck with Pac-Man.



TSX420J said:


> Lol, its a conspiracy. I swear man, these companies love taking our money over and over. Look at 4k TVs, there isn't any media to support it and people are already anxious to buy them for $$,$$$.$$ to watch 30 second 4K clips. LOL we're being brainwashed.



Yep, I know, and that's what I'm getting at with people wanting to sell up their 680s in order to get the 780. Whilst I don't get why, I'm glad they do, because it drives the price down for the rest of us who sit and wait, plus it floods the market with second-hand bargains. NVIDIA is well overpriced, but people are happy to pay for it. At the moment, though, AMD's drivers have come good, and they're now on par with the 680/670/660 in nearly every game, including BF3 where there was a big gap in FPS, making the 7950/7970 - with up to five free games bundled with some - a great buy, though they were pricey on release date, because they know some will pay it.


----------



## qubit (Jan 23, 2013)

TAZ007 said:


> well might save you some time  but seeing is believing, and i trust my eyes, the thing is without having something side by side its going to be very hard to convince people with words, never mind prove, and even then they might not see what you see, same with music, some cant tell the difference between mp3 file and FLAC, even when play through ipod vrs cowan with ipod head phone vrs sennheizer's its just the way of the world, some cant see the wood through the trees and that me when it comes to 60 fps or more



Yeah, people may not see or understand these things sometimes and that's a shame. However, this article or forum post will be drawn from my own experiences, knowledge and general reading about the subject and I know it quite well (not bragging). I do this, because I want to educate people, not score points as some mistakenly think. Unfortunately, one sometimes tends to get flamed for putting good info out there, but that's never stopped me... 

I want to stress that I'm not pointing fingers here at anyone in particular, it's just what I've generally experienced when writing articles like this.

I've got so much on nowadays, that I don't know if I'll get round to it any time soon, so best not to hold your breath!


----------



## sergionography (Jan 24, 2013)

WhiteLotus said:


> Question:
> 
> When you watch a film, does it feel "life like" or "suitable to be watched without any issue"?
> 
> ...



Yes, that's because filming is at a consistent FPS and includes no interactivity. 3D gaming renders differently. TechReport explains it in their second theory, where the GPU might stutter for half a second and then pump out all 60 frames in the second half: the reading will say 60 FPS, but for the viewer that's half a second of lag, which in a first-person shooter can be the game changer. This is also why console games are much better optimized: developers pay very close attention to optimizing for the hardware, not only for peak or average FPS but for consistent rendering, because that burst of 60 frames in the second half of the second can fill up all the memory buffers for no practical reason and cause inefficiencies. Even AMD mentioned they're still working on fully optimizing the memory on GCN for best efficiency.


----------



## Calin Banc (Jan 24, 2013)

The console world is not a perfect one; there are still moments when the FPS drops and stutters. It only feels much better when it's at a constant 30FPS, which you can manually set on PC as well.

As an example, without a cap at 59FPS, BF3 is not that smooth even though it hits 70 or 90FPS. Once I set the limit in place, all is good. At least on the PC, if a setting gives you trouble it can be left out, and after a while, with new hardware, it can be turned back on and the game gets a new life. On a console, it would be left out from the start.
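The idea behind such a cap can be sketched in a few lines (a toy Python sketch only, not how Frostbite or any driver-level limiter actually works):

```python
import time

def run_capped(render_frame, cap_fps=59, frames=60):
    """Minimal sleep-based frame limiter: pad every short frame out to
    the target frame time so pacing stays even. A sketch; real limiters
    use higher-resolution timers and account for vsync/present timing."""
    target = 1.0 / cap_fps  # ~16.9 ms per frame at a 59 FPS cap
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()  # stand-in for the game's actual frame work
        elapsed = time.perf_counter() - start
        if elapsed < target:
            time.sleep(target - elapsed)  # wait out the remainder of the frame
```

A frame that finishes early simply waits, so frame-to-frame times stay close to the cap instead of bouncing between 70 and 90 FPS.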


----------



## Prima.Vera (Jan 24, 2013)

Exactly my point. Almost all of the latest games on consoles are capped at 30fps and nobody is complaining. I think people are just getting too dependent on numbers and stupid FRAPS.


----------



## qubit (Jan 24, 2013)

Prima.Vera said:


> Exactly my point. Almost all of the latest games on consoles are capped at 30fps



That's really rubbish - laggy and juddery all the way. I'm glad I don't have a console.

I game at 120Hz with LightBoost on and it's awesome.  Yes, the difference is _highly_ visible.


----------



## Calin Banc (Jan 24, 2013)

Prima.Vera said:


> Exactly my point. Almost all of the latest games on consoles are capped at 30fps and nobody is complaining. I think people are just getting to depended on numbers and stupid FRAPS.



Yeah, and some never complained about DVD quality, or HD quality, or... Just because some gamers out there are fine with a lower-quality experience, it doesn't mean it's good enough for everyone. Or that that value of 30 can't be significantly improved, and with it, the end user's experience.


----------



## Prima.Vera (Jan 24, 2013)

qubit said:


> That's really rubbish - laggy and juddery all the way. I'm glad I don't have a console.



I played Crysis as an FPS and NFS Hot Pursuit as a racing game, and both looked OK and were very playable even though they were capped at 30fps. I have no issues either, so I think people are overreacting and exaggerating with this FPS crap discussion...


----------



## tokyoduong (Jan 24, 2013)

qubit said:


> That's really rubbish - laggy and juddery all the way. I'm glad I don't have a console.
> 
> I game at 120Hz with LightBoost on and it's awesome.  Yes, the difference is _highly_ visible.



I guess you must never go to theaters and watch regular TV?


----------



## qubit (Jan 24, 2013)

Prima.Vera said:


> I played Crysis as a FPS and NFS Hot Pursuit as a race game, and both looked OK and very playable even if they were capped at 30fps. I have no issues either, so I think people are overreacting and exaggerating with this FPS crap discussion...



Have you ever played these games vsynced at 60Hz or 120Hz? I suspect you haven't. If you haven't, then that 30Hz cap might well seem OK to you.

In addition, LightBoost makes a _very_ big difference by eliminating motion blur. The effect is nothing short of awesome. You basically get to have your cake and eat it.


----------



## Aquinus (Jan 24, 2013)

qubit said:


> That's really rubbish - laggy and juddery all the way. I'm glad I don't have a console.



So I take it you find Blu-ray video "laggy and jittery" as well, because it's capped at either 24p, 25p, 23.976p, or 29.97i and will never exceed 30 FPS (sans 29.97i, but even that might jump because you have to deinterlace the video).

I think that what people perceive as "jittery and laggy" is actually the render time from frame to frame. Just because you're running at 30 FPS doesn't mean every frame is rendered in the same amount of time. If you render 30 frames in 1 second, but the first 25 get rendered in the first three quarters of a second and the last 5 take the last quarter of a second, you have changes in frame rate that cause the jitteriness you describe, and you have a reduced framerate for that quarter of a second that introduces lag (and if it recurs often, it introduces more jitter).

So all in all, the frame-rate argument is dumb. The only reason 60FPS and 120FPS feel "smoother" is that the difference in render time from frame to frame is that much smaller. I'm willing to bet that between a 30 FPS game with equal render times from frame to frame and a 60 FPS game without them, the 30 FPS game has the potential to feel smoother, if not just as smooth, because the instantaneous frame rate would be consistent.

The most important thing to take away from this is:
Average frame rate is not the same thing as instantaneous frame rate and the variation between frame render times. Also keep in mind that it will be hard to make this perfect: as scenes change, the render time can change as well.
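The 25-frames-then-5-frames example is easy to check with toy numbers (hypothetical frame times, not a real capture):

```python
# Two one-second runs, both averaging 30 FPS, with very different pacing.
even_ms   = [1000.0 / 30] * 30        # perfectly even ~33.3 ms frames
bursty_ms = [30.0] * 25 + [50.0] * 5  # 25 frames in 750 ms, then 5 in 250 ms

def avg_fps(frame_times_ms):
    # Frames divided by total elapsed time -- what a FRAPS-style counter reports.
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

print(avg_fps(even_ms), max(even_ms))      # ~30 FPS average, ~33.3 ms worst frame
print(avg_fps(bursty_ms), max(bursty_ms))  # ~30 FPS average, 50.0 ms worst frame
```

Both runs read 30 FPS on a counter, but the second spends a quarter of every second at an effective 20 FPS - which is exactly the jitter being described.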


----------



## qubit (Jan 24, 2013)

Aquinus said:


> So I take it that you find Blu-Ray video "laggy and *jittery*" as well because it's capped at either 24p, 25p, 23.976p, or 29.97i and will never exceed 30 FPS (Sans 29.97i but even that might jump because you have to deinterlace the video).
> 
> I think that what people perceive as "jittery and laggy" is actually the render time from frame to frame. Just because you're running at 30 FPS doesn't mean every frame is rendered in the same amount of time. If you render 30 frames in 1 second, but the first 25 get rendered in the first three quarters of a second and the last 5 take the last quarter of a second, you have changes in frame rate that cause the jitteriness you describe, and you have a reduced framerate for that quarter of a second that introduces lag (and if it recurs often, it introduces more jitter).
> 
> ...



Wrong, wrong, wrong. Seriously, people, what's so difficult to understand? I suspect there's a certain amount of denialism going on here.

I might just write that article on framerate sooner rather than later.


----------



## Aquinus (Jan 24, 2013)

qubit said:


> Wrong. Wrong. Wrong. Seriously people, I don't see what's so difficult to understand? I suspect that there's a certain amount of denialism going on here.
> 
> I might just write that article on framerate sooner rather than later.



Then do it, because it seems like most people disagree with you. I'll read it if you write it. I'm not saying that 60 and 120 FPS aren't smoother. I'm just saying that the jitter people notice at 30 FPS is more likely inconsistent render times rather than the rate itself. That has nothing to do with other rates.


----------



## qubit (Jan 24, 2013)

Aquinus said:


> Then do it, because it seems like most people disagree with you. I'll read it if you write it. I'm not saying that 60 and 120 FPS aren't smoother. I'm just saying that the jitter people notice at 30 FPS is more likely inconsistent render times rather than the rate itself. That has nothing to do with other rates.



I doubt that "most" people disagree. There are just a few vocal ones who insist on trying to negate what I'm saying, lol.

This isn't rocket science. I've seen all this stuff for myself and the effects are all very obvious, so I know I'm right. There's no way someone can "prove" me wrong with a counter argument, as it's inevitably flawed.


----------



## NeoXF (Jan 24, 2013)

Crap Daddy said:


> Here's a better comparison for you, GTX690 being 53% faster than the GE at 1200p
> 
> http://tpucdn.com/reviews/AMD/HD_7970_GHz_Edition/images/perfrel_1920.gif
> 
> Now, math is not my strong point but I reckon going by the presumed 85% performance of the Titan (ium) compared to GTX690, this GK110 based card would be 45% faster than the 7970GE.



Aside from what other people stated about 85% of a GTX 690, I'd also point out that those are pre-Catalyst 12.11b/13.1 WHQL scores, so the difference might be less than 30%... Even so, I take most of these wide-margin calculations with a shovelful of salt.
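For reference, the arithmetic behind those figures (a sketch taking both the rumoured 85% and the quoted 53% lead at face value) works out like this:

```python
# Relative performance, HD 7970 GHz Edition = 1.00 baseline.
ghz_edition = 1.00
gtx_690 = 1.53 * ghz_edition  # GTX 690 quoted as 53% faster at 1920x1200
titan   = 0.85 * gtx_690      # Titan rumoured at 85% of a GTX 690

# 0.85 * 1.53 = 1.3005 -> roughly 30% faster than the GHz Edition, not 45%.
print(f"{titan / ghz_edition - 1:.0%}")
```

So even before the newer Catalyst drivers are factored in, the chain of rumours points at ~30%, not 45%.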


----------



## Calin Banc (Jan 24, 2013)

Aquinus said:


> So I take it that you find Blu-Ray video "laggy and jittery" as well because it's capped at either 24p, 25p, 23.976p, or 29.97i and will never exceed 30 FPS (Sans 29.97i but even that might jump because you have to deinterlace the video).
> 
> I think that what people perceive as "jittery and laggy" is actually the render time from frame to frame. Just because you're running at 30 FPS doesn't mean every frame is rendered in the same amount of time. If you render 30 frames in 1 second, but the first 25 get rendered in the first three quarters of a second and the last 5 take the last quarter of a second, you have changes in frame rate that cause the jitteriness you describe, and you have a reduced framerate for that quarter of a second that introduces lag (and if it recurs often, it introduces more jitter).
> 
> ...



Again, don't compare a movie with a game; the frame rates are "packed" waaaay differently, Earth-to-Moon differently. Besides, I believe we all want that virtual image, that virtual experience, to be more lifelike, no? That also needs a high frame rate, in just the same way a movie is more true to life closer to 60 than at 24.

Plus, a higher frame rate gives better control over your avatar.


----------



## tokyoduong (Jan 24, 2013)

Calin Banc said:


> Again, don't compare a movie with a game, the frame rates are "packed" waaaay different, Earth to Moon different. Besides, I believe we all want that virtual image, that virtual experience to be more life like, no? That also needs a high frame rate, just in the same way, a movie is more true to life close to 60 rather than 24.
> 
> Plus, a higher frame rate gives better control over your avatar.



Higher framerates can certainly lower frame latencies. But as the TechReport reviews have shown, latency can go from 2ms between frames all the way to 50+ms between frames and still maintain a high average frame rate. There's no special "packed" way a movie does it; it's just a smooth, consistent frame rate throughout. That is why 24 fps works in theaters and 30/60 fps works for TVs. Nobody ever complained about that! Your brain will adjust to a consistent frame rate (e.g. you can't see fluorescent lights blinking). Inconsistent frame rates need a minimum latency for you to perceive them as smooth. I think anything below 16.7ms (60Hz) is hard to detect. I can tell a slight difference between 60 and 75Hz, but not too many people can.
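The frame-time figures quoted here come straight from the FPS-to-milliseconds conversion, which is worth spelling out (plain arithmetic, nothing vendor-specific):

```python
def frame_time_ms(fps):
    """Time between frames at a perfectly steady frame rate, in ms."""
    return 1000.0 / fps

for fps in (24, 30, 60, 75, 120):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.1f} ms per frame")
# 60 FPS works out to ~16.7 ms, the threshold mentioned above;
# 120 Hz halves that to ~8.3 ms.
```

Any individual frame that takes much longer than its steady-rate budget is what registers as a stutter, regardless of the average.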



qubit said:


> Wrong. Wrong. Wrong. Seriously people, I don't see what's so difficult to understand? I suspect that there's a certain amount of denialism going on here.
> 
> I might just write that article on framerate sooner rather than later.



Write it and stop wasting data and bandwidth posting this crap. You're just being rude and annoying.


----------



## Slizzo (Jan 24, 2013)

Aquinus said:


> So I take it that you find Blu-Ray video "laggy and jittery" as well because it's capped at either 24p, 25p, 23.976p, or 29.97i and will never exceed 30 FPS (Sans 29.97i but even that might jump because you have to deinterlace the video).
> 
> I think that what people perceive as "jittery and laggy" is actually the render time from frame to frame. Just because you're running at 30 FPS doesn't mean every frame is rendered in the same amount of time. If you render 30 frames in 1 second, but the first 25 get rendered in the first three quarters of a second and the last 5 take the last quarter of a second, you have changes in frame rate that cause the jitteriness you describe, and you have a reduced framerate for that quarter of a second that introduces lag (and if it recurs often, it introduces more jitter).
> 
> ...





Aquinus said:


> Then do it, because it seems like most people disagree with you. I'll read it if you write it. I'm not saying that 60 and 120 FPS aren't smoother. I'm just saying that the jitter people notice at 30 FPS is more likely inconsistent render times rather than the rate itself. That has nothing to do with other rates.



I swear I'm going to shoot the next person that tries to compare movie framerates to videogame framerates.



Calin Banc said:


> Again, don't compare a movie with a game, the frame rates are "packed" waaaay different, Earth to Moon different. Besides, I believe we all want that virtual image, that virtual experience to be more life like, no? That also needs a high frame rate, just in the same way, a movie is more true to life close to 60 rather than 24.
> 
> Plus, a higher frame rate gives better control over your avatar.



Apparently we can try to inform these people until we're blue in the face but they'll keep skipping over what we're saying (and backing up with facts) and going on comparing movies/TV to games...


----------



## tokyoduong (Jan 24, 2013)

Slizzo said:


> I swear I'm going to shoot the next person that tries to compare movie framerates to videogame framerates.



whoa, raging much? I agree with you but frame rates are not that serious. Let's just solve world hunger or cure cancer


----------



## HammerON (Jan 24, 2013)

I think it is time to move on folks. Feel free to continue this conversation about frame rates in a new thread or in PM's.
Carry on


----------



## qubit (Jan 24, 2013)

tokyoduong said:


> Write it and stop wasting data and bandwidth posting this crap. You're just being rude and annoying.



The only people being rude, annoying and thread crapping are those like yourself who take a pop at me for no good reason. 

Anyway, it's the wrong thread to talk about framerates here, like HammerON said.


----------



## Calin Banc (Jan 24, 2013)

tokyoduong said:


> There's no special "packed" way a movies does



I'm only going to say this once and be done with it: yes, there is. The frames overlap in such a way that it fools the eye. I saw that someplace, but I can't find it any more. Believe it or don't, your choice.

Yeah, I can settle for 30FPS locked if I have to, but I don't like it, even if it's a perfect 33.3ms every frame. I want a lifelike experience, a faster response from me and a prompt response from my character. That is given by a high FPS of at least 60. That rocks my boat.

And why on God's green Earth do people think this card would be overkill? My 2500 @ 4.5GHz and 7950 (1170/1600MHz - somewhere equal to a 7970 GHz Edition/GTX 680) can't give a solid 60FPS in all games, even at 1680x1050, if I choose the highest in-game settings. There is never too much performance; it's just too expensive to get it.


----------



## jihadjoe (Jan 25, 2013)

The issue I have with that 85% of a 690 comment is that there's no unit of measurement.

Is it 85% in game performance? Then that pretty much says right there how the GK110 card will perform.

Is it 85% of the render/shader performance? In this case the actual in-game performance might be even better than the 690, because there aren't any SLI scaling issues to deal with.

I'm expecting this to play out much like 560Ti vs 580, which is really where GK104 probably stands against GK110.


----------



## Am* (Jan 25, 2013)

This had better have all 15 SMX units enabled. This card should've launched a year ago; I doubt anybody is going to want or accept anything less than a perfect card, especially at the rumoured price of $900... NVIDIA have had a year to get their shit together, so this had better be their full 2880-shader GPU.


----------



## TAZ007 (Jan 25, 2013)

Calin Banc said:


> I'm only going to say it once and be done with it: yes, there is. The frames overlap in such a way that fools the eye. I saw that someplace, but I can't find it any more. Believe or don't believe, your choice.



Yes, film makers use tricks like showing the same frame 4 times. Have a read here and it will help you get a better understanding of both film and games. For the rest, it's sometimes better to agree to disagree: beauty is in the eye of the beholder, and the same can be said for FPS. We are all different, hence why we feel so strongly about our own views; it's because that's how each of us sees things.


----------



## HammerON (Jan 25, 2013)

Calin Banc said:


> I'm only going to say it once and be done with it: yes, there is. The frames overlap in such a way that fools the eye. I saw that someplace, but I can't find it any more. Believe or don't believe, your choice.
> 
> Yeah, I can settle with 30 FPS locked if I have to, but I don't like it, even if it's a perfectly paced 33.3 ms per frame. I want a lifelike experience: faster response from me and prompt response from my character. That is given by a high FPS of at least 60. That rocks my boat.
> 
> And why on God's green Earth do people think this card would be overkill? My 2500 @ 4.5 GHz and 7950 (1170/1600 MHz, somewhere equal to a 7970 GHz ed./GTX 680) can't give a solid 60 FPS in all games even at 1680x1050 if I choose the highest in-game settings. There is never too much performance, just too expensive to get it.





TAZ007 said:


> Yes, film makers use tricks like repeating the same frame four times; read up on it and it will help give a better understanding of both film and games. Beyond that, it's sometimes better to agree to disagree. Beauty is in the eye of the beholder, and the same can be said for FPS, cos we are all different; hence why we feel so strongly about our own view  it's cos that's how we see things



Last warning. Any more discussion about frame rates/movies will lead to an infraction.


----------



## Mombasa69 (Jan 25, 2013)

*Worth the wait*

I'll certainly get one, I deliberately skipped the 600 series as they weren't much of an upgrade compared to my 3 570's, and only a retard would upgrade EVERY year.


----------



## qubit (Jan 25, 2013)

Mombasa69 said:


> I'll certainly get one, I deliberately skipped the 600 series as they weren't much of an upgrade compared to my 3 570's, and only a retard would upgrade EVERY year.



Would you pay $900 for it though? I'd have to be very well off to spend that much on a card.


----------



## Calin Banc (Jan 25, 2013)

If it's close to a gtx690 performance, some will. At least you'll get a much better experience and there is no need for SLI profiles anymore.


----------



## erocker (Jan 25, 2013)

Calin Banc said:


> If it's close to a gtx690 performance, some will. At least you'll get a much better experience and there is no need for SLI profiles anymore.



I don't get this... You're saying that with every new generation of cards, the price should increase almost twofold? 

I remember when the 5870 replaced the 4870... The performance was almost double and the price was actually cheaper!


----------



## Slizzo (Jan 25, 2013)

Calin Banc said:


> If it's close to a gtx690 performance, some will. At least you'll get a much better experience and there is no need for SLI profiles anymore.



I don't think it makes sense at this price point. If it's coming out to pre-empt AMD's 8 series cards, it should be priced around what the GTX 680 was when it was released.

I suppose this also depends on what this card is branded as.


----------



## NdMk2o1o (Jan 25, 2013)

Mombasa69 said:


> I'll certainly get one, I deliberately skipped the 600 series as they weren't much of an upgrade compared to my 3 570's, and only a retard would upgrade EVERY year.



Only a retard would use such a sweeping generalisation  but seriously, I probably do upgrade once a year. I have a Sapphire Vapor-X 7950 (overclocks to 1200/1450 on stock volts). 

I buy mainly mid-to-high end (last few cards: GTX 470, GTX 570, HD 7950), so to some that's a marginal upgrade every generation. But the way I look at it is that I sold my 470 for 2/3 the cost of the 570, and the 570 was better at stock, able to far exceed the 470 OC, used less power, ran cooler, etc. The same is true when I sold my 570 and went with my current 7950: I paid a small upgrade fee after selling my card and benefited from a newer architecture and far greater performance once overclocking is taken into consideration. And of course it's the latest-gen hardware, so it will hold the same kind of resale value as the previous cards when the new generation comes out, enabling me to again upgrade for a marginal expense whilst having the latest-gen hardware. It's a no-brainer to me.


----------



## Calin Banc (Jan 25, 2013)

Nope erocker, the huge rip-off price for the 7xxx series kept me away from it until the magic driver and bundle came true. What I'm saying, guys, is that no matter how high the price is, there are folks that will pay... at least if the performance is right. 

At the same time, if this launches at the same price point as the GTX 680, it will mean price drops for every card underneath and the killing of the GTX 680/690. Sure, that should happen if we are talking about a new series, but at least for now, we know too little. If AMD launches at a better price/performance, then Nvidia lowers its prices (they have a margin to play with); if not... not. It's a win-win for them.


----------



## qubit (Jan 25, 2013)

erocker said:


> I don't get this.. You're saying with every new generation of cards, the price should increase by almost twofold?
> 
> I remember when the 5870 replaced the 4870... The performance was almost double and the price was actually cheaper!



Yeah, it's annoying how the two graphics camps appear to have reversed the trend of a fixed price point (or lower one) for the top models while delivering better performance every generation. The price has been creeping up for some time now, while performance improvements have only been incremental. Presumably thermal, wattage and physical GPU & card size limits are to blame for this trend?

I paid the full £400 for my GTX 580 and that was too steep as it is. I'm not paying £500 for the next one, no way.

I still like that kitty. I'm a sucker for them.


----------



## sergionography (Jan 25, 2013)

jihadjoe said:


> The issue I have with that 85% of a 690 comment is that there's no unit of measurement.
> 
> Is it 85% of in-game performance? Then that pretty much says right there how the GK110 card will perform.
> 
> ...



It's whichever inflates the results, so in this case 85% of game performance, or maybe even worse, 85% of compute capability, which GK104 has crippled!
However, it's safe to say it's 25-30% faster than the GTX 680, otherwise it's pointless to release. It's common knowledge now that with the bigger-die, higher-end models you lose the efficiency advantage of the middle or lower-end models of the same architecture, as these are tuned for highest performance rather than performance per watt. But with a high core count and low clock speed, that might actually be different; it's just a matter of finding the best die size/performance/efficiency ratio. From there, based on cost, they can prioritize a smaller die at higher clocks (less expensive, but more leakage) or a bigger die at lower clocks (more expensive, less leakage). I just wish there were reviews looking into power-consumption scaling with different clocks on the GCN and Kepler architectures.
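To illustrate that trade-off, here's a rough first-order sketch, assuming dynamic power scales as C·V²·f (the textbook CMOS approximation; all the numbers below are illustrative, not measured data):

```python
# First-order dynamic power model: P ~ C * V^2 * f.
# C (switched capacitance) stands in for die size here; values are made up.
def relative_power(c_die: float, volts: float, freq_ghz: float) -> float:
    return c_die * volts**2 * freq_ghz

# Big die at low clocks/voltage vs small die pushed to high clocks/voltage.
big_slow = relative_power(c_die=1.5, volts=1.0, freq_ghz=0.85)      # ~1.28
small_fast = relative_power(c_die=1.0, volts=1.175, freq_ghz=1.10)  # ~1.52
print(big_slow, small_fast)
```

Because voltage enters squared (and higher clocks usually need higher voltage), the wide-and-slow configuration can come out ahead on power despite the larger die, which is the point being made above.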


----------



## TAZ007 (Jan 25, 2013)

Mombasa69 said:


> I'll certainly get one, I deliberately skipped the 600 series as they weren't much of an upgrade compared to my 3 570's, and only a retard would upgrade EVERY year.



Or someone just wanting a cooler, less power-hungry, all-in-one solution. Many get rid of their SLI setup for a single-card solution; they're not retarded, thank you


----------



## Mombasa69 (Jan 26, 2013)

qubit said:


> Would you pay $900 for it though? I'd have to be very well off to spend that much on a card.



You do have a point. That's £569.62 in my currency, pound sterling, and we also have value added tax, another 20%, so £683.54, or $1,079.93 USD (minus delivery charges ofc). SIGH, that is expensive.
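The conversion works out as follows; a quick sketch, where the $/£ rate of ~0.633 is simply the one implied by the post's own figures (rates vary daily):

```python
USD_PRICE = 900.00
USD_TO_GBP = 0.63291  # implied by the quoted £569.62; an assumption, not a live rate
VAT = 0.20            # UK VAT

gbp = USD_PRICE * USD_TO_GBP       # ~569.62
gbp_with_vat = gbp * (1 + VAT)     # ~683.54
print(round(gbp, 2), round(gbp_with_vat, 2))  # 569.62 683.54
```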


----------



## NeoXF (Jan 26, 2013)

Personally, I'm hoping to get a pair of GTX 660s soon, before the newer "gen" stuff kicks in and I start having enthusiast's remorse or something. Like, I feel going for Socket 2011 is a bad idea since Haswell is so close, and Steamroller not far behind either... but the news of a possible 8-core iteration (hence an upgrade path) calmed that down.

After that, I might sell off one of the 660s (2014+), get a high-end single-GPU Radeon HD 9000 (hopefully a generation that will be more high-res friendly) and keep the other 660 for PhysX (LOL) and 3D movies and so on, since I have a BenQ XL2420TX, which is much more nVidia 3D-friendly than AMD.

All in all, I'd get, for the time being, very close to GeForce Titan performance for a lot less...


----------



## sergionography (Jan 27, 2013)

NeoXF said:


> Personally, I'm hoping to get a pair of GTX 660s soon, before the newer "gen" stuff kicks in and I start having enthusiast's remorse or something. Like, I feel going for Socket 2011 is a bad idea since Haswell is so close, and Steamroller not far behind either... but the news of a possible 8-core iteration (hence an upgrade path) calmed that down.
> 
> After that, I might sell off one of the 660s (2014+), get a high-end single-GPU Radeon HD 9000 (hopefully a generation that will be more high-res friendly) and keep the other 660 for PhysX (LOL) and 3D movies and so on, since I have a BenQ XL2420TX, which is much more nVidia 3D-friendly than AMD.
> 
> All in all, I'd get, for the time being, very close to GeForce Titan performance for a lot less...



Um, no. The only time a dual-GPU solution was better than the top-end single GPU was during the GTX 400 series, where the 480 ran super hot and couldn't overclock, so two GTX 460s totaled more cores at much higher clocks (my GTX 460 was stock-overclocked at 800 MHz).
In this case, however, two GTX 660s won't even beat Titan in theory, let alone in the practical world with SLI, lol


----------



## NeoXF (Jan 28, 2013)

sergionography said:


> Um, no. The only time a dual-GPU solution was better than the top-end single GPU was during the GTX 400 series, where the 480 ran super hot and couldn't overclock, so two GTX 460s totaled more cores at much higher clocks (my GTX 460 was stock-overclocked at 800 MHz).
> In this case, however, two GTX 660s won't even beat Titan in theory, let alone in the practical world with SLI, lol



I don't remember stating anywhere that they'd be faster than this so-called "Titan".
Do your research; two GTX 660s are QUITE a bit faster than a GTX 680, while being cheaper (where I live) too.


----------



## Slizzo (Jan 28, 2013)

NeoXF said:


> I don't remember stating anywhere that they'd be faster than this so-called "Titan".
> Do your research; two GTX 660s are QUITE a bit faster than a GTX 680, while being cheaper (where I live) too.



That's usually the case with any mid-range solution. The GTX 460 and GTX 560 Ti in SLI were faster than the GTX 480 and GTX 580, respectively.

However I'd still rather have the single GTX480 or GTX580. Won't have to deal with scaling issues, and power/noise/heat requirements are lower for the single card.


----------



## Fluffmeister (Feb 17, 2013)

GTX Titan pictured:

http://videocardz.com/39618/nvidia-geforce-gtx-titan-pictured


----------



## DarkOCean (Feb 17, 2013)

Fluffmeister said:


> GTX Titan pictured:
> 
> http://videocardz.com/images/2013/02/GeForce-GTX-Titan-Picture.jpg
> http://videocardz.com/images/2013/02/GeForce-GTX-Titan-Back.jpg
> ...



no backplate?

edit:
no IHS ...nice


----------



## radrok (Feb 17, 2013)

That confirms we won't see custom PCBs from AIBs then.

Can't wait to preorder a couple. Been sitting on these 6990s for too long.


----------



## dj-electric (Feb 17, 2013)

I can almost hear w1zz giggling to this thread.


----------



## NHKS (Feb 17, 2013)

radrok said:


> That confirms we won't see custom PCBs from AIBs then.
> 
> Can't wait to preorder a couple. Been sitting on these 6990s for too long.


 quite likely the case... 
but that shouldn't stop you from water-cooling the card. EK is already asking people to vote for the design of its upcoming block (for Titan):

http://thinkcell.ekwb.com/idea/new-full-cover-block-design---choose-your-best
http://www.ekwb.com/shop/blocks/vga-blocks/fc-geforce/geforce-titan-series.html

a discussion thread has been started for this: http://www.techpowerup.com/forums/showthread.php?p=2847475#post2847475


----------



## Xzibit (Feb 17, 2013)

DarkOCean said:


> no backplate?



The GTX 690 didn't have a backplate either.

Doesn't look all that Titan-like compared to the GTX 690. They should have gone with a slightly more aggressive look. Gunmetal, maybe, to make it look more rugged than the 690. The blower design makes it look a bit too plain. The GTX 690 was okay since it had a big fan in the middle. The blower design leaves a lot of window to watch the dust accumulate in the heatsink. Maybe an etched design or something to spice it up.


----------



## Fluffmeister (Feb 17, 2013)

It's a beautifully engineered card and is bound to run suitably cool and quiet for a card with this level of performance.

The shroud also comes off easy enough for quick cleaning.


----------



## radrok (Feb 17, 2013)

NHKS said:


> quite likely the case...
> but that shouldn't stop you from water-cooling the card. EK is already asking people to vote for the design on its upcoming block(for Titan)
> 
> http://thinkcell.ekwb.com/idea/new-full-cover-block-design---choose-your-best
> ...



EK? No thanks, I think I'll pass 

I already have Watercool/Aquacomputer in mind.


----------



## DarkOCean (Feb 17, 2013)

Xzibit said:


> GTX 690 didnt have a backplate either.


The 690 doesn't have any RAM chips on the back, so one isn't needed at all, but for the Titan it would've been nice.


----------



## radrok (Feb 17, 2013)

DarkOCean said:


> The 690 doesn't have any ram chips on the back so isn't needed at all but for the Titan it would've been nice.



It's GDDR5, it doesn't need more than case airflow to stay cool


----------



## DarkOCean (Feb 17, 2013)

radrok said:


> It's GDDR5, it doesn't need more than case airflow to stay cool



Maybe. I only said it would've been nice and better looking , and better cooling never hurts (for example, I loved the 65 nm GTX 260's look), imho.


----------



## dj-electric (Feb 17, 2013)

The Titan begs for a backplate. For aesthetics, at least.


----------



## Filiprino (Feb 17, 2013)

Ahh... that design reminds me of the good old 8800 GTS days.


----------



## HumanSmoke (Feb 17, 2013)

Nvidia did pretty well on keeping a clamp on leaks this time around. The big reveal has at least some anticipation attached to it.


----------



## LAN_deRf_HA (Feb 17, 2013)

NHKS said:


> quite likely the case...
> but that shouldn't stop you from water-cooling the card. EK is already asking people to vote for the design on its upcoming block(for Titan)
> 
> http://thinkcell.ekwb.com/idea/new-full-cover-block-design---choose-your-best
> ...



I hope people send them a message and vote 3, but you have to sign up to vote so they'll just get votes from people who already bought circle stuff and want it to match up.


----------



## AsRock (Feb 17, 2013)

Fluffmeister said:


> It's a beautifully engineered card and is bound to run suitably cool and quiet for a card with this level of performance.
> 
> The shroud also comes off easy enough for quick cleaning.
> 
> http://img109.imageshack.us/img109/1732/capture6l.jpg



Expected more of a cooler than this, myself. I do think it is beautifully engineered, but these coolers need a way for the user to clean the dust better, as it will build up right at the front of the cooler.


----------



## TheoneandonlyMrK (Feb 17, 2013)

AsRock said:


> Expected more of a cooler than this, myself. I do think it is beautifully engineered, but these coolers need a way for the user to clean the dust better, as it will build up right at the front of the cooler.



Looks very easy to clean to me, and a genuine pic imho


----------



## Prima.Vera (Feb 18, 2013)

Long card is long. Longer than 28 cm??


----------



## qubit (Feb 18, 2013)

So what time GMT does the NDA lift? Is it 6pm?


----------



## TheHunter (Feb 18, 2013)

^
Apparently it's postponed to tomorrow..

US Presidents' Day, lol


----------



## DarkOCean (Feb 18, 2013)

I've heard it was postponed to the 19th, with reviews on 21 Feb.


----------

