Monday, January 21st 2013
NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"
2013 started off on a rather dull note for the PC graphics industry. NVIDIA launched its game console platform "Project: Shield," while AMD rebranded its eons-old GPUs as the Radeon HD 8000M series. That could all change in late February, with the arrival of a new high-end single-GPU graphics card based on NVIDIA's GK110 silicon, the same big chip that goes into the company's Tesla K20 compute accelerator.
NVIDIA may have drawn some flak for extending its "GTX" brand too far into the mainstream and entry-level segments, and wants its GK110-based card to stand out. It is reported that NVIDIA will carve out a new brand extension, the GeForce Titan. Incidentally, the current fastest supercomputer in the world bears that name (Cray Titan, located at Oak Ridge National Laboratory). The GK110 silicon physically packs 15 SMX units, totaling 2,880 CUDA cores. The chip features a 384-bit wide GDDR5 memory interface.
Source:
SweClockers
203 Comments on NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"
In addition, LightBoost makes a very big difference by eliminating motion blur. The effect is nothing short of awesome. You basically get to have your cake and eat it. :D
I think that what people think of as "jittery and laggy" is actually the variation in render time from frame to frame. Just because you're running at 30 FPS doesn't mean that every frame is rendered in the same amount of time. If you render 30 frames in 1 second, but the first 25 get rendered in the first three quarters of a second and the last 5 take the final quarter, you have changes in frame rate that cause the jitteriness you describe, and you have a reduced frame rate for that quarter of a second that introduces lag (and if it recurs often, it introduces more jitter).
So all in all, the frame-rate argument is dumb. The only reason 60 FPS and 120 FPS feel "smoother" is that the variation in render time from frame to frame is that much smaller. So I'm willing to bet that a 30 FPS game with equal render times from frame to frame, versus a 60 FPS game without them, has the potential to feel just as smooth if not smoother, because the instantaneous frame rate would be consistent.
The most important thing to take away from this is:
Average frame rate is not the same thing as instantaneous frame rate and the variation between frame render times. Also keep in mind that it will be hard to make it perfect: as scenes change, the render time can change as well.
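To put some (hypothetical, made-up) numbers behind the point above: the two runs below both average exactly 30 FPS over one second, but their instantaneous frame rates differ wildly. The 25-fast-then-5-slow split mirrors the example in the earlier post.

```python
# Two hypothetical runs, each rendering 30 frames in 1 second (avg 30 FPS).
even_run = [1.0 / 30] * 30                      # every frame takes ~33.3 ms
uneven_run = [0.75 / 25] * 25 + [0.25 / 5] * 5  # 25 fast frames (30 ms), then 5 slow ones (50 ms)

def average_fps(frame_times):
    # frames rendered divided by total elapsed time
    return len(frame_times) / sum(frame_times)

def worst_instantaneous_fps(frame_times):
    # the slowest single frame sets the worst-case instantaneous rate
    return 1.0 / max(frame_times)

print(average_fps(even_run), worst_instantaneous_fps(even_run))      # ~30 and ~30
print(average_fps(uneven_run), worst_instantaneous_fps(uneven_run))  # ~30 but only 20
```

Both runs report the same average FPS, yet the uneven run dips to 20 FPS during its slow stretch, which is exactly the jitter and lag being described.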
I might just write that article on framerate sooner rather than later.
This isn't rocket science. I've seen all this stuff for myself and the effects are all very obvious, so I know I'm right. There's no way someone can "prove" me wrong with a counter argument, as it's inevitably flawed.
Plus, a higher frame rate gives better control over your avatar.
Carry on
Anyway, it's the wrong thread to talk about framerates here, like HammerON said.
Yeah, I can settle for 30 FPS locked if I have to, but I don't like it, even if it's a perfect 33.3 ms per frame. I want a lifelike experience: faster response from me and prompt response from my character. That comes with a high FPS of at least 60. That rocks my boat.
And why on God's green Earth do people think this card would be overkill? My 2500 @ 4.5 GHz and 7950 (1170/1600 MHz - roughly equal to a 7970 GHz Edition/GTX 680) can't give a solid 60 FPS in all games even at 1680x1050 if I choose the highest in-game settings. There is never too much performance, it's just too expensive to get it. :)
Is it 85% of in-game performance? If so, that pretty much says right there how the GK110 card will perform.
Is it 85% of the render/shader performance? In this case the actual in-game performance might be even better than the 690, because there aren't any SLI scaling issues to deal with.
I'm expecting this to play out much like 560Ti vs 580, which is really where GK104 probably stands against GK110.
I'll certainly get one, I deliberately skipped the 600 series as they weren't much of an upgrade compared to my 3 570's, and only a retard would upgrade EVERY year.
I remember when the 5870 replaced the 4870... The performance was almost double and the price was actually cheaper!
I suppose this also depends on what this card is branded as.
I buy mainly mid-to-high end (last few cards: GTX 470, GTX 570, HD 7950), so to some that's a marginal upgrade every generation. But the way I look at it: I sold my 470 for 2/3 the cost of the 570, and the 570 was better at stock, able to far exceed the 470 OC, used less power, ran cooler, etc. The same was true when I sold my 570 and went with my current 7950. I paid a small upgrade fee after selling my card and benefited from a newer architecture and far greater performance once overclocking is taken into consideration. And since it's the latest-gen hardware, it will hold the same kind of resale value as the previous cards when the next generation comes out, letting me upgrade again for a marginal expense while keeping the latest-gen hardware. It's a no-brainer to me.
At the same time, if this launches at the same price point as the GTX 680, it will mean price drops for every card underneath and the killing of the GTX 680/690. Sure, that should happen if we are talking about a new series, but at least for now we know too little. If AMD launches at a better price/performance, then NVIDIA lowers its prices (they have margin to play with); if not, not. It's a win-win for them.
I paid the full £400 for my GTX 580 and that was too steep as it is. I'm not paying £500 for the next one, no way.
I still like that kitty. I'm a sucker for them. :D