Monday, January 21st 2013
NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"
2013 started off on a rather dull note for the PC graphics industry. NVIDIA launched its game console platform "Project: Shield," while AMD rebranded its eons-old GPUs to the Radeon HD 8000M series. That could all change in late February, with the arrival of a new high-end single-GPU graphics card based on NVIDIA's GK110 silicon, the same big chip that goes into the company's Tesla K20 compute accelerator.
NVIDIA may have drawn some flak for stretching its "GTX" brand too far into the mainstream and entry-level segments, and wants its GK110-based card to stand out. It is reported that NVIDIA will carve out a new brand extension, the GeForce Titan. Incidentally, the current fastest supercomputer in the world bears that name (Cray Titan, located at Oak Ridge National Laboratory). The GK110 silicon physically packs 15 SMX units, totaling 2,880 CUDA cores. The chip features a 384-bit wide GDDR5 memory interface.
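The core count follows straight from the SMX count. A minimal back-of-the-envelope sketch: Kepler's 192 CUDA cores per SMX is the published layout, while the 6 Gbps effective GDDR5 speed below is only an assumed figure for illustration, not something from the report:

```python
# GK110 arithmetic as reported above; the memory data rate is an assumption.
smx_units = 15
cores_per_smx = 192                  # Kepler SMX: 192 single-precision CUDA cores
print(smx_units * cores_per_smx)     # 2880, matching the figure in the article

bus_width_bits = 384
assumed_data_rate_gbps = 6.0         # hypothetical effective GDDR5 speed
print(bus_width_bits / 8 * assumed_data_rate_gbps)  # 288.0 GB/s under that assumption
```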
Source:
SweClockers
203 Comments on NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"
See Crap Daddy's chart prior in this thread: 85% of a 690's performance would hit 130% on that chart, or 30% faster than the 7970, simples :cool:
Also, I may have been misled, but I thought Nvidia had put more double-precision units in the GK110. Are these counted amongst the 2,880 shaders despite them being specialist units? Not trying to spark a row, I'm just interested and asking.
In actual FPS (not that that's what we should care about), the 780 will vary from 100% to 170% of the performance of a 7970 depending on other bottlenecks and the demands of the game. It may exceed 170% in titles that favour its architecture (i.e. TWIMTBP titles).
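To make the percentage juggling in these posts explicit, here is a tiny sketch. The GTX 690 chart value is a placeholder picked so the quoted 85% / 130% / 30% figures line up; none of these are measured results:

```python
# Thread's own guesswork, normalised to an HD 7970 baseline of 100%.
hd7970 = 100.0
gtx690 = 153.0                      # placeholder chart position for a GTX 690

rumoured_card = 0.85 * gtx690       # "85% of a 690's performance"
print(rumoured_card)                # ~130 on the same chart
print(rumoured_card / hd7970 - 1)   # ~0.30, i.e. roughly 30% faster than a 7970
```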
And all your waffling can be restated from an AMD-biased stance, i.e. some games favour AMD gfx, soooo that's why we read TPU reviews. That chart's sound in my eyes bro, Wiz did it....
I'm out anyway dude, your opinion's all good, I am an optimist too.
That's 66% right there.
A year later they followed up with the 8800 GT, which offered 90% of the performance for half the price.
Just praying that lightning can, and does, strike twice. :respect:
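For what it's worth, 90% of the performance at half the price works out to 1.8x the performance per dollar. A throwaway sketch with placeholder prices (not the actual launch prices):

```python
# Value comparison behind the 8800 GT example; prices are placeholders.
old_perf, old_price = 1.00, 600.0   # hypothetical flagship
new_perf, new_price = 0.90, 300.0   # "90% of the performance for half the price"

print((new_perf / new_price) / (old_perf / old_price))  # 1.8x performance per dollar
```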
Having read this thread start to finish I have to wonder. I live and die by one rule: if it ain't broke, then don't fix it. I won't buy a 780 Ti or equivalent unless a game comes out that my PC can't play. Some of you spend all that money on having the latest and greatest, yet all you seem to do is debate on forums and benchmark all day, a tad overkill don't ya think, all that money just for bragging rights. Well, more money than sense I think. I can play Far Cry 3 on my 560 Ti 448 on ultra settings, perfectly playable too with 30 to 50 fps, and can play BF3 on high settings and get 50 to 80 FPS, though this game and only this game in my collection pushes my 1.25 GB of memory to its limit, hence why I play on high settings and not ultra. No, this don't apply to everyone, just the die-hards, the ones with 680 SLI setups getting all excited cos they can't wait to get their hands on a new 780 even though they don't really need a 780 at this moment in time. Well, don't moan about the price, cos it's people like you that make the price so high in the first place. Nvidia and AMD know this, and if I was them I'd be screwing you for all the money I could, cos there is a market for it. And really I should not moan, cos as a result I get your old 680s for a lot less than what you paid for them, and that's the only number I'm interested in, £££ ching ching ;)
Now for the rest of us that have more sense than money, that don't feel sad when our FPS drop below 100 and are quite happy playing games rather than worrying the toss about how much faster an unreleased GPU is going to be or how high our 3DMark score is going to be, I'd just like to say hello to you all. I'm Taz, and if you'd like to shoot the shit outta me on BF3 then look out for T7BSY, but if you're a sad fk that uses an aimbot and other hacks then GET A LIFE!
Anyway, in short, I've got better things to do than argue the toss over something that is yet to be released. Bytales, interesting read if nothing else ;). QUBIT, if you think that 200 FPS will look smoother than 100 you're just kidding yourself to justify your outlay. Most users only have 60 Hz screens and the eye can't see beyond that anyway, and that's a fact, cos if that was the case then we would only ever see half the picture when watching normal telly. If you have a 120 Hz screen you might have a point though, and even then most would not be able to tell the difference ;)
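Just to put numbers on the refresh-rate argument above (this only illustrates the timing arithmetic, it doesn't settle what anyone can actually perceive), a small sketch assuming a plain 60 Hz panel:

```python
# Frame-time arithmetic: a 60 Hz panel refreshes roughly every 16.7 ms,
# so frames rendered between refreshes are never actually displayed.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (60, 100, 200):
    print(fps, "fps ->", round(frame_time_ms(fps), 1), "ms per frame")

refresh_ms = frame_time_ms(60)                     # ~16.7 ms between refreshes
print(round(refresh_ms / frame_time_ms(200), 1))   # ~3.3 frames rendered per refresh at 200 fps
```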
Judging by your very first post on TPU, it's clear that you just like a good flamebait rant, rather than presenting a coherent argument.
Therefore, I won't waste my time explaining to you where you've got it all so wrong about frame rates. I can't believe you replied all that to what was just a humorous remark to another member, lol.
So are you saying that all those that have 680 SLI setups are now insecure because the 780 Ti is due out? I think not. I find it hard to believe that there is a game out there that would see off two 680s in SLI; even with a 3-monitor setup a single 680/7970 can play all the games out there.
And last but not least, you reckon I will notice a drop from 200 FPS to 50, but not 100 FPS to 50, lol. I would not notice either; the brain can't count what the eye can't see. That's not to say you might not feel it, but that won't apply to many. Ask any console user, they don't worry about FPS and they still enjoy the games they play just as much as you or I.
Only thing I would notice is stuttering when frames drop to 5 to 30 fps; Fraps has nothing to do with what we are talking about. Anything above 60 FPS is the sweet spot, but again, it's just my opinion, not saying either of you are right or wrong, just saying it does not apply to most of us.
Edit: sorry, I did misread what you were trying to say, so to a point I agree, but here is an example. I can play Far Cry 3 on ultra settings, no AA and SSAO, at 1920 x 1200 and get between 25 to 50 FPS, and this is smooth as; VRAM has not gone over 1 GB. Now on BF3, even on high settings I get 50 to 80 fps, but still, in multiplayer I can run out of VRAM, which can cause me lag or a drop in frames that causes stuttering. But that's not down to GPU power, that's due to lack of VRAM, which hits FPS and causes it to become not so smooth, and that's the only reason you would notice. To put it another way, two overclocked 560 Ti GPUs score 9500 in 3DMark11, so as good as a single 680 or 7970, but even though FPS are in the hundreds playing BF3 it's still not smooth; you get lag and frame drops due to the lack of VRAM. So IMO it's not all about FPS equalling smooth gameplay, it's all about getting the right balance to limit any bottlenecks that you will hit at some point!
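The "high average FPS but still not smooth" point above is easy to illustrate. A quick sketch with made-up frame times (not measurements from anyone's system):

```python
# Average FPS can look fine while occasional long frames (e.g. a VRAM
# spill) cause visible stutter. These frame times are invented numbers.
frame_times_ms = [10] * 18 + [100, 100]          # mostly 100 fps, two 100 ms hitches

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
print(round(avg_fps, 1), "fps average")          # ~52.6 fps: looks fine on paper
print(max(frame_times_ms), "ms worst frame")     # a 100 ms spike = momentary 10 fps
```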
I think something like 850-950 MHz with 3 GB seems more plausible, and I hope it's a real GK110 with all the bells and whistles, not just another GK104 with more cores on it...
Also, $800 is a bit too much for me, but then again it's some random number xD
I would go for the GTX 770, I mean Titan. Hopefully for ~$500 max.
P.S. Yes, I did misread that about the frame drops from 200 ;) sorry, my bad.