Monday, January 21st 2013
NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"
2013 started off on a rather dull note for the PC graphics industry: NVIDIA launched its game console platform "Project SHIELD," while AMD rebranded its eons-old GPUs as the Radeon HD 8000M series. Apparently, that could all change in late February with the arrival of a new high-end single-GPU graphics card based on NVIDIA's GK110 silicon, the same big chip that goes into the company's Tesla K20 compute accelerator.
NVIDIA may have drawn some flak for stretching its "GTX" brand extension too far into the mainstream and entry-level segments, and wants its GK110-based card to stand out. It is reported that NVIDIA will carve out a new brand extension, the GeForce Titan. Incidentally, the current fastest supercomputer in the world bears that name (Titan, a Cray system located at Oak Ridge National Laboratory). The GK110 silicon physically packs 15 SMX units, totaling 2,880 CUDA cores, and features a 384-bit wide GDDR5 memory interface.
Source:
SweClockers
203 Comments on NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"
When you watch a film, does it feel "lifelike," or just "suitable to be watched without any issue"?
Films are played back at 24fps as standard; only recently did The Hobbit come out in 48fps - a format you can only watch at select theatres that have the ability to play 48fps films.
And to me, movies look just fine as they are. That's why I always "lol" at people claiming they "need" 50+ fps for good gameplay.
I don't think there is a need to be abusive towards other members, and TAZ, you mock people buying high-end cards yet have just bought a second-hand 680(?) for £265 plus a 560 Ti 448 - you conveniently missed that second part out.
Mocking? I was going to get a new AMD 7950 for £250, mostly because I need more than 1.25GB of RAM, not because I want or need more FPS, and because I intend to buy Crysis 3. So what you call mocking, I call making a point: if I had one 680 now, never mind two, I'd not feel the need to sell up to buy the 780. If that's being abusive or mocking then best ban me now; otherwise I'm entitled to think you're slightly mad for parting with £800, or whatever they will be priced at, when you had two 680s already - then I'm guilty as charged. As for QUBIT, I pointed out it was just my opinion and not to take it personally, but it sounds like you can't handle that. Yet it's ok for some to argue the toss over something that is yet to be released - maybe it's just me that's mad ;)
As for missing out the £265 plus a 560 Ti 448, maybe you should read the post where I did not forget to mention that. And if you'd be so kind as to read every post start to finish, you would see that this is not about noticing a difference below 60 FPS, but about whether the difference between 100 and 200 FPS is noticeable.
A game session filmed and then put on YouTube usually hides stutter, low FPS and frame drops pretty well. As for 30 vs. 60 fps, take a look at this - boallen.com/fps-compare.html
I'm sure if I made a guy play games at a rock-solid 60 FPS for a whole day (even a few hours would be enough) and then made him play at 25 to 30 FPS, he would notice the gap in a second. A friend of mine already did that while we were playing some racing games. The higher you go in FPS, the more lifelike the experience gets.
If my game is not running at 60FPS or more, I notice a very negative impact on my gaming experience. It jolts you out of the world that you're trying to be immersed in.
Also, realize that low framerates affect not only the visual experience of a game; they can also have a negative impact on the control you have in the game. Mouse and keyboard inputs can feel slow or unresponsive when your framerate dips.
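For anyone curious why that happens: in a typical game loop, input is only sampled once per frame, so a slow frame delays the moment the game even sees your mouse or keyboard. Here's a rough, hypothetical C++ sketch of that idea - the function names are made up for illustration and aren't from any real engine:

#include <chrono>
#include <iostream>
#include <thread>

using Clock = std::chrono::steady_clock;

// Stand-ins for real engine calls; the names are made up for illustration.
bool pollInput() { return false; }        // read keyboard/mouse state
void updateSimulation(double /*dt*/) {}   // advance the game state by dt seconds
void renderFrame(int cost_ms) {           // pretend rendering takes cost_ms
    std::this_thread::sleep_for(std::chrono::milliseconds(cost_ms));
}

int main() {
    auto previous = Clock::now();
    for (int frame = 0; frame < 60; ++frame) {
        auto frameStart = Clock::now();
        double dt = std::chrono::duration<double>(frameStart - previous).count();
        previous = frameStart;

        pollInput();            // input is only observed here, once per frame
        updateSimulation(dt);   // a slow frame means a large dt and a late reaction
        renderFrame(33);        // ~33 ms per frame is roughly 30 FPS
    }
    std::cout << "At ~33 ms per frame, a key press can wait a full frame "
                 "before the game even notices it.\n";
}

So at 30 FPS your input can sit unread for up to ~33 ms before the game reacts, on top of render and display latency - which is why controls feel mushy when the framerate dips.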
EDIT: I also want to add, if you watch a film before they add all the post processing to make the film appear smooth, you'll see what I'm speaking of. It will look like you're watching a colorful flipbook, for lack of a better term.
I'm considering writing a forum post or editorial about frame rates (from low to very high), judder, motion blur etc., as there are a lot of misconceptions about this and I want people to understand the subject. :)
I think that NVIDIA and AMD and game makers are all in cahoots with each other in order to get our money from us ;)
I want to stress that I'm not pointing fingers here at anyone in particular, it's just what I've generally experienced when writing articles like this.
I've got so much on nowadays, that I don't know if I'll get round to it any time soon, so best not to hold your breath! :)
As an example, without a cap at 59 FPS, BF3 is not that smooth even though it hits 70 or 90 FPS. Once I set the limit in place, all is good. At least on the PC, if a setting gives you trouble it can be left out, and later, with new hardware, it can be turned back on, giving the game a new life. On a console, it would be left out from the start.
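For context, a frame cap like that usually just pads short frames out to a fixed target so pacing stays even, which can feel smoother than an uncapped but uneven 70-90 FPS. Below is a generic sleep-based 59 FPS limiter as an illustration only - it's a sketch of the general technique, not how BF3's own limiter is actually implemented:

#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

// Pads the current frame out to the target frame time so pacing stays even.
void limitFrameRate(Clock::time_point frameStart, double targetFps) {
    const std::chrono::duration<double> targetFrameTime(1.0 / targetFps);
    auto elapsed = Clock::now() - frameStart;
    if (elapsed < targetFrameTime)
        std::this_thread::sleep_for(targetFrameTime - elapsed);
}

int main() {
    for (int frame = 0; frame < 590; ++frame) {   // roughly 10 seconds at 59 FPS
        auto frameStart = Clock::now();

        // ... simulate and render the frame here ...

        limitFrameRate(frameStart, 59.0);         // short frames get padded out
    }
}

In practice people get the same effect from driver or third-party limiters rather than the game itself; the point is simply that consistent frame times often matter more than the raw FPS number.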
I game at 120Hz with LightBoost on and it's awesome. :rockout: Yes, the difference is highly visible.