Wednesday, October 17th 2018

NVIDIA Readies TU104-based GeForce RTX 2070 Ti

Update: GIGABYTE has since come out and dismissed this as a typo on their part, which is disappointing, if not unexpected, considering there is no real reason for NVIDIA to launch a new SKU into a market virtually devoid of competition. Additional tiers of graphics cards would only give consumers more options, and why offer an option that might draw customers away from more expensive cards?

NVIDIA designed the $500 GeForce RTX 2070 around its third-largest "Turing" silicon, the TU106. Reviews posted late Tuesday summarize the RTX 2070 as offering roughly the same performance as the previous-generation GTX 1080, at the same price. Generation-to-generation, the RTX 2070 offers roughly 30% more performance than the GTX 1070, but at a 30% higher price, in stark contrast to the GTX 1070, which offered 65% more performance than the GTX 970 for just 25% more money. NVIDIA's RTX varnish is still nowhere in sight. NVIDIA is thus not on solid ground with the RTX 2070, and there is a vast price gap between it and the $800 RTX 2080. GIGABYTE all but confirmed the existence of an SKU in between.
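The generational value comparison above boils down to simple performance-per-dollar arithmetic. A minimal sketch, using only the approximate relative figures quoted in the article (actual percentages vary by review and by game):

```python
# Rough value comparison using the relative figures quoted above.
# Each card is expressed as (performance, price) relative to its predecessor.
def perf_per_dollar(relative_perf, relative_price):
    """Performance-per-dollar gain versus the previous-generation card."""
    return relative_perf / relative_price

# GTX 1070 -> RTX 2070: ~30% more performance at ~30% higher price
rtx_2070_value = perf_per_dollar(1.30, 1.30)  # = 1.00, no value improvement
# GTX 970 -> GTX 1070: ~65% more performance at ~25% higher price
gtx_1070_value = perf_per_dollar(1.65, 1.25)  # = 1.32, a clear value improvement

print(round(rtx_2070_value, 2), round(gtx_1070_value, 2))
```

In other words, the RTX 2070 delivers roughly the same performance per dollar as the card it replaces, whereas the GTX 1070 improved on the GTX 970 by about a third.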
Called the GeForce RTX 2070 Ti, this SKU will likely be carved out of the TU104 silicon, since NVIDIA has already maxed out the TU106 with the RTX 2070. The TU104 features 48 streaming multiprocessors (SMs), compared to the 36 on the TU106. NVIDIA has the opportunity to carve out the RTX 2070 Ti by cutting the TU104's SM count down to somewhere between those of the RTX 2070 and the RTX 2080, while leaving the memory subsystem untouched. With that, it could come up with an SKU that is sufficiently faster than the GTX 1070 Ti and GTX 1080 from the previous generation, and rake in sales around the $600-650 mark, or roughly half the price of the RTX 2080 Ti.
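The SM math above is easy to sketch. Each Turing SM carries 64 FP32 CUDA cores (per NVIDIA's Turing specifications); the RTX 2070's 36 SMs and the RTX 2080's 46 enabled SMs (of the TU104's 48) are known, while the midpoint configuration below is purely speculative:

```python
# Speculative RTX 2070 Ti configuration sketch.
# CORES_PER_SM is the known Turing figure; the midpoint SKU is a guess.
CORES_PER_SM = 64

rtx_2070_sms = 36  # full TU106
rtx_2080_sms = 46  # TU104 with 2 of its 48 SMs disabled

# A cut-down TU104 landing halfway between the two existing SKUs:
midpoint_sms = (rtx_2070_sms + rtx_2080_sms) // 2
print(midpoint_sms, midpoint_sms * CORES_PER_SM)  # 41 SMs, 2624 CUDA cores
```

A hypothetical ~41-SM part with around 2,600 CUDA cores would slot neatly between the RTX 2070's 2,304 and the RTX 2080's 2,944.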
Source: Reddit

29 Comments on NVIDIA Readies TU104-based GeForce RTX 2070 Ti

#26
Th3pwn3r
I bet a lot of people thought anti-aliasing was a gimmick too, back in the day before we had the internet. Ray tracing seems like one of those things you didn't know you needed or wanted until you've actually seen it first-hand. Still, the prices need to get a lot better.
#27
deu
Tsukiyomi91: Getting a "fresh" Pascal GPU right now isn't going to do you justice when developers start implementing ray tracing in their games. Just no. What happens when games and 3D rendering programs only use ray tracing as their go-to rendering choice? Your Pascal or Volta card may just become a relic of silicon. As much as I hate what NVIDIA is doing with their new product lineup, one thing I'm certain of is that next year Pascal cards will flood the used GPU market, because everyone will want to ride the ray-tracing bandwagon. So it's either future-proof now or regret it later.
Yeah, let's get a card supporting a technology it can't run in practice! :D It's like buying a GT 1030 for gaming because it supports 4K :D The 2000-series is by no means future-proof; it's a mix of traditional shading and RT. Trust me, no studio will say, "Well, new technology is out, let's drop all previously supported technology," and accept 1/10 of their sales because hardly any users can play their games. In the next two years you will see studios begin to support RT. In 5-10 years you might see a straight-up transition to pure RT. To the people who believe NVIDIA's marketing machine it might be the shizzle, but to the rest of us, properly baked lighting and "trickery" still deliver the backbone of "good" lighting. RT right now is in a weird 'uncanny valley': it works, but in practice it looks gimmicky at best. I am 100% sure RT is the future of lighting in 3D environments, but in gaming that is a few years out. Between a 2080 Ti running 1080p@30-60 fps with RT and a 1080 Ti running 1440p@80+ fps at ultra settings, I know which I'm going for :)

EDIT: TechPowerUp has an article about RT here:

www.techpowerup.com/248649/remedy-shows-the-preliminary-cost-of-nvidia-rtx-ray-tracing-effects-in-performance
#28
Vayra86
lexluthermiester: We're looking at a minimum of 6 or 7 years before that starts happening. RTRT is just way too new. Exciting and a game changer, yes, just too new.
Amen to this. I think that is a very realistic estimate.

As for buying Turing - if you need to replace something older than Maxwell top end, yes. Anyone else: probably not a good idea.
#29
lexluthermiester
Vayra86: As for buying Turing - if you need to replace something older than Maxwell top end, yes. Anyone else: probably not a good idea.
I have. My 2080 arrived a few days ago and outperforms my 1080 by leaps and bounds. Was it worth the price? Ultimately, yeah. I sold my 1080 to a client who's been wanting an upgrade, which made the price much more agreeable.