Wednesday, November 20th 2019

NVIDIA Readying GeForce RTX 2080 Ti SUPER After All?

NVIDIA could launch a "GeForce RTX 2080 Ti Super" after all, if a tweet from kopite7kimi, an enthusiast with a fairly high hit-rate with NVIDIA rumors, is to be believed. The purported SKU could be faster than the RTX 2080 Ti and yet still be differentiated from the TITAN RTX. For starters, NVIDIA could enable all 4,608 CUDA cores, 576 tensor cores, and 72 RT cores, along with 288 TMUs and 96 ROPs. Compared to the current RTX 2080 Ti, the Super could also get faster 16 Gbps GDDR6 memory.

It's possible that NVIDIA won't change the 352-bit memory bus width or the 11 GB memory amount, as those would be the only things stopping the card from cannibalizing the TITAN RTX, which has the chip's full 384-bit memory bus width and 24 GB of memory. Interestingly, at 16 Gbps on a 352-bit memory bus, the RTX 2080 Ti Super would have 704 GB/s of memory bandwidth, higher than the 672 GB/s of the TITAN RTX with its 14 Gbps memory clock. These design choices would ensure NVIDIA has a sufficiently faster product than the RTX 2080 Ti without an increase in BOM, provided it has enough perfectly-functional "TU102" inventory to go around. There's no word on availability, although WCCFTech predicts a CES 2020 unveiling.
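For reference, GDDR6 bandwidth is simply the per-pin data rate multiplied by the bus width; a quick, purely illustrative sketch of the arithmetic behind the figures above:

```python
def memory_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    # Bandwidth (GB/s) = per-pin data rate (Gbit/s) x bus width (bits) / 8 bits per byte
    return data_rate_gbps * bus_width_bits / 8

# Rumored RTX 2080 Ti Super: 16 Gbps GDDR6 on a 352-bit bus
print(memory_bandwidth_gb_s(16, 352))  # 704.0 GB/s

# TITAN RTX: 14 Gbps GDDR6 on the full 384-bit bus
print(memory_bandwidth_gb_s(14, 384))  # 672.0 GB/s
```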
Sources: kopite7kimi (Twitter), WCCFTech

139 Comments on NVIDIA Readying GeForce RTX 2080 Ti SUPER After All?

#26
notb
lynx29: yep Nvidia will be forced to respond to navi 2 with Ampere this summer or spring. buying this would be dumb, just another case of Nvidia milking people with too much money, but hey if they are dumb enough to keep buying it more power to nvidia. personally I'd rather be poor and have my brains.
Milking?
They're providing the fastest gaming GPU at any given moment.

Why is that "milking", and what does it have to do with Ampere, which is in the distant future?
Posted on Reply
#27
xkm1948
lynx29: because it isn't full redesign... this is only half new architecture... Navi 2 is a full new design... there will be performance gains.
oh gawd. How do you know? Secret nvidia GPU engineer?
Posted on Reply
#29
skizzo
Ravenmaster: This card will be an absolute waste of time if it doesn't include an HDMI 2.1 socket
Exactly what I am waiting for. I am currently running an RX 5700 XT and will not upgrade until a next-gen GPU arrives with HDMI 2.1 ports, for the ability to do 4K 120Hz.
fynxer: wccftech.com/game-ready-driver-441-08-out-now-adds-reshade-filters-hdmi-2-1-vrr-image-sharpening-g-sync-with-ultra-low-latency-rendering/
:banghead:

HDMI 2.1 FEATURES are not the same as having the bandwidth that 4K 120Hz+ needs. Many GPUs and TVs have HDMI 2.1 features; that doesn't mean they can do high resolutions at high refresh rates over HDMI 2.0 connections. I can't believe how many people wrongly conflate these things.
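(A rough back-of-the-envelope check of that point, assuming 8-bit RGB and ignoring blanking overhead, which only raises the real requirement; the HDMI throughput figures are the spec-level effective rates after encoding overhead:)

```python
def raw_pixel_rate_gbps(width: int, height: int, refresh_hz: int, bits_per_pixel: int = 24) -> float:
    # Raw (unblanked) pixel data rate in Gbit/s; a real link needs extra headroom for blanking
    return width * height * refresh_hz * bits_per_pixel / 1e9

needed = raw_pixel_rate_gbps(3840, 2160, 120)  # ~23.9 Gbit/s before any overhead
hdmi20 = 14.4  # HDMI 2.0: 18 Gbit/s TMDS, ~14.4 Gbit/s effective after 8b/10b encoding
hdmi21 = 42.7  # HDMI 2.1: 48 Gbit/s FRL, ~42.7 Gbit/s effective after 16b/18b encoding

print(f"4K 120Hz needs > {needed:.1f} Gbit/s; HDMI 2.0 carries {hdmi20}, HDMI 2.1 carries {hdmi21}")
```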
Posted on Reply
#30
bug
lynx29: because it isn't full redesign... this is only half new architecture... Navi 2 is a full new design... there will be performance gains.
If "a full new design" == "tweaked RDNA", then yes.
I mean, sure, Zen 2 is significantly better than Zen, so there's no reason RDNA2 won't be significantly better than RDNA. But neither is "a full new design"; let's choose our words more carefully, shall we?
Posted on Reply
#31
Space Lynx
Astronaut
bugIf "a full new design" == "tweaked RDNA", then yes.
I mean, sure Zen2 is significantly better than Zen, so there's no reason RDNA2 won't be significantly better than RDNA. But neither is "a full new design", let's choose our words more carefully, shall we?
I was only quoting what I remember reading from Lisa Su. Pretty sure she said full RDNA2 is a different beast entirely from Navi 1. But sure thing.
Posted on Reply
#32
EarthDog
lynx29: I was only quoting what I remember reading from Lisa Su. Pretty sure she said full RDNA2 is a different beast entirely from Navi 1. But sure thing.
We can be wrong... but you should provide links to support your assertion(s) when questioned. It's how forums work.
Posted on Reply
#33
Flanker
dj-electric: I also think that "NVIDIA will be forced" is something that hasn't really happened since... a long time ago.
If I remember correctly, it was the release of the HD 4850 and HD 4870, wasn't it? They brutalized the GTX 260 and GTX 280. That was when I was about to finish high school lol
Posted on Reply
#34
Ravenmaster
skizzo: Exactly what I am waiting for. I am currently running an RX 5700 XT and will not upgrade until a next-gen GPU arrives with HDMI 2.1 ports, for the ability to do 4K 120Hz.

:banghead:

HDMI 2.1 FEATURES are not the same as having the bandwidth that 4K 120Hz+ needs. Many GPUs and TVs have HDMI 2.1 features; that doesn't mean they can do high resolutions at high refresh rates over HDMI 2.0 connections. I can't believe how many people wrongly conflate these things.
Exactly. The G-Sync compatibility for my LG TV is really cool and works impeccably well, but at the same time I'm stuck choosing between 4K 60Hz and 1440p 120Hz, because the HDMI 2.0 port on the GPU restricts the bandwidth needed for 4K 120Hz. My TV is ready for 4K 120Hz + HDR + G-Sync, but my 2080 Ti is holding it back.
Posted on Reply
#35
Space Lynx
Astronaut
EarthDog: We can be wrong... but you should provide links to support your assertion(s) when questioned. It's how forums work.
It was on a PowerPoint slide during her presentation, from what I remember, but I am not going through that whole video again to find it. I couldn't care less if you agree with me or not; that is also how forums work. later gators
Posted on Reply
#36
EarthDog
lynx29: It was on a PowerPoint slide during her presentation, from what I remember, but I am not going through that whole video again to find it. I couldn't care less if you agree with me or not; that is also how forums work. later gators
What, your post gets cast aside because you can't support it? Indeed, that is how it will work.
Posted on Reply
#37
Space Lynx
Astronaut
EarthDog: What, your post gets cast aside because you can't support it? Indeed, that is how it will work.
for you and some others, not for everyone. /shrug

also, I don't really care... which is also how forums work.
Posted on Reply
#38
tomc100
If they can reduce the energy consumption for multi-monitor setups, then I'd get it, since I rarely play games and use my computer mostly for multi-tasking with a TV and monitor: TV for YouTube, monitor for surfing or email.
Posted on Reply
#39
bug
EarthDog: We can be wrong... but you should provide links to support your assertion(s) when questioned. It's how forums work.
You don't need a link for that.
No CEO ever took the stage to tell us: our upcoming product will be a tweaked current product on a slightly better production node. Their presentations always include "radically new design", "unprecedented performance", "designed for the next gen" and other terms that will win you a bullshit bingo in no time. That makes me pretty sure Ms. Su has said the same thing about RDNA2. People taking said statements at face value, that's another story.
Posted on Reply
#40
EarthDog
bug: You don't need a link for that.
No CEO ever took the stage to tell us: our upcoming product will be a tweaked current product on a slightly better production node. Their presentations always include "radically new design", "unprecedented performance", "designed for the next gen" and other terms that will win you a bullshit bingo in no time. That makes me pretty sure Ms. Su has said the same thing about RDNA2. People taking said statements at face value, that's another story.
That's a good point... in particular your last sentence (which is what I was going to address next, once a link was provided).

Some took it hook, line, and sinker. :)
Posted on Reply
#41
kings
lynx29: because Lisa Su said Navi 2 is going to be an Nvidia giant killer. so that for one...
Yeah, like they said Fury X was going to be the overclocker's dream. Or the "Poor Volta" thing...

AMD always has something fantastic up its sleeve; the problem is that the sleeve is so long that we wait, we wait, we wait...
Posted on Reply
#42
xkm1948
kings: Yeah, like they said Fury X was going to be the overclocker's dream. Or the "Poor Volta" thing...

AMD always has something fantastic up its sleeve; the problem is that the sleeve is so long that we wait, we wait, we wait...
They have great presentation slides, I'll give you that.
Posted on Reply
#43
Space Lynx
Astronaut
kings: Yeah, like they said Fury X was going to be the overclocker's dream. Or the "Poor Volta" thing...

AMD always has something fantastic up its sleeve; the problem is that the sleeve is so long that we wait, we wait, we wait...
Really enjoying my Ryzen 3600 with DDR4-3800 RAM at a 1:1 Infinity Fabric ratio at the moment. Hard to believe I am getting better scores than even a 9700K, and the RAM, CPU, and mobo together cost me less than a single 9700K to boot. Turns out Ryzen loves RAM; if you max out the Infinity Fabric, it screams.

But you're right, we wait and wait and wait. I'm glad I waited.
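(For context on the 1:1 bit: on Zen 2 the memory clock is half the DDR4 transfer rate, and 1:1 mode runs the Infinity Fabric clock at that same speed. A minimal sketch of the arithmetic:)

```python
def fclk_for_1to1_mhz(ddr4_rate_mt_s: int) -> float:
    # DDR transfers twice per clock, so MCLK = rate / 2; in 1:1 mode FCLK = MCLK
    return ddr4_rate_mt_s / 2

print(fclk_for_1to1_mhz(3800))  # 1900.0 MHz FCLK/MCLK for DDR4-3800 at 1:1
```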
Posted on Reply
#44
HD64G
Good news, as prices will go down. Since all of nVidia's Super GPUs launched days before an equivalent part from AMD, I suppose the green team knows something about a bigger-than-Navi-10 chip able to compete with the 2080 Ti.
Posted on Reply
#45
Space Lynx
Astronaut
HD64G: Good news, as prices will go down. Since all of nVidia's Super GPUs launched days before an equivalent part from AMD, I suppose the green team knows something about a bigger-than-Navi-10 chip able to compete with the 2080 Ti.
Aye, it's great there is competition again. I still can't believe how cheaply I built my new rig... and how powerful it is relative to what I paid. Quite amazing.
Posted on Reply
#46
Space Lynx
Astronaut
RH92: No, no secret engineer, he has the special power called "blind fanboy sense" :roll:
Funny, I've owned Intel and Nvidia for the last 6 years... but sure, fanboy, sure...
Posted on Reply
#48
Th3pwn3r
notb: Milking?
They're providing the fastest gaming GPU at any given moment.

Why is that "milking", and what does it have to do with Ampere, which is in the distant future?
In a way they are, and in a way they're not. Nvidia doesn't have to create a 2080 Ti Super since they're the top dog by far, but they're still going to give people a slight bump in performance. It's basically filler until next gen.
Posted on Reply
#49
notb
lynx29: Funny, I've owned Intel and Nvidia for the last 6 years... but sure, fanboy, sure...
You can be a fanboy and remain a rational buyer. So you bought the better GPU. That's it.
Most AMD fanboys used Intel CPUs for the better part of the last decade, but activated once Zen came out.
Being a fan of AMD (or just anti-Intel) is one thing, but having a usable PC is important as well. :)

Anyway, since you've already admitted that you don't really care what we think about your posts, there's likely no reason for you to tell the truth in them. You don't care. You might as well be making everything up. :)
Th3pwn3r: In a way they are, and in a way they're not. Nvidia doesn't have to create a 2080 Ti Super since they're the top dog by far, but they're still going to give people a slight bump in performance. It's basically filler until next gen.
So, they could basically go on vacation and keep selling 2080Ti until AMD announces something that beats it.
Instead they launch a new card that offers even more performance.

How exactly is that bad in any way? Shouldn't we praise this?
Posted on Reply
#50
Space Lynx
Astronaut
notb: You can be a fanboy and remain a rational buyer. So you bought the better GPU. That's it.
Most AMD fanboys used Intel CPUs for the better part of the last decade, but activated once Zen came out.
Being a fan of AMD (or just anti-Intel) is one thing, but having a usable PC is important as well. :)

Anyway, since you've already admitted that you don't really care what we think about your posts, there's likely no reason for you to tell the truth in them. You don't care. You might as well be making everything up. :)

So, they could basically go on vacation and keep selling 2080Ti until AMD announces something that beats it.
Instead they launch a new card that offers even more performance.

How exactly is that bad in any way? Shouldn't we praise this?
I've alternated between Intel and AMD since the original Pentium came out in the mid-'90s. Not a big deal to me, just whoever offers the best cost for performance.

Neat, everything I type is made up. Cool line of logic you have there. Highly intelligent.
Posted on Reply
