
NVIDIA RTX SUPER Lineup Detailed, Pricing Outed

I'm just curious, how do you figure that a 50% larger chip at similar clocks would only consume 11% more power?
He doesn't figure anything, but since he set out to paint a rosy picture of Navi, there simply wasn't room for a bigger number there.
 
I'm just curious, how do you figure that a 50% larger chip at similar clocks would only consume 11% more power?
How much would it consume in your opinion?
 
How much would it consume in your opinion?
Let's see, RTX 2080 Ti's die size is 1.7x the size of RTX 2060's. That pushes the TDP from 160W to 250W (1.6x).
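For reference, a quick check of those ratios against the public Turing die sizes (TU106 at 445mm² for the 2060, TU102 at 754mm² for the 2080 Ti) and board TDPs:

```python
# Sanity check of the die-size and TDP ratios quoted above.
tu106_area, tu106_tdp = 445, 160  # RTX 2060 (TU106): mm^2, W
tu102_area, tu102_tdp = 754, 250  # RTX 2080 Ti (TU102): mm^2, W

print(f"die area ratio: {tu102_area / tu106_area:.2f}x")  # ~1.69x
print(f"TDP ratio:      {tu102_tdp / tu106_tdp:.2f}x")    # ~1.56x
```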
 
Well, there is zero reason for Nvidia to be selling any cheaper, as even the "lowly" 2060 Super will smash the best Navi will have to offer, and when properly OCed will rival even the Crapeon 7, so... take your complaints to AMD for not being able to compete at all for the last 5 years.
 
Well, there is zero reason for Nvidia to be selling any cheaper, as even the "lowly" 2060 Super will smash the best Navi will have to offer, and when properly OCed will rival even the Crapeon 7, so... take your complaints to AMD for not being able to compete at all for the last 5 years.
Yeah, it's got to the point where I have more hope in Intel putting pressure on Nvidia for the next few years.
 
How much would it consume in your opinion?
It would just be a guess, but considering the RX 5700 XT consumes 225W and the slightly higher-clocked anniversary edition consumes 235W, it's a good indicator that AMD have pushed the chip beyond the "sweet spot". So unless a ~50% larger chip runs at much lower clocks, I would assume anywhere from ~275-325W, especially if they put a 384-bit memory controller on it.
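A minimal sketch of that guess, assuming naive linear power scaling with die size plus a wider memory interface; every factor below is an assumption for illustration, not measured data:

```python
# Back-of-the-envelope for a hypothetical ~50% larger Navi at similar clocks.
# Every factor here is an assumption for illustration, not measured data.
rx_5700xt_board = 225.0  # W, RX 5700 XT reference board power
die_scale = 1.5          # hypothetical ~50% larger die
gpu_share = 0.80         # assumed fraction of board power drawn by the GPU itself
bus_scale = 384 / 256    # 384-bit vs. the 5700 XT's 256-bit memory bus

gpu_power = rx_5700xt_board * gpu_share * die_scale
mem_power = rx_5700xt_board * (1 - gpu_share) * bus_scale
print(f"estimated board power: {gpu_power + mem_power:.0f} W")  # ~338 W
```

That lands at the top of the quoted ~275-325W range; running the bigger chip at lower clocks, closer to the sweet spot, would pull the number down.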
 
Yeah, it's got to the point where I have more hope in Intel putting pressure on Nvidia for the next few years.
Yeah, I mean they aren't about to close in on Nvidia anytime soon, but I'm willing to bet they will overtake puny RTG by 2021, after which point RTG will probably be relegated to APUs only, as history shows that no more than 2 major players can survive. And I suspect it won't be much better in the CPU space by then, as Intel will probably start rolling out their 7nm chips that will likely obliterate AMD's 5nm and even 3nm... Team red fanbois should cherish these days as they won't last...
 
Yeah, I mean they aren't about to close in on Nvidia anytime soon, but I'm willing to bet they will overtake puny RTG by 2021, after which point RTG will probably be relegated to APUs only, as history shows that no more than 2 major players can survive. And I suspect it won't be much better in the CPU space by then, as Intel will probably start rolling out their 7nm chips that will likely obliterate AMD's 5nm and even 3nm... Team red fanbois should cherish these days as they won't last...
I wouldn't make those predictions; there are too many variables at play here.

It would just be a guess, but considering the RX 5700 XT consumes 225W and the slightly higher-clocked anniversary edition consumes 235W, it's a good indicator that AMD have pushed the chip beyond the "sweet spot". So unless a ~50% larger chip runs at much lower clocks, I would assume anywhere from ~275-325W, especially if they put a 384-bit memory controller on it.
It's possible the anniversary edition just has its TDP set higher to allow for more headroom; it's not necessarily an indicator that the XT is past the sweet spot.
The real problem we know of is price. Even if Navi could be made into bigger chips, we'd still be in the uninteresting $600+ territory.
 
Let's see, RTX 2080 Ti's die size is 1.7x the size of RTX 2060's. That pushes the TDP from 160W to 250W (1.6x).

And that is not including the fact that at least 15% of the die is not dedicated to raw gaming performance and barely weighs on the TDP budget. This works against Navi: as size goes up, I reckon it won't fit within 250W. An interesting calculation then would be: how large would a shrunk-down 7nm 2080 be? I'm too lazy for that atm :p
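For what it's worth, that shrink estimate is quick to sketch. A minimal back-of-the-envelope, assuming TU104 (RTX 2080) at 545mm² on 12nm and an effective 12nm-to-7nm area scaling somewhere between 1.6x and 2.0x; both bounds are assumptions, since logic shrinks well on TSMC 7nm while SRAM and analog/IO scale far less:

```python
# Rough size estimate for a hypothetical TU104 (RTX 2080) shrunk to 7 nm.
# The area-scaling bounds are assumptions: logic shrinks well on TSMC 7 nm,
# but SRAM and analog/IO scale far less, so the truth sits in between.
tu104_area_12nm = 545.0  # mm^2, TU104 on TSMC 12FFN

optimistic = tu104_area_12nm / 2.0   # ~272 mm^2 if everything scaled 2.0x
pessimistic = tu104_area_12nm / 1.6  # ~341 mm^2 at a tamer 1.6x
print(f"7nm TU104 estimate: {optimistic:.0f}-{pessimistic:.0f} mm^2")
# For comparison, Navi 10 (RX 5700 XT) is 251 mm^2 on the same 7 nm node.
```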
 
You mean hardware that was used to accelerate something which had already been implemented in software for years by that point in time? Yeah, that's definitely the same thing. Peace out!
Not the same, but similar: there were no games to use the T&L that the GeForce 256 offered in August '99. Come in peace!

Vayra, no shrinking of old tech like AMD likes to do. Ampere will be 7nm or 5nm and new tech. Well, AMD patented something hybrid; Papermaster is good at papers.
 
And that is not including the fact that at least 15% of the die is not dedicated to raw gaming performance and barely weighs on the TDP budget. This works against Navi: as size goes up, I reckon it won't fit within 250W. An interesting calculation then would be: how large would a shrunk-down 7nm 2080 be? I'm too lazy for that atm :p
Yeah, they could probably fabricate a "monster" die-size Navi which would land somewhere between the 2080 and 2080 Ti while guzzling upwards of 300W and pretty much needing watercooling out of the box, just like the Fury X. But unlike the Fury X (which was also a failure, but at least remained somewhat competitive for a while - until you OCed the 980 Ti, of course, lol), it would be rendered obsolete almost immediately by the 2080 Super.
 
Not the same, but similar: there were no games to use the T&L that the GeForce 256 offered in August '99. Come in peace!

Vayra, no shrinking of old tech like AMD likes to do. Ampere will be 7nm or 5nm and new tech. Well, AMD patented something hybrid; Papermaster is good at papers.

Architecture development has been iterative since forever. Define 'new tech', please... are they going to ditch CUDA? :p We haven't seen a truly new architecture from Nvidia since Kepler, as in, no radical changes. Turing is also not a huge change, but it adds new bits to what was already there.
 
Fury X power consumption is only 10-15% higher than that of the 980 Ti, contrary to what you seem to think
The 980 Ti has 50% more VRAM (and not of the HBM variety). The difference is greater if you compare just the GPUs.
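A minimal sketch of that "just the GPUs" comparison, using the official board TDPs (275W Fury X, 250W 980 Ti) and assumed, purely illustrative memory-power budgets reflecting that GDDR5 plus its PHY draws more per GB than HBM:

```python
# Illustrative only: the memory budgets are assumed round numbers, not measurements.
fury_x_board, fury_x_mem = 275.0, 15.0        # W board TDP, assumed 4 GB HBM budget
gtx_980ti_board, gtx_980ti_mem = 250.0, 30.0  # W board TDP, assumed 6 GB GDDR5 budget

fury_x_gpu = fury_x_board - fury_x_mem            # ~260 W for the GPU itself
gtx_980ti_gpu = gtx_980ti_board - gtx_980ti_mem   # ~220 W
print(f"board-to-board: {fury_x_board / gtx_980ti_board:.2f}x")  # ~1.10x
print(f"GPU-to-GPU:     {fury_x_gpu / gtx_980ti_gpu:.2f}x")      # ~1.18x
```

Under those assumed budgets the GPU-only gap widens from ~10% to ~18%, which is the point being made above.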
 
Vayra, true. But, but, 480, 580, 590.
 
Not the same, but similar: there were no games to use the T&L that the GeForce 256 offered in August '99.

Let me put it another way: people were already implementing the exact same things in game engines, just not for dedicated hardware. Now let's see, how many engines were doing software ray-tracing before DXR?

You can't get 3D graphics on screen without projection matrices and whatnot, with or without hardware acceleration. Explain to me how this stuff wasn't the same before the amazing people at Nvidia released hardware T&L.

Vayra, no shrinking of old tech like AMD likes to do.

Pascal was pretty much a straight up shrink, change my mind.
 
Let me put it another way: people were already implementing the exact same things in game engines, just not for dedicated hardware. Now let's see, how many engines were doing software ray-tracing before DXR?

All of them, if they could. Unlike T&L, RTRT cannot be done in software with the silicon we have today.


Pascal was pretty much a straight up shrink, change my mind.

It's impossible to change your mind. If you were open to change/information you would have found this on your own: https://en.wikipedia.org/wiki/Pascal_(microarchitecture)#Details
 
GPUs marketed at hype-beast millennials... SMH
 
It would be just a guess, but considering RX 5700 XT consumes 225W and the slightly higher clocked anniversary edition consumes 235W, it's a good indicator that AMD have pushed the chip beyond the "sweetspot".
Welp:

[image: leaked Navi power-draw chart]
 
Aand now we have graphs about Navi power draw in a thread about Turing Super. Neat.
 
So? The Super cards have been released too, thus it's not really off topic.
It doesn't address the original assertion that Navi may be pushed beyond its sweet spot either.
 
So? The Super cards have been released too, thus it's not really off topic.
So...
The AMD 5700 and 5700 XT beat Nvidia on perf, perf/watt and perf/$.
The AMD 5700 beats the 2060 Super (and costs 5% less).
The AMD 5700 XT beats the 2070 Super (and costs 10% less).

You were asking?
 
It doesn't address the original assertion that Navi may be pushed beyond its sweet spot either.

Well, we can't really make that kind of assertion until thorough reviews are out. There aren't many details on those leaked power figures either: "whole system gameplay" might vary from game to game, temps might throttle clocks down, etc.

So...
The AMD 5700 and 5700 XT beat Nvidia on perf, perf/watt and perf/$.
The AMD 5700 beats the 2060 Super (and costs 5% less).
The AMD 5700 XT beats the 2070 Super (and costs 10% less).

You were asking?

I did ask @bug for some input, got my answer, and thanked it. Nothing to do with you.
 