
4070 Ti Super or 4080 Super

  • Thread starter: Deleted member 237269

Poll: Nvidia or AMD
  • Total voters: 56
  • Poll closed
The issue is that there isn't really much competition, especially at the top end.

If you want or need as much GPU power as you can get, then there is really only one product, and the price clearly reflects that. The fact that it's due to lack of competition is clear when you look at last gen: the 6900 XT was very close to the 3090 in performance, so the 3090 was only slightly more expensive... not nearly (or in some cases literally) twice as expensive as we are seeing now with the 4090 vs the 7900 XTX.

Although at the start of the generation it was much closer: $999 vs $1,599, which was similar, at least here in the States, to the 6900 XT and 3090 at $999/$1,499.
 
I guess people forget that when the Ryzen 5000 series launched, AMD wanted $300 for a 6-core the minute they got ahead, and it wasn't until Alder Lake released that they graced us with more affordable options.

I remember :D

(attachment: screenshot of a 5600X price listing)
 
Although at the start of the generation it was much closer: $999 vs $1,599, which was similar, at least here in the States, to the 6900 XT and 3090 at $999/$1,499.

Well, it starts at the top, but it affects the entire product stack. Nvidia basically slapped one number higher on every chip vs the 3000 series: the chip you would have expected in a 4050 instead became the 4060, the chip you would have expected in a 4060 Ti instead became the 4070 Ti, and so on.

I reckon it's the biggest price hike on the actual chips being sold since Kepler, when they came up with the Titan BS branding: the big chip was suddenly reserved for that, and the 680 got a wimpy GK104 chip, but for the same price as the previous x80 cards.

And precisely with the x80 card you could tell that Nvidia actually had competition last gen: the x80 card (3080) got the big chip for the first time since... the 580!

Now with the 4000 series, the 4080 is back to getting what is actually a mid-tier chip... which would have been OK if it had been priced accordingly (it's being sold at the same price as a freaking 2080 Ti, which was already terrible value!), and if there had been a 4080 Ti at a reasonable price with the big chip, as there had been in previous gens... but alas, this is the worst-value gen ever.
 
If I were to get a card today, it would be the 4070 Ti Super due to the 16 GB. Not to mention it does appear to be a solid performer in games: not the best, but a strong one. I would have considered the 7900 XTX if the price dropped by at least $100.

I've never seen a price like that for a 5600X, even during the pandemic. I bought the 5800X for $400. You guys must have been screwed pretty badly during that time.
 
It is safe to say that the 4080 Super will be ~5% faster than the 4080 and will match the 7900 XTX. The specs are known, so there will be no surprises there.
The 4070 Ti Super is a little different. Core specs-wise it should be ~7% faster, but it now has a 256-bit bus, so it doesn't have the standard version's weakness at high resolutions. Adding that should make it just faster than the 7900 XT (maybe 2-3%) at 1440p and above.
These results would also fit Nvidia's pricing, so the 4070 Ti Super will be better in price/performance and you'll need to pay extra for the premium 4080 Super. And AMD will cut $50 from the 7900s to stay competitive.
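A rough perf-per-dollar check of that claim, as a minimal sketch: the relative performance numbers below are the estimates from the post above (normalized so the 4080 Super = 100; they are assumptions, not measurements), and the prices are the announced or adjusted US MSRPs.

```python
# Price/performance sketch. rel_perf values are rough estimates taken
# from the post above (4080 Super = 100), NOT measured results.
cards = {
    "RTX 4080 Super":    {"price": 999, "rel_perf": 100},
    "RTX 4070 Ti Super": {"price": 799, "rel_perf": 87},   # assumed
    "RX 7900 XTX":       {"price": 999, "rel_perf": 100},  # "matches 4080 Super"
    "RX 7900 XT":        {"price": 749, "rel_perf": 85},   # assumed
}

for name, c in sorted(cards.items(),
                      key=lambda kv: kv[1]["rel_perf"] / kv[1]["price"],
                      reverse=True):
    print(f"{name:18s} {c['rel_perf'] / c['price'] * 1000:6.1f} perf per $1000")
```

On these assumed numbers the 4070 Ti Super does land ahead of the 4080 Super in perf per dollar, which is the post's point; where the 7900 XT ends up depends entirely on its street price.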
Ada's cache was increased 10-16x compared to Ampere, so bandwidth is not a problem. Besides, most PC gamers don't care about and don't use 4K. The 4070 Ti still matches the 3090 at 4K gaming: 192-bit/12 GB vs 384-bit/24 GB.
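For context on that bus-width comparison, here is the raw bandwidth arithmetic as a minimal sketch (per-pin data rates are from the cards' public spec sheets; Ada's much larger L2 cache is what compensates for the narrower bus):

```python
# Peak memory bandwidth (GB/s) = bus width in bits / 8 * per-pin data rate (Gbps)
def bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

cards = [
    ("RTX 3090       (384-bit, 19.5 Gbps)", 384, 19.5),
    ("RTX 4070 Ti    (192-bit, 21.0 Gbps)", 192, 21.0),
    ("RTX 4070 Ti S  (256-bit, 21.0 Gbps)", 256, 21.0),
    ("RTX 4080       (256-bit, 22.4 Gbps)", 256, 22.4),
]
for name, bus, rate in cards:
    print(f"{name}: {bandwidth_gbs(bus, rate):6.1f} GB/s")
# 3090 ~936, 4070 Ti ~504, 4070 Ti Super ~672, 4080 ~717
```

So on paper the 3090 still has nearly twice the raw bandwidth of the 4070 Ti; whether the cache hides that gap at higher resolutions is exactly what the scaling argument later in the thread is about.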

The leaks so far show that the 4070 Ti Super performs 7-10% better, which is just as expected; the bump to 256-bit and 16 GB of VRAM does close to nothing.

And 1440p is not "high res"; not even 3440x1440 is considered high res, IMO. 4K and up is high res, and that is where the best high-end cards make sense unless you use upscaling. But even the 4090 can struggle at native 4K, and the 4090 is simply much faster than any AMD card for 4K+ gaming.

Cutting $50 from the 7900 XT is not going to do it; it needs to be at least $100-150 cheaper than the 4070 Ti Super if they expect it to sell.

You didn't specify your use case. ;) But if your sig is correct and you already own a 7900 XTX, a 4070 Ti Super would be quite a downgrade. Even a 4080 would be a downgrade in raster performance.

4070 Ti Super (for 1080p/1440p/120 FPS)
4080 (for 1440p/ultrawide/4K/144+ FPS; might be the best bang for the buck on a "sell-off" discount)
4080 Super (for 1440p/ultrawide/4K/144+ FPS)

So, why exactly the switch? Why not get a replacement 7900 XTX & wait for the next gen?

4070 Ti Super 16 GB for 1080p? You're joking, I hope. Even a 4070 would be overkill.

The 7900 XTX is beaten by the 4080 in many games; it depends on game selection. Anyway, people are buying Nvidia because of features outside of pure raster, and features are exactly where AMD needs to improve big time.

DLSS and DLAA easily beat FSR, and features like these are used by tons of people in raster gaming.

I am going back to Nvidia myself from the 7900 XT because of issues (lack of driver focus in the games I play) and the lack of features in general. Raster performance is my last concern going forward; upscaling, RT performance and features in general will matter much more to me.

The fact that you only look at raster perf in 2024 tells me you don't really know the direction gaming is going. Recent and new games now use upscaling as anti-aliasing; upscalers have replaced AA, and Nvidia has a clear edge here. Also, many games incorporate RT elements, like Avatar and Metro Exodus EE, and more and more games will have RT elements going forward.
 
If I deleted every post in this thread that wasn't about the OP's question of 4070 Ti Super versus 4080 Super, we'd have maybe two pages of discussion.

The OP is interested in the 4070 Ti Super or 4080 Super. No need to bring in AMD at all.

I'm feverishly awaiting the reviews of the 4070 Ti Super and 4080 Super


I'll LQ posts (as I have below) if people insist on talking about other cards.
 
Low quality post by adilazimdegilx

The 4070 Ti losing ground against both the 4080 and the 7900 XT shows me that the increased cache doesn't help enough. And we should also consider that, traditionally, Nvidia does better than AMD at 1440p and higher resolutions, so it's even worse for them to lose ground against the rival card. Moreover, seeing the same results repeated in pretty much every media outlet convinces me that its 192-bit bus just doesn't cut it.

I mean, you obviously looked at the charts too, since you mentioned the 4070 Ti and 3090 being identical at 4K. But I don't know why you omit that the 4070 Ti is actually faster than the 3090 at both 1440p (7%) and 1080p (10%) in the charts on the same page. Some specific results go even further than that: for example, in Elden Ring at 1080p the 4070 Ti easily beats the 3090 (16%) and the 3090 Ti (10%), yet it actually gets beaten by the 3090 (2%) at 4K. That's an 18-point swing in relative performance just from the increased render resolution, and the 4080 only loses about 4% against the 3090 in the same game, so this is obviously not an architectural issue either.
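To make that arithmetic explicit, here is a minimal sketch; the FPS values are synthetic placeholders shaped to mirror the percentages quoted above, not the actual review numbers:

```python
# How much of a card's lead evaporates when moving from 1080p to 4K.
# FPS values are hypothetical placeholders, not measured results.
def relative_lead(fps_a: float, fps_b: float) -> float:
    """Percentage lead of card A over card B."""
    return (fps_a / fps_b - 1) * 100

# Card A leads by 16% at 1080p...
lead_1080p = relative_lead(fps_a=116.0, fps_b=100.0)
# ...but trails by 2% at 4K.
lead_4k = relative_lead(fps_a=49.0, fps_b=50.0)

print(f"1080p lead: {lead_1080p:+.1f}%")  # +16.0%
print(f"4K lead:    {lead_4k:+.1f}%")     # -2.0%
print(f"swing:      {lead_1080p - lead_4k:.1f} points")  # 18.0
```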

I don't want to get into what counts as high res, but isn't that even worse for the 4070 series if you don't consider native 1440p high res? Because it is already starting to lose ground at those render resolutions.

I already said that the new core specs will give a 7% increase in performance, and the 4070 Super results indicate that. But I need to add that Nvidia didn't bump the TDP with the Ti Super like it did with the 4070 Super, so 7% is actually the best-case scenario from the increased core count alone for the Ti Super; any improvement above that would come from the wider bus, so we will see. I also need to add that if it only gets 7%, it actually cannot beat the 7900 XT in raster, which would be a shame really; it needs 10% to match it. Still, I expect it to beat it by 2-3% with the help of the 256-bit bus.

Lastly, the 7900 XT is already selling at $709 (again), so that's already about $100 below the 4070 Ti Super's MSRP. I don't know how we are not on the same page on that.
 
Low quality post by Nekajo
"Looking at our test results, the 12 GB VRAM config is perfectly sufficient for all games except Alan Wake 2 RT in 4K, which runs at sub-30 FPS, even with more memory. This means that you'll have to turn on DLSS upscaling either way, which lowers the game's render resolution and the VRAM requirements accordingly."

A single game at 4K native (one that ran out of conventional performance too).
Yeah, this is exactly what I'm saying: pushing any current-gen mid-range or higher card into VRAM issues will make the GPU buckle anyway. VRAM won't save you going forward.

Alan Wake 2 is pretty much optimized to use upscaling too. Running it at native UHD makes no sense at all, especially not with path tracing, which makes even a 4090 struggle hard.

12-16 GB is plenty for 1440p-2160p, and I doubt this will change any time soon. Maybe in 4 years or 2 generations we will start seeing games pushing 16 GB at 1440p on max settings, but you will need the GPU power to run them as well; VRAM alone won't save you here.

The only reason the 7900 XTX and 4090 have 24 GB is their 384-bit bus. Both AMD and Nvidia knew it was pure overkill, for gaming that is.
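On the point from the review quote above, that enabling DLSS lowers the render resolution and with it the VRAM requirements, here is a quick sketch of the pixel counts involved at a 4K output. The per-axis scale factors are the commonly published DLSS 2 ratios; the actual VRAM saved varies per game.

```python
# Pixel-count reduction from DLSS upscaling at a 3840x2160 output.
# Per-axis scale factors are the commonly cited DLSS 2 ratios.
modes = {
    "Native":           1.0,
    "DLSS Quality":     2 / 3,
    "DLSS Balanced":    0.58,
    "DLSS Performance": 0.5,
}

out_w, out_h = 3840, 2160
for mode, s in modes.items():
    w, h = int(out_w * s), int(out_h * s)
    print(f"{mode:16s} renders {w}x{h} "
          f"({w * h / (out_w * out_h):.0%} of native pixels)")
```

DLSS Quality at 4K renders at 2560x1440, only ~44% of the native pixel count, which is why it relieves both shader load and VRAM pressure.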


The 4070 Ti performs like a 3090 Ti at 1440p and a 3090 at 2160p when you look at the overall picture. Considering that the 4070 Ti uses much less power, almost half of the 3090 Ti, I don't really see this as an issue for most PC gamers, who don't use 2160p to begin with. Also, you get full DLSS 3 support and Frame Gen on top.


Let's have a look at TechPowerUp's recent AAA game testing; it seems to me that Ada is only getting faster:

4070 Ti beats 3090 in Avatar 4K Ultra

4070 Ti beats 3090 in Alan Wake 2 4K Ultra

4070 Ti beats 3090 in Lords of the Fallen 4K Ultra

4070 Ti very easily beats 3090 in AC Mirage 4K Ultra, and even beats the 3090 Ti at 4K as well


A 192-bit bus and 12 GB of VRAM seem to beat 384-bit and 24 GB of VRAM in all these games; the 3090 is really struggling because it lacks GPU power.

On top of that, the 4070 Ti even has full DLSS 3 feature support including Frame Gen, which the 3090 doesn't support.

I'd personally not buy either for native 4K gaming, but if you offered me the choice between these cards, I'd choose the 4070 Ti over the 3090 for sure. I will take a faster GPU with newer features and much lower power usage over "more VRAM". Personally, I don't care much about 4K gaming; 1440p is still the absolute sweet spot for me.

If I were a 4K gamer, I'd buy a 4090.
 
I guess we are getting even further apart, and obviously people don't like what we are talking about, so this will be my last reply, @Nekajo.

I'm not comparing the 3090s and 4070s to see which is better. Obviously, I would prefer the 4070 Ti over them too. I'm only comparing them to show that the 192-bit bus width (not the 12 GB VRAM, which is unrelated) holds the 4070 Ti back. And now that Nvidia is giving it 256-bit, it should be in better shape relative to the competition than the 4070 Ti was, especially at 1440p and 4K.

In every benchmark you linked, the same thing holds: the 4070s lose ground against the 3000s and 7900s at higher resolutions. Just look at the Avatar charts and compare the 4080 vs the 4070 Ti, and the 4070 vs the 3080 10GB. They get destroyed at 4K and 4K only. And this is not a VRAM size issue (the 3080 has less VRAM than the 4070, after all); this is (most likely) the 192-bit bus width holding them back. I need to add that I haven't spoken about VRAM capacities so far, nor about software advantages; I don't know why you needed to include them in your reply.
 
The first leaks of the 4070 Ti Super show a ~8% improvement vs the 4070 Ti. So yeah, might be going one notch up to the 4080 Super.

Bus width matters as resolution increases, but 1440p is not high res these days, including 3440x1440. The 4070 Ti is not getting destroyed in any game. Just because you lose 10% performance going from 1440p to 2160p does not mean the card struggles at all, and most people with 4K monitors will probably be using DLSS, or eventually will, since the 4090 is the only true 4K+ card this gen (for demanding AAA games) and AMD has nothing that even comes close at this resolution.

Are the reviews going live today or tomorrow? I need to put my 7900 XT up for sale soon, I guess. I'm leaning towards the 4080 Super to get the full AD103 die and the higher-speed memory.

Please update the non-Super FPS numbers before you compare them, or at least mention it in the review. Drivers have increased performance since the 4000 series launched, for sure; I've seen it in other testing.

I'm ready to pull the trigger as soon as the reviews go live :slap:
 
I can see quite a few 8/12/16 GB cards beating 11/16/24 GB cards.
(attachment: benchmark chart comparing cards by VRAM capacity)

Damn, even the 8 GB 4060 Ti is beating the 16 GB one?

Remember kids, allocation ≠ usage.
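To make "allocation ≠ usage" concrete: the counters most overlays and tools report come from the driver and show memory that has been allocated, not what the game actively touches each frame. A minimal sketch reading those counters via NVML (assumes the `pynvml` bindings, installed with `pip install nvidia-ml-py`, and an NVIDIA GPU):

```python
# Reads driver-reported VRAM counters via NVML.
# Note: "used" is memory ALLOCATED on the device, an upper bound on
# what running applications actually need per frame.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
info = pynvml.nvmlDeviceGetMemoryInfo(handle)   # sizes in bytes

print(f"total:            {info.total / 2**30:5.1f} GiB")
print(f"used (allocated): {info.used  / 2**30:5.1f} GiB")
print(f"free:             {info.free  / 2**30:5.1f} GiB")

pynvml.nvmlShutdown()
```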
Overall, having "12GB" isn't causing any problems today, but just looking at the average framerate is a very weak way to analyze this.

Sometimes the average remains high, but stutters appear during gameplay, or rendering bugs show up from aggressive memory management fighting the lack of VRAM.

Again... I'm not saying this is the case, just that the average doesn't prove anything.
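To illustrate why a high average can coexist with visible stutter, here is a minimal sketch with synthetic frame times; a single 60 ms hitch every 100 frames barely moves the average but craters the 1% low:

```python
# Average FPS vs 1% low from a frame-time log (milliseconds).
# Synthetic data: steady 10 ms frames with one 60 ms hitch per 100 frames.
frame_times_ms = [10.0] * 99 + [60.0]

def avg_fps(ft):
    return 1000 * len(ft) / sum(ft)

def one_percent_low_fps(ft):
    worst = sorted(ft, reverse=True)   # slowest frames first
    n = max(1, len(ft) // 100)         # the worst 1% of frames
    return 1000 / (sum(worst[:n]) / n)

print(f"average: {avg_fps(frame_times_ms):5.1f} FPS")              # ~95 FPS
print(f"1% low:  {one_percent_low_fps(frame_times_ms):5.1f} FPS")  # ~17 FPS
```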
 

Someone fully bought all the Nvidia marketing slides!
 
I tried. Thread locked.

Here's another one. :banghead:

 
Status: Not open for further replies.