
NVIDIA GeForce RTX 4080 Super Founders Edition

I admit that I do not know the exact current power draw of the 7900 XTX (with the newest driver), but it has been a problem for a long time.


When you see the power draw of the 7900 XTX in this video compared to the 4080, you may think that if a 4080 can work with a 750 W PSU, a 7900 XTX might work fine with an 850 W one, but just ask 7900 XTX owners what PSUs they are actually using to run the card stably...
I'm running my XTX on a Corsair AX850 Titanium PSU
 
View attachment 332429

See my picture. I cannot assess professional applications, but Nvidia probably wins there too. So the 4080 is winning 8:3 or 9:3 and is worth the $200 extra, or more.

AMD needs to drop the price of the 7900 XTX significantly.
Uhhhhh, you're going to have to define the wording in your chart, because your phrasing acts like these cards are miles apart in the categories you chose, especially considering some entries are just wrong. The metrics seem all over the place, so I can't tell how you are defining certain categories.

For starters, rasterization performance is better on the 7900 XTX, so I'm not sure why both are listed as if they were exactly the same.
How do you define RT performance as very good on the 4080 yet just adequate on the 7900 XTX? I agree the 4080 is higher, but the only card out that really does well in ray tracing is the 4090, so what are you using to make that judgement?
The 7900 XTX has lower idle power draw, so that line is just plain wrong. Also, even if the two were reversed, why would a couple of watts make that much difference in judgement?
Multi-monitor and video playback are higher by around 20 watts, so it's very confusing why you consider one high and the other low.
Then 3D load: again, define it, because the difference is like 50 watts. Why are we talking very good versus adequate?
PSU requirements are also wrong. I see 800 W, so I'm not sure where you're getting 1000 W+ unless you're picking one of the extreme editions.

I don't see any major reason to drop the price of the 7900 XTX other than that people would like it (I mean, I would love it if all the prices dropped). The cheapest 7900 XTX (not including an open-box deal I see for $899) is $949, while the cheapest 4080 Super is $999. So it's 50 dollars less while performing about 3-4% higher overall in rasterization, which is how the majority of people are going to use these cards. Ray tracing is selective and a performance killer on everything. Power consumption differences are minimal (a 100-watt difference on average is really where you might start talking about it, because it can vastly change cooling requirements, power delivery, etc.). Higher VRAM is generally better if you don't replace your card every year, as we have seen in the past (albeit both have a good amount). Last would really be frame generation, which both have (though Nvidia's has the edge overall).
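To put rough numbers on that, here's a quick back-of-the-envelope sketch. The $949/$999 listings are the ones quoted above; the ~3.5% raster delta is an assumed midpoint of the 3-4% range, and real deltas vary by game suite:

```python
# Hypothetical price-per-performance comparison using the listings quoted above.
xtx_price, super_price = 949, 999    # cheapest listings cited in this post (USD)
xtx_perf, super_perf = 1.035, 1.00   # assumed relative raster performance (4080 Super = 1.00)

print(f"7900 XTX:   ${xtx_price / xtx_perf:.0f} per unit of raster performance")
print(f"4080 Super: ${super_price / super_perf:.0f} per unit of raster performance")
# -> roughly $917 vs $999: the XTX works out ~8% cheaper per frame in pure raster
```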

Totally agree. Realtime ray tracing is the most grossly overrated gimmick in today's gaming world. I can name only two games where ray tracing makes a little difference in graphics fidelity:
Cyberpunk (not worth the performance impact) and The Ascent (superbly implemented with minimal performance impact). Other than that, it's a disabled-by-default "feature", together with the crappy DLSS. And yes, without RTRT, even a 3080 can play the latest games with full details and FPS.


Relax, my friend. They are making most of their money with their overly expensive data center professional cards, not the gaming ones. Their return on investment is probably around 1000% or more, so no need to cry for them.
Totally agree.

I admit that I do not know the exact current power draw of the 7900 XTX (with the newest driver), but it has been a problem for a long time.


When you see the power draw of the 7900 XTX in this video compared to the 4080, you may think that if a 4080 can work with a 750 W PSU, a 7900 XTX might work fine with an 850 W one, but just ask 7900 XTX owners what PSUs they are actually using to run the card stably...
Ok dude, we literally have the numbers from the review on here. It is not that much of a difference...

Totally agree. A two-year-old GPU selling for $300-400 more than it should is by far the worst investment anyone can make at the moment. It's a no-brainer.
I really miss high-end GPUs being under $1k. It's making PC gaming significantly more difficult to keep up with and for people to get into.
 
I really miss high-end GPUs being under $1k. It's making PC gaming significantly more difficult to keep up with and for people to get into.
Yup, I remember buying flagship GPUs for $500.

But that was in like 2002, and I was younger and making less money. PC gaming has moved from teens to young adults and older adults with more disposable income.
I would not have been able to afford current pricing in 2002; now it's easier because I make more money.
 
Yup, I remember buying flagship GPUs for $500.

But that was in like 2002, and I was younger and making less money. PC gaming has moved from teens to young adults and older adults with more disposable income.
I would not have been able to afford current pricing in 2002; now it's easier because I make more money.
I remember buying dual-GPU cards for $699 a decade ago. Heck, my R9 290X trio was bought 10 years ago for $549 apiece. I am still amazed we have more than doubled in price in 10 years.
 
The 7900 XTX has lower idle power draw, so that line is just plain wrong. Also, even if the two were reversed, why would a couple of watts make that much difference in judgement?
Multi-monitor and video playback are higher by around 20 watts, so it's very confusing why you consider one high and the other low.
Then 3D load: again, define it, because the difference is like 50 watts. Why are we talking very good versus adequate?
I just used two words for good and less than good.

You are correct, idle power draw was never bad. On the other hand, video playback and multi-monitor power draws were a disaster at launch and are still more than twice as high.

View attachment 332498
 
I just used two words for good and less than good.

You are correct, idle power draw was never bad. On the other hand, video playback and multi-monitor power draws were a disaster at launch and are still more than twice as high.

View attachment 332498

I love when people cling to the silly power usage argument. Don't get me wrong, be aware and respectful of your energy footprint, but do you watch YouTube videos 24 hours a day, every day, for the entire year?

[Attachment: IMG_5257.jpeg]

Even based on this massively unrealistic scenario, if you can't afford $2 more a month, there's no way you're spending a grand-plus on a GPU. That's before the $100-300 price gap that has long existed between the 7900 XTX and the 4080 non-Super, which would take 5+ years to favor the 4080 non-Super (longer than the standard lifetime of the card). Absolutely irrelevant argument, even with the new "MSRP".
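If you want to sanity-check that, here's a minimal sketch of the arithmetic. The 20 W gap, 4 hours/day of use, and $0.15/kWh rate are illustrative assumptions, not measured figures from the review:

```python
# Rough monthly electricity-cost difference between two cards (illustrative numbers).
delta_watts = 20          # assumed playback/multi-monitor power gap between the cards
hours_per_day = 4         # assumed realistic usage, rather than 24/7 playback
rate_usd_per_kwh = 0.15   # assumed US-ish residential rate; EU rates can be 2-3x higher

extra_kwh_per_month = delta_watts / 1000 * hours_per_day * 30
extra_cost = extra_kwh_per_month * rate_usd_per_kwh
print(f"Extra cost: ${extra_cost:.2f}/month")  # ~$0.36/month; ~$1.08 even at triple the rate
```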
 
I love when people cling to the silly power usage argument. Don't get me wrong, be aware and respectful of your energy footprint, but do you watch YouTube videos 24 hours a day, every day, for the entire year?
What do you mean? I think it is very common for people to have more than one monitor. Even I do, and I don't have a high-end setup.

FYI, electricity costs in Europe are multiple times higher than in the USA.
 
What do you mean? I think it is very common for people to have more than one monitor. Even I do, and I don't have a high-end setup.

FYI, electricity costs in Europe are multiple times higher than in the USA.

Even at triple the electricity cost, under more normal usage you're talking about a $1 difference per MONTH between the two cards. It's irrelevant unless we're talking about 300 W vs 600 W or more.
 
...it's a joke to discuss energy consumption and bills when someone is willing to pay $1,000 for a halo product.
There are so many arguments and reasons to go with or to avoid a 4080. But the energy consumption? Really?
The fact that it is efficient is nice, but if it were not, it wouldn't make any difference.

Practically, 1,000 dollars/pounds/euros is a lot of money, and this price level instantly identifies the GPU as a halo product.
In this category of GPU, only performance and features matter.
The 4080/S has been and will be a better product than the 7900 XTX. There shouldn't even be an argument here.

The price is the only thing that can put these two GPUs in comparison.
If the 4080/S costs 1000, then the 7900 XTX has to go to 800 or lower. If the 4080/S costs 800, again the 7900 XTX has to go to 600 or lower.

The Radeons have to face used Ampere cards as well, not only Ada. I would easily consider buying a 3080 Ti/3090 instead of a 7900 XT/XTX.
 
Well, the 7900 XTX in my country is $1,150, while I purchased a 4080 Super for $1,481. The reason I didn't go with AMD is that I'm using a 4K 144 Hz monitor. I had previously been using a 6900 XT with it, and while playing Hogwarts Legacy I realized that FSR Quality was making the trees and hair too jaggy, so I switched to a 3090, and DLSS was perfect; it was even better than native. Yet 4K 144 Hz is too taxing to play without DLSS frame gen, so I switched to a 4080 Super yesterday. I fully support buying an AMD GPU for FHD, because DLSS will suck there as well, so you have to resort to rasterization; for 1440p both are equal; but for 4K I believe Nvidia is the better choice, as DLSS is better and DLSS Balanced gives the same image as FSR Quality. So I wouldn't mind paying extra for Nvidia; the reason I went with 4K is image quality, so why sacrifice that?

If I were gaming at 4K today, I'd bag a 4080S too. I've seen a bunch of reviews where nV upscaling from 1440p to 4K delivers some pretty great results, with the reviewer in some games suggesting better-than-native quality output. I can't say the same with my current 1440p panel. So I agree that at 4K, paying the added premium above the XTX's price point makes sense, more so in the long run, especially when it's returning real, tangible performance/quality rewards.

I was previously referencing the 4090, a price-point which I defo wouldn't touch.
 
I just used two words for good and less than good.

You are correct, idle power draw was never bad. On the other hand, video playback and multi-monitor power draws were a disaster at launch and are still more than twice as high.

View attachment 332498
And they improved it in a driver update... The GPU got better with updates like they all do. The AMD cards have more memory and stay clocked higher in those situations for a good reason regarding performance in multi-monitor and playback. Regardless, you are talking about less than 1 USD a month difference on cards in the $1k range (not to mention even less if you factor in performance per watt).

You also used words that implied a larger gap. These cards do not have a huge gap in almost anything (the most notable would probably be ray tracing). Charts like that, implying miles-long gaps between cards, are bad information for people purchasing a card or talking to others about reviews. Both cards have trade-offs at the end of the day, and neither has a significant lead in any category.

The 4080/S has been and will be a better product than the 7900 XTX. There shouldn't even be an argument here.

The price is the only thing that can put these two GPUs in comparison.
If the 4080/S costs 1000, then the 7900 XTX has to go to 800 or lower. If the 4080/S costs 800, again the 7900 XTX has to go to 600 or lower.
How so? The overall graph puts the 7900 XTX ahead in overall gaming performance. Saying it needs to be $200 below to be competitive while giving higher performance is just wishful thinking. I think there is an argument for both at this price, but it's not cut and dried.
 
And they improved it in a driver update... The GPU got better with updates like they all do. The AMD cards have more memory and stay clocked higher in those situations for a good reason regarding performance in multi-monitor and playback.

AMD repeatedly fixes and breaks the multi-monitor and playback power draw, and it's hilarious that this has been going on for years.

And what do you mean by "performance in multi-monitor and playback" - does it multimonitor more than Nvidia? Is the playback faster?

:-P
 
AMD repeatedly fixes and breaks the multi-monitor and playback power draw, and it's hilarious that this has been going on for years.

And what do you mean by "performance in multi-monitor and playback" - does it multimonitor more than Nvidia? Is the playback faster?

:p
Probably should have phrased it better. I meant they choose to keep the RAM clocked higher to avoid issues with high-refresh-rate setups. There was an article about it a long time ago explaining why they decided, across their lineup, to keep memory clocked higher. I don't remember everything that was said, but I think it has to do with high refresh rates and multi-monitor performance.
 
Anyone in the market for a 4080 probably already bought one, so I'm not sure who this is actually for... people who would never buy an AMD card and were holding out for something better from Nvidia for $1k, I guess.

The Super lineup has been pretty meh: the 4070 S is still stuck with 12 GB and a $600 price tag, the 4070 Ti S is kinda meh, and the 4080S is just a price cut. I guess when AMD can only compete at raster with Nvidia's small dies, this is the best we get.
 
Better than selling the originals all the way through. I still can't get over the fact that they cut the 4070 Ti S's L2$, but it kind of makes sense. No free lunch here.
And this is a full chip, excellent. Barring the humongous PCB for a tiny chip.
 
GeForce RTX 4080 Super introduces a noteworthy $200 price reduction compared to the non-Super 4080
So at least there's an upside. Not much improvement, but a chunk off the price. Still an ugly card, though. The Zotac and PNY models are much more visually appealing.

1% over the non-Super? This must be a joke. I believe they don't want to outshine the next Nvidia GPUs that are coming this year.
Did you miss the price reduction aspect?
 
... the 4080S is just a price cut.



In the EU, the price of the RTX 4080 was roughly just 50 EUR higher before the release of the RTX 4080 SUPER, and many online stores have "discounted" old 4080 cards, but they are barely discounted below the price of the "new, better, faster" 4080 SUPER, because it's basically the same card. Many more are just leaving them at basically the same price, or even trying to sell the old cards for a higher price. As I have heard, this low demand is met with equally low volume, and we might see stores with no 4080 SUPER stock but still plenty of old non-Super cards.

Instead of "why are you whining, you're getting almost 20% discount" here it's literally almost nothing.
 
View attachment 332577

In the EU, the price of the RTX 4080 was roughly just 50 EUR higher before the release of the RTX 4080 SUPER, and many online stores have "discounted" old 4080 cards, but they are barely discounted below the price of the "new, better, faster" 4080 SUPER, because it's basically the same card. Many more are just leaving them at basically the same price, or even trying to sell the old cards for a higher price. As I have heard, this low demand is met with equally low volume, and we might see stores with no 4080 SUPER stock but still plenty of old non-Super cards.

Instead of "why are you whining, you're getting almost 20% discount" here it's literally almost nothing.

My buddy grabbed one for $999 with tax, but yeah, other countries get the shaft for sure.

Basically the 4080 Slightly less lube edition.
 
My buddy grabbed one for $999 with tax, but yeah, other countries get the shaft for sure.

But what were the vanilla RTX prices before the SUPER release?

Here we have seen, over those 1.5 years, a fall from almost 1300 EUR to roughly 1150 EUR for cheap base models, and base models of the 4080 SUPER released at about 1100 EUR. So people still argue that technically it's 150 EUR off; you have to somehow compare prices at release and ignore the price reductions, because "the official MSRP" stayed the same.
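Here's a small sketch of why the "discount" looks so different depending on the baseline. The EUR street prices are the rough figures above; the $1,199/$999 USD MSRPs are the official launch prices:

```python
# MSRP-vs-MSRP cut looks big; EU street-price-vs-launch-price cut is tiny.
msrp_4080, msrp_super = 1199, 999        # official USD MSRPs at each card's launch
street_4080, launch_super = 1150, 1100   # rough EUR prices quoted above

msrp_cut = (msrp_4080 - msrp_super) / msrp_4080 * 100
street_cut = (street_4080 - launch_super) / street_4080 * 100
print(f"On paper:    {msrp_cut:.1f}% off")    # ~16.7%
print(f"In EU shops: {street_cut:.1f}% off")  # ~4.3%
```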
 
View attachment 332577

In the EU, the price of the RTX 4080 was roughly just 50 EUR higher before the release of the RTX 4080 SUPER, and many online stores have "discounted" old 4080 cards, but they are barely discounted below the price of the "new, better, faster" 4080 SUPER, because it's basically the same card. Many more are just leaving them at basically the same price, or even trying to sell the old cards for a higher price. As I have heard, this low demand is met with equally low volume, and we might see stores with no 4080 SUPER stock but still plenty of old non-Super cards.

Instead of "why are you whining, you're getting almost 20% discount" here it's literally almost nothing.
Short answer, yes it is.
 
Short answer, yes it is.
Enough to call it a "chunk"? Or are we just feeling SUPER generous?

Reviews almost universally ignore the fact that you could buy an RTX 4080 below the MSRP, and that there isn't really "$200 off". Do the reviewers just not care, because they don't buy their cards and don't have to follow the prices, or was this one of the lines in Nvidia's reviewer guidance that you have to check?
 
Enough to call it a "chunk"? Or are we just feeling SUPER generous?
$200 BELOW the non-Super 4080's MSRP? That doesn't seem like a decent chunk of change to you?

Reviews almost universally ignore the fact that you could buy an RTX 4080
That is highly dependent on where and when you look. They're not ignoring the point so much as just not focusing on it. MSRP vs. MSRP, the difference is $200. Full stop.
 
$200 BELOW the non-Super 4080's MSRP? That doesn't seem like a decent chunk of change to you?


That is highly dependent on where and when you look. They're not ignoring the point so much as just not focusing on it. MSRP vs. MSRP, the difference is $200. Full stop.

Yes, and at the RTX 5080 release at $1800 you all will argue that it's completely fair, a 50% performance increase for a 50% price increase, "yOu hAvE tO diSreGarD pRiCe CuTs! FuLl StOp!"
 
Probably should have phrased it better. I meant they choose to keep the RAM clocked higher to avoid issues with high-refresh-rate setups. There was an article about it a long time ago explaining why they decided, across their lineup, to keep memory clocked higher. I don't remember everything that was said, but I think it has to do with high refresh rates and multi-monitor performance.
I MULTI MONITOR MORE THAN YOU DO, PAL! - AMD
 