
NVIDIA GeForce RTX 4080 Could Get a Price Cut to Better Compete with RDNA3

That's not how it works. By your logic, and going way back, GPUs should probably cost upwards of $3-4k today because each generation is faster than the last. Or, because Intel released the 9980XE at nearly $2k some years ago, it's reasonable that every CPU that's faster should cost more than $2k.
? My logic is performance that used to cost $1600 now costs $1200, and will probably cost $999 pretty soon, while drawing 200 W less power.
Nvidia decisions have finally led me to the point of not caring about their products no matter of the price.
From me as an individual they will not see another cent.
As opposed to who? AMD? With their 7900XTX, also priced at $999, which by their own words competes with the 4080?
 
Completely reasonable, yes. At $1000/$1100 it would have price parity with the 7900XTX while having more dedicated features like CUDA/RTX, and being roughly the same in raster/efficiency. Everyone is just caught up in xx80-tier expectations instead of looking at where the product sits performance-wise.
No, some are confused. Their past purchasing habits have confused them into thinking any of these prices were ever sane, and though AMD follows, Nvidia set the prices-to-the-f£#@£££-moon trend.

You can buy a family of five a laptop each for the cost of a typical 4090.

A 4080 at this super new low price would still buy two low-end gaming laptops. Value it is not.
Tearing new ring pieces is what it is, still.
 
Regarding the poll and prices, I wouldn't consider any 12VHPWR connector for the foreseeable future. I've never had any 6/8-pins melt on me in a decade-plus. Even if it is a minority of cards, even if it's user error, count me out. I don't want to keep worrying about melting my computer or even burning down my house if I step out. When AMD offers something in the sub-225 W range, I'll go red team for the first time since the HD 7000 series.
Just make sure the connector is plugged in properly. Why is this such a difficult concept for so many people to understand? This is not sarcasm; I'm genuinely unable to grasp how we've come to a point where users being incapable of plugging power cables into computer equipment, or failing to verify that the connection is mated correctly, is somehow the manufacturer's fault. The level of stupidity astounds me.

I'll make it simple for you: are you a dumbass who is incapable of performing the basic operation of "insert plug A into receptacle B and ensure that clip C is engaged"? If you are, then yes you should avoid the ATX12VHPWR connector... as well as literally every other PC power connector ever invented.

What I'm saying is, if you are stupid enough to cause your RTX 4090 to melt, you shouldn't be building PCs at all.
I don't know what is going on with the 12 pin connectors on the 4090. Maybe they are lower quality.
Nothing to do with connector quality and everything to do with more users being able to afford the $1,600 4090 as opposed to the $2,000 3090 Ti. So just a bigger sample size.
? My logic is performance that used to cost $1600 now costs $1200, and will probably cost $999 pretty soon, while drawing 200 W less power.
You're moving the goalposts again to avoid engaging with the argument presented. It's very obvious and it's not helping your position. If you want to help your position, engage with the actual argument, or GTFO.
 
You're moving the goalposts again to avoid engaging with the argument presented. It's very obvious and it's not helping your position. If you want to help your position, engage with the actual argument, or GTFO.
Really. So by saying in my first comment that it's faster than the previous-gen Titan-tier card while being cheaper, then expanding on that in later comments, I'm moving the goalposts? Interesting.
 
Really. So by saying in my first comment that it's faster than the previous-gen Titan-tier card while being cheaper, then expanding on that in later comments, I'm moving the goalposts? Interesting.
And "? My logic is performance that used to cost $1600 now costs $1200, and will probably cost $999 pretty soon, while drawing 200 W less power."



By your logic, should we not be paying server prices for CPUs?
I mean, consumer core counts are so far into server territory that perhaps $10K is what we should be paying for a CPU, no?!
 
Really. So by saying in my first comment that it's faster than the previous-gen Titan-tier card while being cheaper, then expanding on that in later comments, I'm moving the goalposts? Interesting.
Hasn't that been the norm in literally every generation previously?
 
By your logic, should we not be paying server prices for CPUs?
I mean, consumer core counts are so far into server territory that perhaps $10K is what we should be paying for a CPU, no?!
Servers have more RAM channels, more PCIe lanes, ECC, and are tuned for different jobs. Don't try to strawman my point, please.
 
Really. So by saying in my first comment that it's faster than the previous-gen Titan-tier card while being cheaper, then expanding on that in later comments, I'm moving the goalposts? Interesting.
Yes. Because nobody is arguing against your points. But those points are irrelevant to the argument at hand. Which is that NVIDIA GPU prices have risen unreasonably across generations.
 
Servers have more RAM channels, more PCIe lanes, ECC, and are tuned for different jobs. Don't try to strawman my point, please.
Strawmanning a strawman's argument and that's not on? Ha haaaa, whatever. You got my point, hence f-all in reply.

I was clearly pointing to the CPU performance gains.

You talk like a shareholder.
 
Yes. Because nobody is arguing against your points. But those points are irrelevant to the argument at hand. Which is that NVIDIA GPU prices have risen unreasonably across generations.
If it's so unreasonable, then I'm sure the market will reflect that in GPU marketshare. We'll see if anything changes there for NVIDIA, but I doubt it. When you're leading in performance, software support, and features, you can set the prices, and this isn't something that NVIDIA will ever stop doing, as they are a business. Would I like to see a $500 4080? Sure. Will it happen? No. Process nodes are only getting more expensive, and demand for GPUs isn't going to dry up.
 
The 3080 at £650 was faster than the fastest gaming-tier Turing card, the 2080 Ti, at £1200.

The 4080 at £1200 is faster than the fastest Ampere card, the 3090 Ti, at £1600?

I suppose for some people that is reasonable. However, regardless of how people want to justify the price point, the sales and availability of the 4080 clearly show that justification is not the general consensus. And if sales are not good (as many outlets are suggesting), then the price point is wrong. It's that simple.
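To put rough numbers on that pattern, here's a quick back-of-the-envelope sketch using the launch prices quoted above (approximate UK prices; treat the exact figures as illustrative):

```python
# What it cost each generation to beat the previous flagship,
# using the approximate UK launch prices quoted in the post above.
generations = [
    # (new card, its price, old flagship it beats, flagship's price)
    ("RTX 3080", 650, "RTX 2080 Ti", 1200),
    ("RTX 4080", 1200, "RTX 3090 Ti", 1600),
]

for card, price, flagship, flagship_price in generations:
    discount = 1 - price / flagship_price
    print(f"{card} at £{price} undercuts the {flagship} "
          f"(£{flagship_price}) by {discount:.0%}")

# The 3080 undercut the outgoing flagship by ~46%; the 4080 only by ~25%.
# "Faster than last gen's flagship for less" is a much weaker deal this time.
```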
 
Every wafer NVIDIA makes can either go for $$$ in the datacentre/professional market, or $$ in the consumer market. If AMD had products that actually competed in the professional space, maybe there wouldn't be so much incentive to focus on increasing margins in the consumer market to bring them in line with what NVIDIA can make per wafer elsewhere. And if AMD really wanted to start a price war, they could have used all of those $$$-saving techniques they boasted about to actually lower prices. As it is, the competition offers roughly the same basic raster performance, for slightly cheaper, as ever, but without the full stack of software and hardware support NVIDIA offers, so nothing will change.

The 3080 at £650 was faster than the fastest gaming-tier Turing card, the 2080 Ti, at £1200.

The 4080 at £1200 is faster than the fastest Ampere card, the 3090 Ti, at £1600?

I suppose for some people that is reasonable. However, regardless of how people want to justify the price point, the sales and availability of the 4080 clearly show that justification is not the general consensus. And if sales are not good (as many outlets are suggesting), then the price point is wrong. It's that simple.
Yeah, that's probably why they're going to reduce the price. NVIDIA responds to consumer demand, just like with the 4080 12 GB being pulled.

The 3080 at £650 was faster than the fastest gaming-tier Turing card, the 2080 Ti, at £1200.
It was never actually £650 though. Even now with mining dead, used/new cards are about that.
 
Every wafer NVIDIA makes can either go for $$$ in the datacentre/professional market, or $$ in the consumer market. If AMD had products that actually competed in the professional space maybe there wouldn't be so much incentive to focus on increasing margins in the consumer market to bring them in line with what NVIDIA can make per wafer elsewhere. And if AMD really wanted to start a price war they could have used all of those $$$ saving techniques boasted about to actually lower prices? As it is, the competition offers roughly the same basic raster performance, for slightly cheaper, as ever, but without the full stack of software and hardware support NVIDIA offers, so nothing will change.
That is again beside the point. The point is that Nvidia increased prices at an unreasonable rate, and now that there are rumours of 4080 price cuts, we're supposed to applaud them for some reason.
Thanks, but no thanks.

Pointing fingers at AMD serves no purpose in this argument because 1. the argument has nothing to do with AMD, and 2. if you really want to include AMD, then how about the 7900 XTX starting at the same MSRP as the 6900 XT did? Where's the price hike here? I don't see it. Oh, and 3. what's this full software stack bullshit? The only thing AMD doesn't have an equivalent for is CUDA. If you absolutely can't live without it, fair enough. Other than that, AMD has everything that Nvidia does. Let's not even bring Nvidia's 1990s-style control panel, which doesn't let you tune your card, into the conversation. ;)

Nvidia's superiority is based on smoke and mirrors and common misbeliefs.
 
That is again beside the point. The point is that Nvidia increased prices at an unreasonable rate, and now that there are rumours of 4080 price cuts, we're supposed to applaud them for some reason.
Thanks, but no thanks.

Pointing fingers at AMD serves no purpose in this argument because 1. the argument has nothing to do with AMD, and 2. if you really want to include AMD, then how about the 7900 XTX starting at the same MSRP as the 6900 XT did? Where's the price hike here? I don't see it. Oh, and 3. what's this full software stack bullshit? The only thing AMD doesn't have an equivalent for is CUDA. If you absolutely can't live without it, fair enough. Other than that, AMD has everything that Nvidia does. Let's not even bring Nvidia's 1990s-style control panel, which doesn't let you tune your card, into the conversation. ;)
Talking about the only competition in a discussion about cost will never be beside the point. Competition sets cost, simple.

If AMD had something that was as good, for cheaper, everyone would use it, but they don't.
 
The 3080 at £650 was faster than the fastest gaming-tier Turing card, the 2080 Ti, at £1200.

The 4080 at £1200 is faster than the fastest Ampere card, the 3090 Ti, at £1600?

I suppose for some people that is reasonable. However, regardless of how people want to justify the price point, the sales and availability of the 4080 clearly show that justification is not the general consensus. And if sales are not good (as many outlets are suggesting), then the price point is wrong. It's that simple.

I was reading an article on videocardz about the difficulty scalpers are having selling RTX 4080 cards, so they had to start selling them at MSRP.

https://videocardz.com/newz/scalper...80-cards-now-graciously-offering-them-at-msrp
 
Every wafer NVIDIA makes can either go for $$$ in the datacentre/professional market, or $$ in the consumer market. If AMD had products that actually competed in the professional space, maybe there wouldn't be so much incentive to focus on increasing margins in the consumer market to bring them in line with what NVIDIA can make per wafer elsewhere. And if AMD really wanted to start a price war, they could have used all of those $$$-saving techniques they boasted about to actually lower prices. As it is, the competition offers roughly the same basic raster performance, for slightly cheaper, as ever, but without the full stack of software and hardware support NVIDIA offers, so nothing will change.


Yeah, that's probably why they're going to reduce the price. NVIDIA responds to consumer demand, just like with the 4080 12 GB being pulled.


It was never actually £650 though. Even now with mining dead, used/new cards are about that.
Way to go there, your shares will thank you for the childish and factually blind bias.
AMD and Intel both have pro compute GPUs.
 
I'll make it simple for you: are you a dumbass who is incapable of performing the basic operation of "insert plug A into receptacle B and ensure that clip C is engaged"? If you are, then yes you should avoid the ATX12VHPWR connector... as well as literally every other PC power connector ever invented.
Last I checked, improperly securing any previous PC connector hasn't resulted in catastrophic failure of the connection. Not in all my many, many years on forums have I heard of such a thing. It can actually be a valid combination of user error and bad engineering. Even Apple got rightly panned for implying the user was holding the phone wrong, and subsequently offered mitigating solutions (a bumper case) before properly engineering the phone antenna in the next generation. I honestly don't know how overblown the melting connector issue is, but it certainly suggests a design flaw where a potentially incomplete connection or bent cable concentrates enough current through too little contact area to melt the connector before anything else shuts the system down. It sure seems like something could be designed better; even a crash to desktop or a no-boot would be preferable, and THAT has been the norm for every previous connector that I can recall. "No boot? Random crashes? Check your connections." has been sage advice for decades.
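For what it's worth, the failure mode the melting reports describe is plain resistive heating, and the arithmetic is easy to sketch. The numbers below are assumptions for illustration (600 W board power, 12 V rail, six +12 V pins, made-up contact resistances), not spec-sheet values:

```python
# Rough I^2 * R estimate of why a badly seated 12VHPWR plug can overheat.
# All figures are illustrative assumptions, not taken from any spec sheet.
POWER_W = 600   # assumed worst-case board power
VOLTS = 12      # supply rail voltage
total_current = POWER_W / VOLTS  # 50 A through the whole connector

for pins_in_contact in (6, 4):          # fully seated vs. partially mated
    amps_per_pin = total_current / pins_in_contact
    for r_contact in (0.005, 0.050):    # good vs. loose contact, in ohms
        heat_w = amps_per_pin**2 * r_contact
        print(f"{pins_in_contact} pins at {r_contact*1000:.0f} mOhm: "
              f"{amps_per_pin:.1f} A/pin -> {heat_w:.1f} W per contact")

# A fully seated pin at a few milliohms dissipates well under a watt, but a
# loose contact carrying extra current can hit several watts in one tiny
# metal contact -- plenty to soften the connector housing over time.
```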
 
Way to go there, your shares will thank you for the childish and factually blind bias.
AMD and Intel both have pro compute GPUs.
% marketshare? :)

I wonder if it's similar to consumer marketshare ;)
 
% marketshare? :)

I wonder if it's similar to consumer marketshare ;)
Too many like you in IT departments, I'm sure.

And that goalpost, eh? You just can't see it; the f£#@&r runs off, always moving.

You said "Every wafer NVIDIA makes can either go for $$$ in the datacentre/professional market, or $$ in the consumer market. If AMD had products that actually competed in the professional space maybe there wouldn't be so much incentive to focus on increasing margins in the consumer market to bring them in line with what NVIDIA can make per wafer elsewhere. And if AMD really wanted to start a price war they could have used all of those $$$ saving techniques boasted about to actually lower prices? As it is, the competition offers roughly the same basic raster performance, for slightly cheaper, as ever, but without the full stack of software and hardware support NVIDIA offers, so nothing will change."


Or in short, you strawmanned AMD.

Bbbb-but AMD.

Own your shit.
 
I'll make it simple for you: are you a dumbass who is incapable of performing the basic operation of "insert plug A into receptacle B and ensure that clip C is engaged"?
Yes, I am. One day I will figure out how to plug the cube into the square hole properly. Until then, I'll stick to the cylinder in the round hole. Thank you.
 
Talking about the only competition in a discussion about cost will never be beside the point. Competition sets cost, simple.

If AMD had something that was as good, for cheaper, everyone would use it, but they don't.
You know very well that the product being good is only one of the many factors that determine competition, and that marketing plays a much bigger role, don't you?

As an owner of both AMD and Nvidia GPUs, I know for a fact that both are equally good if the price is right.
 
? My logic is performance that used to cost $1600 now costs $1200, and will probably cost $999 pretty soon, while drawing 200 W less power.
For the past couple of GPU generations, every single xx80 card has outperformed the last generation's 80 Ti/90/Titan-equivalent card while staying at the normal MSRP level between $599 and $699 ($799 MSRP for the 2080, but it then had a reduction to $699, I think).

The 3080 had an MSRP of $700 (on paper) and surpassed the $1200 2080 Ti by around 30% in relative performance, and even the $2500 TITAN RTX by 14%, though it came with an increase in power consumption.

Now, the 2080 was disappointing in terms of the performance uplift over the last-generation 1080 Ti and Titan XP, which were both roughly equivalent; it offered only around a 7-10% uplift and reduced power consumption by only around 10-15%, but it still had a sort-of-sane MSRP. ($800 was bad, but compared to the 70% price uplift that comes with the 4080, the 2080's original MSRP actually looks sane. Then, when the 2080 Super came out, it was faster than the 2080 and had a reduced MSRP of $699.)

The 1080 had an MSRP of $599 ($699 for the FE), outperformed the 980 Ti/Titan X (Maxwell) by 24-26%, and reduced power consumption by around 26-30%.

There was no reason for Nvidia to raise the MSRP by 70% for the 4080. Your logic that it's fine for the 4080 to be priced at $1200 because it beats the 3090 Ti that was priced at $2000 does not work, because all the previous generations' 80-series cards outperformed their previous generation's 80 Ti/90/Titan-equivalent cards while reducing power consumption and staying around the same MSRP level (the 3080 is the only exception in terms of power consumption; efficiency stayed around the same level). Also, did you not see how fast the 3090 Ti's $2000 MSRP fell apart around the time that crypto crashed? Some 3090 Ti models were selling for around $1000-$1100 before stock ran out after the release of the 4090.

Even if the 4080 gets a price cut to $1000 it will still be a terrible price. It should drop to around $800-$850, but we all know that Nvidia would never drop it down to $700.
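To make the pattern above concrete, here's a small sketch of the gen-on-gen xx80 MSRP changes, using the (approximate) figures cited in this post:

```python
# Gen-on-gen xx80 MSRP changes, using the approximate USD MSRPs cited above.
xx80_cards = [
    ("GTX 1080", 599),
    ("RTX 2080", 799),   # later effectively $699 with the 2080 Super
    ("RTX 3080", 699),
    ("RTX 4080", 1199),
]

for (prev, prev_msrp), (card, msrp) in zip(xx80_cards, xx80_cards[1:]):
    change = msrp / prev_msrp - 1
    print(f"{prev} -> {card}: ${prev_msrp} -> ${msrp} ({change:+.0%})")

# Every previous step stayed within roughly +/-35%, while the 3080 -> 4080
# jump is about +72% -- the ~70% uplift the post is complaining about.
```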
 
Even if the 4080 gets a price cut to $1000 it will still be a terrible price. It should drop to around $800-$850, but we all know that Nvidia would never drop it down to $700.
Why would they price it $200 lower than its competition when they have no reason to (they own the marketshare), and $100 lower than the 7900XT?

To satisfy your sense of fairness?
 
Meh. Leave them at double the price of the equivalent Radeon card. Fanboys will still buy them. Someone could use a new leather jacket for winter. ;)
Ain't that the truth. Depending on region, you can pick up a 6700XT for the price of some RTX 3060s, yet the 6600 (non-XT) is a closer match for the 3060 at 40% lower cost.
Here in the UK we can't quite buy a 6700XT for the price of a 3060, but the vanilla 6700 10GB is around £20 cheaper than the 3060 12GB and it's a good 25% faster.

People will argue "bUt ThE rAyTrAcInG pErFoRmAnCe!!1" and I would like to point out that the 3060 can't raytrace for shit. In RTX-focused titles with an emphasis on raytracing, its performance is pitiful even with DLSS. The 3060 raytraces badly enough that you'd only use it once, out of morbid curiosity, to see just how bad the framerate hit is. In DXR-light titles, as developed for the RDNA2 silicon in the PS5 and XBSX, even the pathetic raytracing of the 3050 or 6500XT is adequate.
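For the UK example above, a quick perf-per-pound comparison. Note the £300 3060 price below is an assumed placeholder for illustration; only the £20 gap and the "25% faster" figure come from the post:

```python
# Perf-per-pound sketch for the 3060 12GB vs. 6700 10GB comparison above.
rtx_3060_price = 300                      # assumed example price, GBP
rx_6700_price = rtx_3060_price - 20       # "around £20 cheaper"

rtx_3060_perf = 1.00                      # normalise the 3060 to 1.0
rx_6700_perf = 1.25                       # "a good 25% faster"

cards = [("RTX 3060 12GB", rtx_3060_perf, rtx_3060_price),
         ("RX 6700 10GB", rx_6700_perf, rx_6700_price)]

for name, perf, price in cards:
    print(f"{name}: {perf / price * 100:.2f} perf per £100 spent")

# At these prices the 6700 10GB works out roughly a third better value
# per pound in raster, before raytracing even enters the picture.
```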
 
That "Get better, Nvidia" is so wrong in so many ways. Why would Nvidia get better, when the consumer spots the huge price increase and instead of saying "Good buy Nvidia, I am going with the competition", they just beg Nvidia to "Get better"? If I was Nvidia, seeing that "Get better, Nvidia", I would be putting a $1499 price tag in the RTX 5080.

Then the problem is AMD not providing acceptable competition. They've handwaved away ray tracing performance for so long it's comedy now. They COULD have competed. They chose not to do so. If Nvidia figures everyone'll just pay whatever, AMD figures everyone'll just take whatever.
 