Monday, December 5th 2022

NVIDIA GeForce RTX 4080 Could Get a Price Cut to Better Compete with RDNA3

The NVIDIA GeForce RTX 4080 graphics card has been out since mid-November and is a strong performer across many resolutions and titles. However, with NVIDIA setting its price tag at $1,200, it is an expensive product that represents a considerable price jump compared to older xx80-class generations. According to MyDrivers, NVIDIA could lower the price starting in mid-December to better suit the needs of consumers and keep the product competitive. With AMD's RDNA3-based graphics cards launching in the coming days, the $999 Radeon RX 7900 XTX is a direct competitor to the GeForce RTX 4080. If NVIDIA plans to cut the RTX 4080's massive MSRP, we expect it to land in the range of the Radeon RX 7900 XTX to create better market competition.
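For context on the size of that generational jump, here is a quick back-of-the-envelope comparison of xx80-class launch MSRPs. The dollar figures are the widely reported US launch prices, listed here for illustration only and to be treated as approximate:

```python
# Rough xx80-class launch MSRPs (USD) -- widely reported figures,
# used only to illustrate the size of each generational price jump.
msrps = {
    "GTX 980 (2014)": 549,
    "GTX 1080 (2016)": 599,
    "RTX 2080 (2018)": 699,
    "RTX 3080 (2020)": 699,
    "RTX 4080 (2022)": 1199,
}

prev = None
for card, price in msrps.items():
    if prev is not None:
        jump = (price - prev) / prev * 100  # percent change vs previous xx80
        print(f"{card}: ${price} ({jump:+.0f}% vs previous xx80)")
    else:
        print(f"{card}: ${price}")
    prev = price
```

On these figures, no previous xx80-to-xx80 step exceeded about +17%, while the RTX 4080's step is roughly +72% — which is what makes the rumored cut plausible.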

Of course, this is only wishful thinking and a rumor that MyDrivers has reported, so we have to wait until the middle of this month to find out if NVIDIA announces the alleged price cut.
Source: MyDrivers (Chinese)

118 Comments on NVIDIA GeForce RTX 4080 Could Get a Price Cut to Better Compete with RDNA3

#76
Chrispy_
HisDivineOrderThen the problem is AMD not providing acceptable competition. They've handwaved away ray tracing performance for so long it's comedy now. They COULD have competed. They chose not to do so. If Nvidia figures everyone'll just pay whatever, AMD figures everyone'll just take whatever.
Games are developed for consoles first, sometimes exclusively. Raytracing isn't cost-effective for console hardware, so the console hardware doesn't focus on it.

Even now, four years after my first RTX card purchase, I've yet to play a single game with ray tracing and go "ooh wow, that looks so much better". Perhaps if I had a 4090 and were running at only 60 Hz I'd turn it on for added visual flair, but most of us mere mortals don't have an overwhelming surplus of GPU performance, and I, for one, would rather hit good framerates at native resolution without the latency and blur of DLSS.
Posted on Reply
#77
dgianstefani
TPU Proofreader
Ayhamb99Because the 7900 XTX is most probably going to be competing with the 4090 in terms of performance, say what you want about what AMD quoted, the specs sheets of the 7900 XTX and the performance increases that were talked about (Still going to wait until benchmarks come out instead of just believing what was said at the announcement) make it look like it is going to be directly competing more with the 4090 rather than the 4080. When the 4080 is so overpriced that it makes the 4090 at $1600 look like actual good value, you know that it desperately needs a massive cut in MSRP.

"To satisfy your sense of fairness" Sheesh dude, are you an Nvidia shareholder in disguise? The 4080 has been out for less than a month and there are already discussions of price cuts; that should tell you how badly this GPU launch went, and how it should have been priced more sanely and fairly instead of just trying to cash in on the morons who never listened to anyone and bought GPUs at the heavily inflated scalper prices last year.
So AMD compared it to the 4080 because it performs like a 4090.

Seems like smart marketing.

As you said, waiting for release is probably a good idea over indulging in wishful thinking.
Posted on Reply
#78
AusWolf
HisDivineOrderThen the problem is AMD not providing acceptable competition. They've handwaved away ray tracing performance for so long it's comedy now. They COULD have competed. They chose not to do so. If Nvidia figures everyone'll just pay whatever, AMD figures everyone'll just take whatever.
Because there is no gaming without ray tracing, obviously... (khm...) and AMD being slower at it means it's totally and utterly unusable with every AMD card at every resolution...

Sure, competition is good, but when you can buy a 6700 XT for the price of a 3060, or a 6800 for the price of the 3070, the ray tracing argument goes out of the window, don't you think?
Posted on Reply
#79
hsew
Hard to believe when looking at current Ampere prices…
Posted on Reply
#80
Minus Infinity
I voted no because Ngreedia don't deserve my money. Even at $799 I am going AMD this gen. They need the crap kicked out of them like Intel did, for treating us like scum for so many years. Alas, too many muppets will still reward Huang, and they won't give a toss about a little market share loss; desktop GPUs are not where the money is anyway.
Posted on Reply
#81
GhostRyder
Nordic
Although I think the price should match the 580 adjusted for inflation, anything higher than $899 is simply unreasonable. $649 would be an ideal price point.
I like seeing a chart like this because it really shows how bad this jump is. But it also misses one other thing: the fact that the X80/XX80 series was moved down the product stack at a point. Originally the X80 series was the top-of-the-line card from Nvidia (save for the dual-GPU cards that were the X90 series). But starting with the 700 series it slowly moved down the list with cards like the GTX 780 Ti and GTX Titan series. Now the XX80 series is more where the X70 series used to be, and we have new names/nomenclatures for the top end ("Ti" variants and XX90-series single-GPU cards). To me that's what makes this insulting more than anything, as this card is the smaller GPU that has made a huge jump beyond its previous generation. It does not matter if it's better than the previous generation's top-end card; that's par for the course at this point (heck, the X70 series back in the day was at least very similar or higher in performance compared to the previous top end).

Nvidia does not need to get better; people need to buy the competitors' cards. That will actually send a message when sales go down and they are forced to cut prices. Now, we don't have the 7900 series performance numbers yet, and it could suck, which would mean we're in trouble this round; but if it's very good and competitive, everyone needs to vote with their wallet to help "fix" Nvidia.
Posted on Reply
#82
Nordic
GhostRyderI like seeing a chart like this because it really shows how bad this jump is. But it also misses one other thing: the fact that the X80/XX80 series was moved down the product stack at a point. Originally the X80 series was the top-of-the-line card from Nvidia (save for the dual-GPU cards that were the X90 series). But starting with the 700 series it slowly moved down the list with cards like the GTX 780 Ti and GTX Titan series. Now the XX80 series is more where the X70 series used to be, and we have new names/nomenclatures for the top end ("Ti" variants and XX90-series single-GPU cards). To me that's what makes this insulting more than anything, as this card is the smaller GPU that has made a huge jump beyond its previous generation. It does not matter if it's better than the previous generation's top-end card; that's par for the course at this point (heck, the X70 series back in the day was at least very similar or higher in performance compared to the previous top end).

Nvidia does not need to get better; people need to buy the competitors' cards. That will actually send a message when sales go down and they are forced to cut prices. Now, we don't have the 7900 series performance numbers yet, and it could suck, which would mean we're in trouble this round; but if it's very good and competitive, everyone needs to vote with their wallet to help "fix" Nvidia.
I am aware. The x80 card is now equivalent to the old x70 cards. Nvidia won't get better until people stop buying their products at these insane prices.
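The inflation adjustment mentioned above can be sketched roughly as follows. The GTX 580's $499 launch MSRP (November 2010) is widely reported; the ~34% cumulative US CPI figure for 2010-2022 is an approximation assumed here for illustration:

```python
# Hypothetical inflation adjustment for the GTX 580's launch price.
# $499 is the widely reported Nov 2010 MSRP; ~34% cumulative US CPI
# inflation 2010 -> 2022 is an assumed approximation.
GTX_580_MSRP = 499
CUMULATIVE_INFLATION = 0.34

adjusted = GTX_580_MSRP * (1 + CUMULATIVE_INFLATION)
print(f"GTX 580 MSRP in 2022 dollars: ~${adjusted:.0f}")  # ~$669
```

That lands in the same neighborhood as the $649 "ideal price" quoted above, and far below the 4080's $1,199.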
Posted on Reply
#83
AusWolf
Minus InfinityI voted no because Ngreedia don't deserve my money. Even at $799 I am going AMD this gen. They need the crap kicked out of them like Intel did, for treating us like scum for so many years. Alas, too many muppets will still reward Huang, and they won't give a toss about a little market share loss; desktop GPUs are not where the money is anyway.
Still, at least Intel offered the same 4-core i7s for relatively the same price all those years. They offered them as replacements for people with older systems, not as the holy grail that Nvidia wants you to think its products are nowadays. Nvidia literally threw their price/performance ratio out of the window with the 40-series. They don't just want people to pay for the same performance and service. They want us to pay more, while only offering the pathetic excuse that "it's faster than last gen". Guess what, Nvidia: everything has been faster than last gen ever since the computer was invented. That's not an excuse to increase prices so drastically over such a short time.

I think what people need to do is not to buy Nvidia, or to buy AMD, but to stop for a second and think whether they actually need an upgrade, and if they do, choose the product that fits their needs and budget without giving in to hype or marketing. Refusing to be mindless consumers is the best we could ever do. I know it's too much to ask in a society where throwing money at the first shiny thing that comes around is the norm, but my opinion still holds.
Posted on Reply
#84
Minus Infinity
AusWolfStill, at least Intel offered the same 4-core i7s for relatively the same price all those years. They offered them as replacements for people with older systems, not as the holy grail that Nvidia wants you to think its products are nowadays. Nvidia literally threw their price/performance ratio out of the window with the 40-series. They don't just want people to pay for the same performance and service. They want us to pay more, while only offering the pathetic excuse that "it's faster than last gen". Guess what, Nvidia: everything has been faster than last gen ever since the computer was invented. That's not an excuse to increase prices so drastically over such a short time.

I think what people need to do is not to buy Nvidia, or to buy AMD, but to stop for a second and think whether they actually need an upgrade, and if they do, choose the product that fits their needs and budget without giving in to hype or marketing. Refusing to be mindless consumers is the best we could ever do. I know it's too much to ask in a society where throwing money at the first shiny thing that comes around is the norm, but my opinion still holds.
Well, I usually only upgrade a PC every 6-8 years, so I skip at least 2 generations of GPU. I'm going to upgrade this time from a 1080 Ti, as I finally want to game in 4K. A 7900 XT(X) will last me at least until RDNA6. Rasterisation is already good enough. Let's see what happens with RT in 2 generations.
Posted on Reply
#85
AusWolf
Minus InfinityWell, I usually only upgrade a PC every 6-8 years, so I skip at least 2 generations of GPU. I'm going to upgrade this time from a 1080 Ti, as I finally want to game in 4K. A 7900 XT(X) will last me at least until RDNA6. Rasterisation is already good enough. Let's see what happens with RT in 2 generations.
That's the best thing to do nowadays, imo. :)

I tend to buy lots of hardware just out of curiosity, but I sell or repurpose old stuff as secondary systems. I'll have to stop doing that now, as hardware is getting too expensive just to play around with, and I've already got parts for several gaming-grade PCs lying around. I'm happy with my 6750 XT, and I've got a 2070 that will be perfect to transform my HTPC into a mini gaming rig. These will serve me well for years to come.
Posted on Reply
#86
whereismymind
dgianstefaniIt's faster than a 3090ti so at 1000 or 1100 it's reasonable.
And the 1070 was faster than a 980 Ti, so it should've cost more.
Posted on Reply
#87
64K
There's an article on videocardz saying that Nvidia has lowered the price of the 4080 and 4090 by 5% in Europe, but I don't think that's going to be nearly enough. Even with the 5% reduction, prices are still way too high.
Posted on Reply
#88
pavle
By the looks of the situation, Radeon cards aren't going to be "cheap" as in at their MSRP, so no problem. By the way, today I saw a report on Geekbench (Vulkan) scores of the RX 7900 XTX vs the RTX 4080, and while it's 15% faster there, it totally falls on its face in OpenCL; and to think AMD was one of the originators of OpenCL, what a shame. Where is GCN when you need it? :D

Edit: correction of the test name...
Posted on Reply
#89
AusWolf
pavleBy the looks of the situation, Radeon cards aren't going to be "cheap" as in at their MSRP, so no problem. By the way today I saw report on 3DMurk scores of RX 7900 XTX vs RTX 4080 and while it's faster 15%, it totally falls on its face in OpenCL and to think AMD was one of the originators of OpenCL, what a shame. Where is GCN when you need it? :D
The real question is: do you really need it? (I mean, OpenCL)
Posted on Reply
#90
pavle
Probably not, but hey, if they tout all kinds of API support, at least they should beat the competition at all of them for consistency, you know. :)
Posted on Reply
#91
TheoneandonlyMrK
pavleProbably not, but hey, if they tout all kinds of API support, at least they should beat the competition at all of them for consistency, you know. :)
That's unrealistic. No one uses 64-bit compute; both companies castrated it, because old, unused tech is irrelevant to gaming, the main use case of the modern GPU.

While OpenCL is still used, it's not used much in games.
Posted on Reply
#93
Assimilator
Darmok N JaladLast I checked, improperly securing any previous PC connector hasn’t resulted in catastrophic failure of the connection. Not in all my many many years on forums have I heard of such a thing. It can actually be a valid combination of user error and bad engineering. Even Apple got rightly panned for implying the user was holding a phone wrong, and subsequently offered mitigating solutions (bumper case) before properly engineering a phone antenna in the next generation. I honestly don’t know how overblown the melting connector issue is, but it certainly suggests there is a design flaw where a potentially incomplete connection or bent cable causes amperage so high it melts the connector before something else shuts down the system. It sure seems like something could be designed better, as even a crash to desktop or no-boot would be preferred, and THAT has been the norm for every previous connection that I can recall. “No boot? Random crashes? Check your connections.” has been sage advice for decades.
No previous connector has been transferring 600W in such a small envelope, though. And therein lies the problem: not in the design of the connector, but in the fact that PC users are mostly ignorant of how electricity works in their system. Nowadays there are extremely high currents and voltages present, yet most treat PC parts as Lego bricks that can just be slapdashed together. Ask any competent qualified electrician how well that works and whether they'd get away with it in their job, and you'll probably get a look of horror - yet y'all are sitting here trying to convince me that it's not Joe Average's fault when they slapdash the power connections in their own PC? Sorry, no.

I can guarantee you that there are millions of people running their PCs with improperly seated motherboard and/or PCIe power connectors who have been saved from electrical fires simply due to the lower current and/or voltage of these connector types. Every single one of those is a fire waiting to happen. Does that mean we should declare every and all of those connector designs "flawed" too? No, it just means that those users are lucky idiots.

Now of course you're going to say "but all connectors should be idiot-proof" and you'll be wrong, because of one of Murphy's laws: "build something idiot-proof, and they will build a better idiot". There's no way to build something that someone, somewhere, somehow, won't f**k up in the most unimaginably stupid way possible - this is why cans have warnings on them about sharp edges.

So what do we do? Do we give up on technology and progress and move to Alaska and live with the Inuit? No, we build better connectors and accept that we're going to have more problems with idiots.

A PC is an electrical device; treat it as such. Treat it with the respect and understanding that you'd treat anything potentially capable of killing you. Take responsibility and be careful and diligent. It really isn't that difficult.

And sod off with the stupid non-comparison to iPhones. An aerial being blocked by a user's hand is nowhere near as problematic as a fire or electrical hazard.
Posted on Reply
#94
TheoneandonlyMrK
AssimilatorNo previous connector has been transferring 600W in such a small envelope, though. And therein lies the problem: not in the design of the connector, but in the fact that PC users are mostly ignorant of how electricity works in their system. Nowadays there are extremely high currents and voltages present, yet most treat PC parts as Lego bricks that can just be slapdashed together. Ask any competent qualified electrician how well that works and whether they'd get away with it in their job, and you'll probably get a look of horror - yet y'all are sitting here trying to convince me that it's not Joe Average's fault when they slapdash the power connections in their own PC? Sorry, no.

I can guarantee you that there are millions of people running their PCs with improperly seated motherboard and/or PCIe power connectors who have been saved from electrical fires simply due to the lower current and/or voltage of these connector types. Every single one of those is a fire waiting to happen. Does that mean we should declare every and all of those connector designs "flawed" too? No, it just means that those users are lucky idiots.

Now of course you're going to say "but all connectors should be idiot-proof" and you'll be wrong, because of one of Murphy's laws: "build something idiot-proof, and they will build a better idiot". There's no way to build something that someone, somewhere, somehow, won't f**k up in the most unimaginably stupid way possible - this is why cans have warnings on them about sharp edges.

So what do we do? Do we give up on technology and progress and move to Alaska and live with the Inuit? No, we build better connectors and accept that we're going to have more problems with idiots.

A PC is an electrical device; treat it as such. Treat it with the respect and understanding that you'd treat anything potentially capable of killing you. Take responsibility and be careful and diligent. It really isn't that difficult.

And sod off with the stupid non-comparison to iPhones. An aerial being blocked by a user's hand is nowhere near as problematic as a fire or electrical hazard.
The PCI group disagrees with your stance, and with Nvidia.
A connector should be designed such that failure doesn't equal fire.

And as others have said, this hasn't been the case with older connection standards; it's a poor design, and defending it is a meritless task, but you do you.

Your extreme Alaskan stance has no merit; we have an answer: multiple 8-pins, OR make a better 12-pin.
Posted on Reply
#95
AusWolf
TheoneandonlyMrKThe PCI group disagrees with your stance, and with Nvidia.
A connector should be designed such that failure doesn't equal fire.

And as others have said, this hasn't been the case with older connection standards; it's a poor design, and defending it is a meritless task, but you do you.

Your extreme Alaskan stance has no merit; we have an answer: multiple 8-pins, OR make a better 12-pin.
There is always a failure rate; the questions are how big it is, how easy it is to reproduce, and how much media attention it gets.
AssimilatorA PC is an electrical device; treat it as such. Treat it with the respect and understanding that you'd treat anything potentially capable of killing you. Take responsibility and be careful and diligent. It really isn't that difficult.
While I don't think the 12-pin connector is a sound design, this point of yours is very well said! I'd also add: treat your devices with respect, care for them the way they care for you.
Posted on Reply
#96
Assimilator
TheoneandonlyMrKThe PCI group disagrees with your stance, and with Nvidia.
No they don't. The PCI-SIG's statement was nothing more than a CYA exercise to distance themselves from the lawsuit filed against NVIDIA.
TheoneandonlyMrKA connector should be designed such that failure doesn't equal fire.
You should tell that to the laws of physics.
TheoneandonlyMrKAnd as others have said this hasn't been the case with older connection standards
Did you even read my post? It hasn't happened because those connectors simply haven't been transferring the current/voltage the ATX12VHPWR connector does. That higher current/voltage does not make that connector unsafe or a flawed design in any way, shape or form.
TheoneandonlyMrKwe have an answer: multiple 8-pins, OR make a better 12-pin.
To provide the same amount of power as a 16-wire/pin ATX12VHPWR connector requires four 8-pin PCIe power connectors, which is FOUR TIMES the number of physical connectors and receptacles, and DOUBLE the number of wires and pins and pads. (Arguably the ATX12VHPWR sense wires/pins/pads could be omitted from this comparison, as the voltages they carry are extremely low, which makes the PCIe connector solution look even worse.) Not only is that more expensive, not only does it take up more space, it's arguably less safe because it has more wires to route and more contacts to mate, and thus more potential points of failure (especially since it means you have to ensure that all four PCIe power connectors are properly seated).
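A quick sanity check on those multiples, using the commonly cited ratings of 150 W per 8-pin PCIe connector and up to 600 W for 12VHPWR (the pin counts below follow the respective connector definitions):

```python
# Commonly cited figures: an 8-pin PCIe connector delivers up to 150 W
# over 8 pins; 12VHPWR delivers up to 600 W over 12 power pins plus
# 4 sideband/sense pins (16 total).
PCIE_8PIN_WATTS = 150
PCIE_8PIN_PINS = 8
HPWR_WATTS = 600
HPWR_PINS = 16  # 12 power + 4 sense

connectors_needed = HPWR_WATTS // PCIE_8PIN_WATTS    # 600 / 150 = 4
total_pcie_pins = connectors_needed * PCIE_8PIN_PINS  # 4 * 8 = 32

print(f"8-pin connectors to match 600 W: {connectors_needed}")               # 4
print(f"Total PCIe pins vs 12VHPWR pins: {total_pcie_pins} vs {HPWR_PINS}")  # 32 vs 16
```

That is, four connectors versus one, and 32 pins versus 16, which matches the "four times the connectors, double the pins" comparison above.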

8-pin PCIe power connectors were a good solution when you needed one or maybe two of them. They are not a good solution when you need three or more of them. Until a solution to the end of Moore's Law is found, GPU power consumption is going to get worse before it gets better, and that is ample justification for a new single connector that is able to service that power consumption, as opposed to just adding more PCIe 8-pins until you run out of space on the edge of the board (only being partially facetious here).

As for "making a better 12-pin", do you really think that the PCI-SIG - including titans like Intel - would have ratified the current ATX12VHPWR connector for inclusion into multiple standards if they had any concerns at all about its suitability?
AusWolfThere is always a failure rate - the questions are, how big, how easy it is to reproduce, and how much media attention it gets.
Exactly. The only reason this one got so much attention is because of the hate towards NVIDIA for the 4000-series pricing. And while it's fine to be upset at that pricing, it's not fine to blanket blame and smear the company just because you don't like that pricing. That's immature and petty and doesn't help anything or anyone.
Posted on Reply
#97
TheoneandonlyMrK
AssimilatorNo they don't. The PCI-SIG's statement was nothing more than a CYA exercise to distance themselves from the lawsuit filed against NVIDIA.


You should tell that to the laws of physics.


Did you even read my post? It hasn't happened because those connectors simply haven't been transferring the current/voltage the ATX12VHPWR connector does. That higher current/voltage does not make that connector unsafe or a flawed design in any way, shape or form.


To provide the same amount of power as a 16-wire/pin ATX12VHPWR connector requires four 8-pin PCIe power connectors, which is FOUR TIMES the number of physical connectors and receptacles, and DOUBLE the number of wires and pins and pads. (Arguably the ATX12VHPWR sense wires/pins/pads could be omitted from this comparison, as the voltages they carry are extremely low, which makes the PCIe connector solution look even worse.) Not only is that more expensive, not only does it take up more space, it's arguably less safe because it has more wires to route and more contacts to mate, and thus more potential points of failure (especially since it means you have to ensure that all four PCIe power connectors are properly seated).

8-pin PCIe power connectors were a good solution when you needed one or maybe two of them. They are not a good solution when you need three or more of them. Until a solution to the end of Moore's Law is found, GPU power consumption is going to get worse before it gets better, and that is ample justification for a new single connector that is able to service that power consumption, as opposed to just adding more PCIe 8-pins until you run out of space on the edge of the board (only being partially facetious here).

As for "making a better 12-pin", do you really think that the PCI-SIG - including titans like Intel - would have ratified the current ATX12VHPWR connector for inclusion into multiple standards if they had any concerns at all about its suitability?


Exactly. The only reason this one got so much attention is because of the hate towards NVIDIA for the 4000-series pricing. And while it's fine to be upset at that pricing, it's not fine to blanket blame and smear the company just because you don't like that pricing. That's immature and petty and doesn't help anything or anyone.
Clearly your inner apologist is of the opinion that Nvidia are right and wouldn't lie; they and GN said it's user error, so it is.

I disagree with most of your statement, but I don't care enough to argue all day with you.

You're sounding like a shill, so bye.
Posted on Reply
#98
Darmok N Jalad
AssimilatorNo previous connector has been transferring 600W in such a small envelope, though. And therein lies the problem: not in the design of the connector, but in the fact that PC users are mostly ignorant of how electricity works in their system. Nowadays there are extremely high currents and voltages present, yet most treat PC parts as Lego bricks that can just be slapdashed together. Ask any competent qualified electrician how well that works and whether they'd get away with it in their job, and you'll probably get a look of horror - yet y'all are sitting here trying to convince me that it's not Joe Average's fault when they slapdash the power connections in their own PC? Sorry, no.

I can guarantee you that there are millions of people running their PCs with improperly seated motherboard and/or PCIe power connectors who have been saved from electrical fires simply due to the lower current and/or voltage of these connector types. Every single one of those is a fire waiting to happen. Does that mean we should declare every and all of those connector designs "flawed" too? No, it just means that those users are lucky idiots.

Now of course you're going to say "but all connectors should be idiot-proof" and you'll be wrong, because of one of Murphy's laws: "build something idiot-proof, and they will build a better idiot". There's no way to build something that someone, somewhere, somehow, won't f**k up in the most unimaginably stupid way possible - this is why cans have warnings on them about sharp edges.

So what do we do? Do we give up on technology and progress and move to Alaska and live with the Inuit? No, we build better connectors and accept that we're going to have more problems with idiots.

A PC is an electrical device; treat it as such. Treat it with the respect and understanding that you'd treat anything potentially capable of killing you. Take responsibility and be careful and diligent. It really isn't that difficult.

And sod off with the stupid non-comparison to iPhones. An aerial being blocked by a user's hand is nowhere near as problematic as a fire or electrical hazard.
No, we don't give up on technology, but like many things, we learn from the failures and revisit the design. That's what I'm saying. I didn't solely blame the user or the engineer, but you better bet the engineers are looking at the design and going back to the drawing board. After all, they are taking back the failed cards to study. It's not an either/or situation. Of course we build better connectors, but history is full of designs that should have worked but didn't, and the technology that follows is better for it. We add things called safety features, and electrical systems are full of them--even the system that delivers power all the way from the power plant to your home. When's the last time you melted a wall outlet? You haven't, because your breaker box would trip when you exceeded the design of the connector and the wires behind the wall (if your breaker box was designed and installed correctly). Those breakers cost a few dollars, and countless numbers of people have been spared damage to life, limb, and personal property because of them. Part of engineering is saying "can we do it better?"

And what's with the attacks? Did I insult you personally in my first comment? It's just a computer connector.
Posted on Reply
#99
john_
HisDivineOrderThen the problem is AMD not providing acceptable competition. They've handwaved away ray tracing performance for so long it's comedy now. They COULD have competed. They chose not to do so. If Nvidia figures everyone'll just pay whatever, AMD figures everyone'll just take whatever.
Well, you could be right. But then again, at times people seem to ask for the shinier sticker. They ask AMD to provide "acceptable competition", only to buy cheaper Intel and Nvidia hardware, even when AMD's hardware is as good or even better for their needs and wallet.
Posted on Reply
#100
TheoneandonlyMrK
john_Well, you could be right. But then again, at times people seem to ask for the shinier sticker. They ask AMD to provide "acceptable competition", only to buy cheaper Intel and Nvidia hardware, even when AMD's hardware is as good or even better for their needs and wallet.
Plus, people idolise RT performance, but it is completely shit in Portal even on the best Nvidia has to offer; DLSS or something is needed to make it usable in Portal and Quake.

Few modern games use RT, thousands DON'T, and those that do either use it partially, run poorly, or need help to get acceptable frame rates.

For a large number of owners/gamers it's just not important at all.

Only leet hardware owners even care, i.e. a niche of our niche gives a shit.
Posted on Reply