Sunday, January 14th 2024

GeForce RTX 40 SUPER Custom Model Pricing Leaks Out

NVIDIA revealed baseline pricing for its GeForce RTX 40 SUPER graphics cards at the recently concluded CES 2024 trade show, but board partners largely stayed coy about figures for customized options (review embargoes will be lifted soon). ZOTAC broke the mold later in the week, updating its press material to confirm that non-overclocked models will adhere to Team Green's baseline MSRP. Premium charges for the overclocked SUPER Twin Edge OC, AMP HOLO, Trinity OC, and AMP Extreme AIRO cards remain a mystery, however. VideoCardz conducted some weekend detective work, digging around the Newegg and Best Buy online stores, although its focus shifted to other brands and manufacturers.

Workaround methods were used to extract card prices ahead of NVIDIA's staggered January schedule of reveals for customized versions of the GeForce RTX 4070 SUPER, RTX 4070 Ti SUPER, and RTX 4080 SUPER GPUs. The leaked results show that GIGABYTE and PNY have custom overclocked GeForce RTX 4070 SUPER models targeting the $599 base MSRP, while MSI has several options exceeding that level, carrying premiums ranging from $10 to $50. GIGABYTE's GAMING OC card also tops the table at $649.99. Jumping up to the GeForce RTX 4070 Ti SUPER tier, we see a GIGABYTE Gaming OC model sitting at $849.99 and an MSI VENTUS 3X OC going for $899.99. The sole custom GeForce RTX 4080 SUPER in the VideoCardz article appears to be an MSI VENTUS 3X OC, which carries a $100 premium over MSRP.

VideoCardz has kindly compiled their findings into list form:

NVIDIA GeForce RTX 4070 SUPER 12 GB MSRP: $599
  • NVIDIA Founders Edition: $599.99
  • GIGABYTE WindForce OC: $599.99
  • PNY VERTO OC: $599.99
  • ZOTAC Twin Edge: $599.99
  • MSI VENTUS 2X OC: $609.99
  • MSI VENTUS 2X OC WHITE: $619.99
  • MSI VENTUS 3X OC: $629.99
  • MSI Gaming X Slim: $649.99
  • GIGABYTE GAMING OC: $649.99
NVIDIA GeForce RTX 4070 Ti SUPER 16 GB MSRP: $799
  • GIGABYTE Gaming OC: $849.99
  • MSI VENTUS 3X OC: $899.99
NVIDIA GeForce RTX 4080 SUPER 16 GB MSRP: $999
  • NVIDIA Founders Edition: $999.99
  • MSI VENTUS 3X OC: $1099.99
Source: VideoCardz
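For quick comparison, the leaked figures above can be turned into effective premiums over NVIDIA's MSRP with a short, illustrative script (prices copied from the list; the selection of models shown here is just a sample):

```python
# Illustrative sketch: derive each leaked card's premium over NVIDIA's MSRP.
# Prices are copied from the list above; only a few models are shown.
MSRP = {
    "RTX 4070 SUPER": 599,
    "RTX 4070 Ti SUPER": 799,
    "RTX 4080 SUPER": 999,
}

leaked = [
    ("RTX 4070 SUPER", "MSI VENTUS 2X OC", 609.99),
    ("RTX 4070 SUPER", "GIGABYTE GAMING OC", 649.99),
    ("RTX 4070 Ti SUPER", "MSI VENTUS 3X OC", 899.99),
    ("RTX 4080 SUPER", "MSI VENTUS 3X OC", 1099.99),
]

for gpu, model, price in leaked:
    premium = price - MSRP[gpu]          # dollar markup over base MSRP
    pct = 100 * premium / MSRP[gpu]      # markup as a percentage
    print(f"{gpu} {model}: +${premium:,.2f} ({pct:.1f}% over MSRP)")
```

By this math, the MSI VENTUS 3X OC RTX 4080 SUPER works out to roughly a 10% premium over the Founders Edition MSRP, which is the figure commenters below take issue with.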

86 Comments on GeForce RTX 40 SUPER Custom Model Pricing Leaks Out

#51
Denver
las: This is exactly why. Making dGPUs just eats away at AMD's output at TSMC, and they earn more from CPUs and APUs.

GPU development and production is still important for AMD, but they don't need high-end offerings for what they do: iGPUs, APUs (including console APUs), etc. It's simply not a priority for them. They said this officially when they positioned the 7900 XTX as a 4080 counter, not a 4090 counter. They left the enthusiast market.

High-end dGPUs are a niche market for AMD and probably always will be. Name one high-end AMD offering that has sold well in recent years.
AMD has always sold mostly low to mid-range GPUs. Research and development is very costly, and high-end GPUs make little sense for AMD. This is why they want to go MCM, so they can scale their offerings much more easily without ramping up costs like crazy at the high end.


I bet AMD's goal with Radeon 8000 is simply to stay on 5 nm while moving CPUs (Zen 5) to 3 nm as fast as possible, once Apple is done with it. Cheap GPUs with good-enough raster performance are what is needed to drive AMD's GPU market share forward again, along with FSR and AFMF improvements.

AMD can't afford to go 3 nm too soon; it's too costly. Nvidia will be able to. Going from 4/5 nm to 3 nm will also mean price increases of Nvidia's own, probably around 50% more per wafer.

However, Nvidia rules the gaming GPU market while not even focusing on it. They have full focus on AI and enterprise, and this won't change for years. They even scaled back gaming GPU production to make AI/enterprise chips. I don't think we'll see a flood of 4000 SUPER cards at release because of this.


TSMC has increased production costs a lot over the last few years, plus inflation. It's not only Nvidia raising prices; look at AMD's prices today as well. They are generally not cheap, mostly because TSMC wants its cut. TSMC knows AMD relies 100% on TSMC. Remember how poorly Ryzen performed before TSMC? GloFo 12 nm was trash compared to even Intel 14 nm.

In a few years, Intel is probably back in the lead with 20A/18A and will be open for business. I don't think TSMC can retain their lead for much longer. Maybe AMD can use Intel for their chips then :laugh:

But yeah, process improvements, inflation, shipping, and higher development and production costs are what is driving up prices. This is true in all markets, really. Expect hardware to get more and more expensive, especially at the high end.

I predict the RTX 5090 will be $1,999, but I would not be surprised if it's $2,499. AMD has nothing to counter it, just like the 4090. AMD could barely counter the 3090/3090 Ti even though Nvidia used a cheap and mediocre process node, Samsung 8 nm, which is closer to TSMC 10 nm in reality, and Nvidia still won. Superior architecture is the reason.

AMD probably paid twice as much per 7nm wafer compared to Nvidia using Samsung 8nm, if not more. Nvidia did not need the best node to beat AMD.
You're right, the last one to compete for the top was the 6950 XT; it managed to tie or beat the 3090 and even the 3090 Ti, consumed less power, and cost a lot less, but the 3090/3090 Ti still sold many times more.

I think they will only return to compete for the top when there is an MCM solution in which a single chip (GCD) can be scaled from base to high-end, as happens in their CPU line. This would greatly reduce development costs.
Posted on Reply
#52
las
Daven: Before the Nvidia cult we had decent pricing.

280 $430
480 $500
580 $500
680 $500
780 $500
980 $550
1080 $600 - cult forms
2080 $700
3080 $800 (8960 CUDA version)
4080 $1200

The media didn’t do us any favors either by not dispelling the AMD bad driver myth that was perpetuated by non-AMD card owners trying to elevate Nvidia.
Yeah, the 7900 XTX was priced at $999 while having higher power draw and inferior features compared to the 4080. Like all AMD hardware, though, the price drops over time. Nvidia keeps its pricing up, like Apple, meaning second-hand prices are better. I sold my 3090 for $1,000 shortly before picking up a 4090 at launch.

The main reason prices went up are inflation, production costs, shipping costs and TSMC demanding a higher and higher cut.

If AMD wants to be the good guy, they can drop the 7900 XTX to $799 and the 7900 XT to $599; they would probably lose money by doing that, though. They don't care much about GPUs.

With the 4080 SUPER coming in at $1,000, the 7900 XTX should go sub-$800 ASAP. Even that might be too much as long as its features are clearly inferior.
#53
Assimilator
Daven: The media didn't do us any favors either by not dispelling the AMD bad driver myth that was perpetuated by non-AMD card owners trying to elevate Nvidia.
Such a myth that W1zz called it out in his review and there is a 20-page thread on these forums titled "How many of you Radeon 5700 owners have ditched your cards over the drivers"; similar threads can be found all over the internet.
Such a myth that Radeon drivers' idle and multi-monitor power consumption have been consistently broken on every new GPU release since Vega, or four generations. Read any of W1zz's launch day reviews of these cards.

Take your historical revisionism and shove it.
#54
las
Denver: You're right, the last one to compete for the top was the 6950XT, it managed to tie or beat the 3090 and even the 3090ti, it consumed less, it also cost a lot less, but the 3090/3090ti still sold many times more.

I think they will only return to compete for the top when there is an MCM solution that only one chip (GCD) can be scaled from base to high-end, as happens in the CPU line. This would greatly ease development costs.
I actually had a 6900 XT that, with OC, performed just like an OC'd 6950 XT. Remember, it's the exact same chip. It's not the first time AMD has refreshed an identical chip with slightly higher clocks.

www.techpowerup.com/review/asrock-radeon-rx-7900-xt-phantom-gaming-white/31.html
The 6900 XT is not really close to the 3090 Ti. The 6950 XT did not beat the 3090 Ti overall either, and mostly delivered 3090-level performance, if you look at overall performance.

www.techpowerup.com/review/amd-radeon-rx-6950-xt-reference-design/30.html

The 6950 XT performed about 5-6% better than the 6900 XT, stock for stock.

www.techpowerup.com/review/asrock-radeon-rx-7900-xt-phantom-gaming-white/37.html
However, the big problem with the 6800 and 6900 series was power spiking. Look at the 20 ms spikes; this is what destroys PSUs over time or makes systems reboot. AMD fixed this with the 7000 series.

The 3090 Ti, however, was crazily priced and used way too much power. One of the worst GPU purchases in recent memory, because the 4090 landed about six months later with almost twice the performance at much lower power usage and a $1,599 price tag instead of $1,999.
#55
Daven
Denver: You're right, the last one to compete for the top was the 6950XT, it managed to tie or beat the 3090 and even the 3090ti, it consumed less, it also cost a lot less, but the 3090/3090ti still sold many times more.

I think they will only return to compete for the top when there is an MCM solution that only one chip (GCD) can be scaled from base to high-end, as happens in the CPU line. This would greatly ease development costs.
Yeah and the 6950xt was released a long, long time…oh wait. That was a year and a half ago. Lol!

I think Las doesn't realize that AMD 'skipped' the high end occasionally (Sea Islands, Polaris, RDNA 1) but competed for the high end for the most part. AMD could 'skip' the high end again next gen, but market forces determine business, and rarely does someone leave a market segment. It does happen, though. Intel is facing that rare, tough choice currently.
Assimilator: Such a myth that W1zz called it out in his review and there is a 20-page thread on these forums titled "How many of you Radeon 5700 owners have ditched your cards over the drivers"; similar threads can be found all over the internet.
Such a myth that Radeon drivers' idle and multi-monitor power consumption have been consistently broken on every new GPU release since Vega, or four generations. Read any of W1zz's launch day reviews of these cards.

Take your historical revisionism and shove it.
You cannot win this argument, as a simple internet search confirms that no company is immune to driver problems and no company is ahead when it comes to good/bad drivers. Here is one such forum among many:

computers/comments/1761ayr

Drivers have bugs. It's always been that way. What some people are confusing are driver features, such as super sampling. Some companies implement features in a superior way. Nvidia definitely has better features. But that doesn't mean AMD drivers are bad.
#56
Denver
las: I actually had a 6900XT with OC that performed just like 6950XT OC'ed. Remember its the exact same chip. Not the first time AMD refreshes an identical chip with slightly higher clocks.

www.techpowerup.com/review/asrock-radeon-rx-7900-xt-phantom-gaming-white/31.html
6900XT is not close to 3090 Ti really. 6950XT did not beat 3090 Ti overall and mostly delivered 3090 performance, if you looked at overall performance that is.

www.techpowerup.com/review/amd-radeon-rx-6950-xt-reference-design/30.html

6950XT performed like 5-6% better than 6900XT stock for stock.

www.techpowerup.com/review/asrock-radeon-rx-7900-xt-phantom-gaming-white/37.html
However the big problem with 6800 and 6900 series was power spiking, look at 20ms spikes, this is what destroys PSUs over time or make the systems reboot. AMD fixed this with 7000 series.

However 3090 Ti was crazy peaked and used way too much power. One of the worst GPU purchases in recent memory, because 4090 landed like 6 months later with almost twice the performance at much much lower power usage with a 1599 price tag instead of 1999 dollars.



It's the same design, but the 6950 XT chip had better binning. A 4-6% difference, which some reviewers consider within a 5% margin of error, and depending on the game selection it could win or lose the battle for the top. But the main point is that the 3090 Ti cost $2,000 and still sold much more than the 6950 XT. Nvidia has mindshare equivalent to Apple's, to the point of shrugging off serious problems such as drivers killing GPUs, the infamous 3.5 GB of VRAM, and similar abominations.
#57
las
Denver: It's the same design, but the 6950XT chip had better binning. 4-6% difference, some reviewers consider it a 5% margin of error, and depending on the game selection it could win or lose the battle for the top. But the main point is that the 3090ti cost $2000 and still sold much more than the 6950XT, Nvidia has a "mind share" equivalent to Apple, to the point of being infected by serious problems such as drivers killing GPUs, the infamous 3.5GB of vram; and similar abominations.
We can all cherry-pick reviews, but 3090 Ti custom cards generally beat 6950 XT custom cards, and the power spiking on the 6950 XT was still a big problem that only got worse; some 6950 XTs spiked to 650-700 watts.

The 6950 XT was nothing but an overclocked 6900 XT, since the chip is 100% identical.


Nvidia did not have drivers killing any GPUs; those cards were defective from the beginning, and AMD cards died too, from Diablo 4 and other games. More Nvidia cards died because more people use Nvidia.

AMD gimped tons of GPUs with PCIe lanes as well.

AMD released tons of bad GPUs, just like Nvidia, especially in the lower-end segment -> www.techspot.com/review/2398-amd-radeon-6500-xt/
#58
Dr. Dro
Denver: Nvidia has a "mind share" equivalent to Apple, to the point of being infected by serious problems such as drivers killing GPUs, the infamous 3.5GB of vram; and similar abominations.
If the only dirt you've got is a decade-old GPU's "misfortune" of being relatively inefficiently designed, or a "killer driver" released even earlier, it's little wonder they've amassed mindshare equivalent to Apple's. Personally, I've been bitten by AMD bugs more times than I can count. I vehemently disagree with most of their recent business directives. I gave them a timeout. I still very much love AMD, but I just need some time off, like any heated relationship, I suppose.
#59
Assimilator
Daven: You cannot win this argument as a simple internet search conforms that no company is immune to driver problems and no company is ahead when it comes to good/bad drivers. Here is one such forum among many:

computers/comments/1761ayr

Drivers have bugs. Its always been that way. Now what some people are confusing are driver features such as super sampling. Some companies have better features implemented in a superior way. Nvidia definitely has better features. But that doesn’t mean AMD drivers are bad.
You were the one who made the claim that AMD drivers aren't bad, I provided irrefutable evidence that they are. Trying to change the subject to "oh look NVIDIA drivers are also bad" is irrelevant whataboutism that doesn't detract from the proven untruth of your claims.
Dr. DroIf the only dirt you've got is a decade old GPU's "misfortune" of being relatively inefficiently designed, or a "killer driver" released even prior to that, it's little wonder that they've amassed a mindshare equivalent to Apple. Personally I've been bitten by the AMD bugs more times than I could count. I vehemently disagree with most of their business directives of late. I gave them a timeout. I still very much love AMD, but I just need some time off - like any heated relationship I suppose.
The "3.5GB VRAM" thing is a guaranteed way to detect someone who's run out of actual points to criticize NVIDIA on, and knows it.
#60
Dragam1337
Assimilator: MSI are smoking their socks if they think that people are going to pay a 10% premium over the Founders Edition of the already-expensive 4080S.
You shouldn't be pointing the finger at MSI... it's Nvidia screwing over their AIB "partners".

But the founder edition is hardly available outside 'murica anyways, so for most of the world, the cheapest partner cards are the real price.
#61
Chrispy_
Lew Zealand: Oof, +$100 over MSRP for a Ventus? I paid +$10 for a Ventus 1660 Super, and it doesn't even have fan stop or a power limit increase, though it's a competent card with good thermals. The 4070 Ventus (also no power limit increase) is listed at MSRP, so +$100 is crazy.
That *is* crazy.

Ventus is the base model:
  • Single BIOS.
  • Basic VRM with the minimum acceptable design for stock speeds.
  • Cheapest possible cooler MSI can make, with the minimum acceptable heatpipe count and only the essential PCB components contacted.
  • No RGB LED.
  • The lowest-clocked model in MSI's range, usually within 1-2% of Nvidia's reference board base clocks.
  • Default power limits with no headroom.
  • Shorter warranties than their premium models (in territories where it's legal to offer shorter warranties).
...and now, to make things worse than just the stupid price hike, the Ventus now comes with the shitty 12VHPWR connector that's riddled with issues, so there's NO REASON to buy it over the FE.
#62
Dr. Dro
Chrispy_: That *is* crazy.

Ventus is the base model;
  • Single BIOS,
  • Basic VRM with the minimum acceptable VRM design for stock speeds.
  • Cheapest possible cooler they can make with minimum acceptable heatpipe count and only the essential PCB components contacted.
  • No RGBLED
  • The lowest-clocked model in MSI's range and usually within 1-2% of Nvidia's reference board base clocks.
  • Default power limits with no headroom.
  • Shorter warranties than their premium models (in territories where it's legal to offer shorter warranties).
...and now, to make things worse than just the stupid price hike, the Ventus now comes with the shitty 12VHPWR connector that's riddled with issues, so there's NO REASON to buy it over the FE.
The cards have been updated with 2x6 connectors, which have shorter sense pins. They are not hazardous. The reason to buy it over the FE is... that it's available and the FE isn't, unless you're in one of the privileged markets Nvidia bothers to sell them in.
#63
bug
Daven: I think Las doesn't realize that AMD 'skipped' the high end occasionally (Sea Island, Polaris, RDNA1) but also competed for the high end for the most part. AMD could 'skip' the high end again for next gen but market forces determine business and rarely does someone leave a market segment. But It does happen. Intel is facing that rare tough choice currently.
Just like sour grapes, when AMD falls behind, they claim they're not interested in building big chips or competing at the high end anymore. When they launch a new architecture, they promptly forget about that. Rinse and repeat.

I have nothing personal against AMD (quite the opposite). It's just that their GPUs have been trailing Nvidia for too long. And that's the lipstick their marketing puts on the pig.
#64
R0H1T
Bwaze: And that was before the new Jensen's law, where you get worse price/performance with new releases, not better.
Tbf to Nvidia, although only on a technicality, the "artificial" pent-up demand has come from people wanting to make money from thin air or cat videos :laugh:
Dr. Dro: ...like any heated relationship I suppose.
Yes if only this was a one off episode of love(?) island :shadedshu:
#65
Denver
las: We can all cherry pick reviews but 3090 Ti custom cards generally beat 6950XT custom cards and the power spiking on 6950XT was still a big problem and only got worse, some 6950XT spiked to 650-700 watts

6950XT was nothing but an overclocked 6900XT since chip is 100% identical


Nvidia did not have drivers killing any GPUs, they were defective from the beginning and AMD cards died too from Diablo 4 and other games. More Nvidia cards died because more people use Nvidia.

AMD gimped tons of GPUs with PCIe lanes as well.

AMD released tons of bad GPUs, just as Nvidia, especially in the lower end segment ->www.techspot.com/review/2398-amd-radeon-6500-xt/
To make such a statement, I assume you analyzed the driver code and the entire long list of reported cases of GPUs dying shortly after the driver release that Nvidia quietly pulled once the first cases appeared. This happened not once but twice, as I recall.

None of you can maintain any argumentative cohesion, because at that point you start scrambling to prove which brand is worse and drag the discussion off to irrelevant topics. I completely agree that AMD shouldn't have released such horrible low-end GPUs.

However, my point is that the problems Nvidia has get downplayed and ignored, not that AMD doesn't have or hasn't had problems.
Dr. Dro: If the only dirt you've got is a decade old GPU's "misfortune" of being relatively inefficiently designed, or a "killer driver" released even prior to that, it's little wonder that they've amassed a mindshare equivalent to Apple. Personally I've been bitten by the AMD bugs more times than I could count. I vehemently disagree with most of their business directives of late. I gave them a timeout. I still very much love AMD, but I just need some time off - like any heated relationship I suppose.
Mindshare is fascinating. It starts with offering products of some distinction and quality compared to the competition; this takes root in people, who recommend the brand to friends and family and speak well of it, the most effective marketing possible. After achieving this, at some point you may no longer offer any advantage; in fact, your products now have deficits and cost more, but you have gained a group of loyal followers who simply buy brand X or Y without even checking competing products.

In short, it's not about having the best products, it's about making people believe that you do. Something starts as rational and becomes emotional.

As a casual gamer, I've honestly never had serious problems with AMD drivers, but I'm not pointing fingers and calling anyone who says they have a fanboy; software will always have problems. Even though I've had serious problems with Nvidia in the past, I don't spend my time constantly saying their software is garbage.
#66
Chrispy_
Dr. Dro: The cards have been updated with 2x6 connectors that have shorter sense pins. They are not hazardous. The reason you have to buy it over FE is... that it's available and the FE isn't. Unless you're in the privileged markets Nvidia bothers to sell them at.
The cards may have been updated, but all the power supplies and cables manufactured in the last 3 years haven't magically been changed to the latest 2024 standard. Even if you buy a brand new PSU today, you won't necessarily know when your PSU was manufactured until you open the packaging and look for the date of manufacture near the serial number. Realistically most things I buy have sat on a warehouse shelf somewhere for 3-12 months, even if it was brand new stock to the retailer last week.

I don't trust it. Not because I am an expert in the field, but because people who are (like Der8auer, who works as a consultant for CableMod, actually designing cables they don't want to melt) have numerous criticisms of it. If the people whose job it is to make safe cables don't trust the connector, why should I? Unlike Nvidia, which has damage-control and PR reasons to cover up its mistake, CableMod and Der8auer have nothing to lose by being honest.
#67
Why_Me
Looking forward to the reviews. I think the 4070 Ti Super 16GB has a chance of being a real hit for those who game at 1440P.
#68
Assimilator
Denver: However, I'm arguing around the point that the problems Nvidia has, and these are downplayed and ignored, not that AMD doesn't have or had problems;
No you're not; you're bringing up problems that NVIDIA had in the past, and have solved. NVIDIA's problems aren't downplayed, they're forgotten because the company generally fixes them as soon as possible. Meanwhile AMD consistently makes the same dumb mistakes over and over and over again. That leads to a perception of quality from NVIDIA, and a perception of the opposite from AMD, and AMD does nothing to address that perception; they just increase their prices to match NVIDIA, then wonder why nobody wants to buy AMD.
Denver: And you now and almost always don't add anything to the discussion, you just come and talk bad about people and label one or the other a fanboy, as if that were your only existential reason.
Do you know when the GTX 970 was released? A decade ago. It sold incredibly well because the "3.5GB" issue that AMD fanboys love to talk about simply was not an issue for people who actually owned the card and used it for gaming. Is it a negative point? Yes. Is it a negative point that practically mattered to users? No. Did NVIDIA ever repeat that design? No. So bringing it up as if it somehow matters... just stop. Please, stop embarrassing yourselves.

In contrast, we have Vega, which released in 2017 with W1zz's review noting high power draw.
Then RDNA in 2019, with his review noting high multi-monitor/idle power draw and driver instability.
RDNA 2 review in 2020 noted that AMD had fixed everything except for media playback.
Then RDNA 3 in 2022 - back to high multi-monitor and media playback power consumption.

This isn't imaginary. This is a trend of AMD failing to get something incredibly simple and basic, correct. Consumers look at that and go "if this company can't get bloody power consumption right, what else can't they get right? What other crap is lurking that might bite me later down the line?" Then they look at NVIDIA's products, and say "shit, that's expensive, but I'd rather pay through my ass and not have to worry about the product causing me problems", and they buy NVIDIA.
#69
Lew Zealand
Bwaze: AMD isn't really committed to their graphics section. Sure, they're investing, keeping in close second place, but not really innovating, and not competing for bigger market share...
The 7700 XT through 7900 XTX are chiplet-based GPUs; that alone is a big innovation not seen in any other consumer GPUs. That's not to say AMD should be content with that and not get other aspects of their GPUs performing better, like low-demand power consumption, but ignoring what AMD has actually innovated on undermines your other criticisms.

With such a change to GPU hardware design it's not a surprise that some things have turned out worse than people hoped, like top end core frequencies/performance (which AMD touted and then didn't deliver) and low-demand power use, but hopefully for them it will be a case of design and iterate for the next generation.

Nvidia took no such chances with the 4000 series because they didn't need to; moving from Samsung's crap "8" to TSMC's 5 nm process meant they would clearly take the performance lead with a 600 mm² die as well as smaller dies. We'll see what Blackwell brings, but moving to TSMC 3 nm will likely require no significant hardware innovation, as the increased density from the node shrink will be sufficient to compete with AMD, unless AMD can make GPU chiplets 2.0 work significantly better than 1.0.
#70
Assimilator
Lew Zealand: We'll see what Blackwell brings but moving to TSMC 3nm will likely require no significant hardware innovation as the increased density from a node shrink will be sufficient to compete with AMD unless AMD can make GPU Chiplets 2.0 work significantly better than 1.0.
I'm willing to give AMD another generation, or even two, on the GPU chiplet R&D road, because NVIDIA (and Apple) are staring at 3 nm with no clear path beyond. If AMD can do GPU chiplets right, they will be in the same driving seat they were in with Zen vs Intel... the question is whether the GPU division can last long enough to get there.
#71
Denver
Assimilator: No you're not; you're bringing up problems that NVIDIA had in the past, and have solved. NVIDIA's problems aren't downplayed, they're forgotten because the company generally fixes them as soon as possible. Meanwhile AMD consistently makes the same dumb mistakes over and over and over again. That leads to a perception of quality from NVIDIA, and a perception of the opposite from AMD, and AMD does nothing to address that perception; they just increase their prices to match NVIDIA, then wonder why nobody wants to buy AMD.


Do you know when the GTX 970 was released? A decade ago. It sold incredibly well because the "3.5GB" issue that AMD fanboys love to talk about simply was not an issue for people who actually owned the card and used it for gaming. Is it a negative point? Yes. Is it a negative point that practically ever mattered to users? No. Did NVIDIA ever repeat that design? No. So bringing it up as if it somehow matters... just stop. Please, stop embarrassing yourselves.

In contrast, we have Vega which released in 2017 with Wizz's review noting high power draw.
Then RDNA in 2019, with his review noting high multi-monitor/idle power draw and driver instability.
RDNA 2 review in 2020 noted that AMD had fixed everything except for media playback.
Then RDNA 3 in 2022 - back to high multi-monitor and media playback power consumption.

This isn't imaginary. This is a trend of AMD failing to get something incredibly simple and basic, correct. Consumers look at that and go "if this company can't get bloody power consumption right, what else can't they get right? What other crap is lurking that might bite me later down the line?" Then they look at NVIDIA's products, and say "shit, that's expensive, but I'd rather pay through my ass and not have to worry about the product causing me problems", and they buy NVIDIA.
The critical issue, at least for me, lies in the lack of transparency around the launch of the GTX 970, not in performance. Consumers trusted the information provided by the company, expecting the advertised 4 GB of memory to be fully accessible and of uniform performance. The discovery that 0.5 GB of that memory was significantly slower than the rest was a genuine disappointment.

Lying or omitting information about the characteristics of any product is not a small problem, no wonder this happened: "NVIDIA settled in a 2015 class-action lawsuit against it, for misrepresenting the amount of memory on GeForce GTX 970 graphics cards. The company has agreed to pay every buyer of the card USD $30 (per card), and also cover the legal fees of the class, amounting to $1.3 million. The company, however, did not specify how much money it has set aside for the payout, and whether it will compensate only those buyers who constitute the class (i.e. buyers in the U.S., since that's as far as the court's jurisdiction can reach), or the thousands of GTX 970 buyers worldwide." NVIDIA Settles Class-Action Lawsuit Over GTX 970 Memory | TechPowerUp

You assume a lot of things without knowing. When I abandoned Nvidia in the Maxwell generation (980), the driver had a problem with a specific game, Mass Effect, which prevented me from playing. I waited a year for a fix that never came; to my surprise, googling out of pure curiosity, I discovered that to this day the problem has not been corrected.

I'm not like you, jumping to make excuses for Nvidia; I always criticize what deserves criticism, regardless of the flag. For me, Vega was a waste of money, just like all consumer HBM GPUs. They should have skipped it and focused all resources on advancing the launch of RDNA and preparing the architecture transition, leaving the software side in a better state for launch. Vega was only good as an iGPU.

If you don't want to have problems, it's better to buy a console; the chances of running into bugs that prevent you from playing are lower.
Posted on Reply
#72
MarsM4N
bug: Just like sour grapes, when AMD falls behind, they claim they're not interested in building big chips or competing at the high end anymore. When they launch a new architecture, they promptly forget about that. Rinse and repeat.

I have nothing personal against AMD (quite the opposite). It's just that their GPUs have been trailing Nvidia for too long. And that's the lipstick their marketing puts on the pig.
Tbh. I think AMD didn't exit the high end, they just sugar-coated the obvious. :D Nvidia is so far ahead with the 4090, there is no way to catch up. And I bet that isn't even the max Nvidia could go.
Assimilator: In contrast, we have Vega, which released in 2017 with Wizz's review noting high power draw.
Then RDNA in 2019, with his review noting high multi-monitor/idle power draw and driver instability.
RDNA 2 review in 2020 noted that AMD had fixed everything except for media playback.
Then RDNA 3 in 2022 - back to high multi-monitor and media playback power consumption.

This isn't imaginary. This is a trend of AMD failing to get something incredibly simple and basic, correct. Consumers look at that and go "if this company can't get bloody power consumption right, what else can't they get right? What other crap is lurking that might bite me later down the line?" Then they look at NVIDIA's products, and say "shit, that's expensive, but I'd rather pay through my ass and not have to worry about the product causing me problems", and they buy NVIDIA.
Tbh. both companies have produced enough turds to fill a dump. ;) Most of the "AMD has bad drivers" sentiment is nothing more than a sour taste from the Vega(?) generation, which had massive problems. Nowadays, looking at gaming support, AMD isn't trailing Nvidia; I see them rather leading, since AMD also dominates the console market. In fact, I see more Nvidia users than AMD users posting driver problems for newly released games. But there are problems, and they need to be called out, which sadly often doesn't happen with so-called tech reviewers. They fail to inform the "average joe" about issues, so I mainly blame them for why stuff doesn't get fixed.


It all comes down to what is more important to you:
AMD: best bang for buck, better multi monitor support, better Linux support, better GPU software, better day1 drivers, fine wine driver progress
Nvidia: way better power efficiency, better frame generation, way more (non gaming) software features, expensive, better resale value
Posted on Reply
#73
95Viper
Stay on topic.
Stop the AMD vs. Nvidia BS.
Stop calling others derogatory names.
Discuss the thread civilly.
Posted on Reply
#74
las
Bwaze: As I see it, Nvidia right now doesn't even need to sell gaming cards; AIB partners do. But with the crypto craze they could cash in, because miners needed gaming cards, any cards at any price, when they were desperate enough.

AI craze is different. Almost nobody needs gaming cards - except where there are limitations on selling AI accelerators, and even that is very limited.

So I'm a bit sorry for all AIB partners now. I bet we'll even see prices go much higher than MSRP - because all the cards will be produced in very small numbers, Nvidia has already said it's dedicating production to AI.

And no gamer will be buying the overpriced cards, and AI people don't need them.

And suddenly EVGAs decision in September 2022 will make perfect sense.
The 4090 sells like hotcakes really, which is why the price went up, so this does not affect all "gaming" cards, even though the 4090 is kind of a hybrid here.

Gamers can be thankful that the lower-tier GPUs are not interesting for AI, or you would see the same thing happen as during the mining craze. 90-95% of PC gamers buy in the sub-1000-dollar segment anyway.

EVGA stopped making GPUs because their cards became worse and worse over the years; their designs were generally lacking in their last years in the GPU market, with several issues across VRM, PCB and overall design. They were not really selling a lot of cards and could not afford to stay competitive, so they pulled the plug and focused on other areas. EVGA is not doing well right now; I would not be surprised if the company is sold or closes down in a few years. Selling OEM PSUs and cheap stuff like mice and keyboards is not going to work well. Their mice and keyboards are not even great, and their PSUs have also dropped in quality since their entry into that market with the first Super Flower designs.

Stopping GPU sales was most likely the first nail in the coffin for EVGA. I don't see them surviving long without it; it was what they were known for.

AIBs earn tons of money right now. I know for sure, since I work in B2B in the hardware sector. Numbers right now are climbing, not dropping, and they will go up way more in '24 and '25.
Bwaze: This doesn't make much sense. Right now everyone is paying for all the very expensive process upgrades at TSMC, upgrades that are allegedly rising astronomically with every new process. So how can anyone just leapfrog into a cutting-edge process without figuring out the intermediate steps?

I believe it would take a severe industry crisis to end the TSMC supremacy. But that's not so unimaginable: reasons could be political (China doesn't even have to blockade or invade Taiwan, just stop exporting crucial materials and components), natural disasters, or simply a market response to costs that are too high.
Intel has been building fabs for years to regain the lead. Pat Gelsinger is turning Intel around as fast as he can.

Intel 4 this year (Meteor Lake), Intel 20A in Q4 (Arrow Lake) and then 18A next year.

TSMC has hit a wall as well and is struggling to go lower than 3 nm right now (which Apple has priority on).

However, Apple also wants chip production outside of Asia, which is why they pushed TSMC to build more fabs outside of Asia.

Apple would jump to Intel for sure if Intel regains the lead. Both are US companies.

It makes perfect sense if you read the news and official statements from the last few years. Intel has always come back eventually.

Also, TSMC has been pushing up prices as well. They will be forced to cut prices when Intel regains the lead, or at least catches up to the same process nodes.
Posted on Reply
#75
Bwaze
las: The 4090 sells like hotcakes really, which is why the price went up, so this does not affect all "gaming" cards, even though the 4090 is kind of a hybrid here.
Not to gamers they don't, unless you live in a very wealthy part of the world. As far as I know, RTX 4090s were starting to disappear from the overstocked warehouses right about when the USA started limiting what China could import as AI accelerators. Coincidence, I'm sure.
Posted on Reply