Sunday, January 14th 2024
GeForce RTX 40 SUPER Custom Model Pricing Leaks Out
NVIDIA revealed basic price guides for GeForce RTX-40 SUPER graphics cards at the recently concluded CES 2024 trade show, but their board partners largely stayed coy about figures for customized options (review embargoes will be lifted soon). ZOTAC broke the mold later on in the week, with press material updated to reflect that non-overclocked models will adhere to Team Green's basic MSRP. However, premium charges for overclocked SUPER Twin Edge OC, AMP HOLO, Trinity OC and AMP Extreme AIRO cards remain a mystery. VideoCardz decided to conduct some weekend detective work, and fiddled around on Newegg and Best Buy online stores—although the focus shifted to other brands/manufacturers.
Workaround methods were implemented to prematurely extract card prices, ahead of NVIDIA's staggered schedule of reveals for customized versions of the GeForce RTX 4070 SUPER, RTX 4070 Ti SUPER and RTX 4080 SUPER GPUs (throughout January). The leaked results show that GIGABYTE and PNY have custom overclocked GeForce RTX 4070 SUPER models targeting the base MSRP of $599, while MSI has several options exceeding that level, with premiums ranging from $10 to $50. GIGABYTE's GAMING OC card ties MSI's Gaming X Slim at the top of the table at $649.99. Jumping up to the GeForce RTX 4070 Ti SUPER tier, we see a GIGABYTE Gaming OC model sitting at $849.99 and an MSI VENTUS 3X OC going for $899.99. The sole custom GeForce RTX 4080 SUPER within the VideoCardz article appears to be an MSI VENTUS 3X OC, with a $100 premium tacked onto this design. VideoCardz has kindly compiled their findings into list form:
Source:
VideoCardz
NVIDIA GeForce RTX 4070 SUPER 12 GB MSRP: $599
- NVIDIA Founders Edition: $599.99
- GIGABYTE WindForce OC: $599.99
- PNY VERTO OC: $599.99
- ZOTAC Twin Edge: $599.99
- MSI VENTUS 2X OC: $609.99
- MSI VENTUS 2X OC WHITE: $619.99
- MSI VENTUS 3X OC: $629.99
- MSI Gaming X Slim: $649.99
- GIGABYTE GAMING OC: $649.99
NVIDIA GeForce RTX 4070 Ti SUPER 16 GB MSRP: $799
- GIGABYTE Gaming OC: $849.99
- MSI VENTUS 3X OC: $899.99
NVIDIA GeForce RTX 4080 SUPER 16 GB MSRP: $999
- NVIDIA Founders Edition: $999.99
- MSI VENTUS 3X OC: $1099.99
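The RTX 4070 SUPER portion of the list above can be expressed as premiums over the $599 base MSRP; here is a quick sketch. Note the prices are transcribed from the leaked listings and should be treated as unconfirmed until the embargo lifts.

```python
# Premiums over MSRP for the leaked custom RTX 4070 SUPER prices.
# Figures transcribed from the VideoCardz list above; unconfirmed.
MSRP = 599.99

cards = {
    "NVIDIA Founders Edition": 599.99,
    "GIGABYTE WindForce OC": 599.99,
    "PNY VERTO OC": 599.99,
    "ZOTAC Twin Edge": 599.99,
    "MSI VENTUS 2X OC": 609.99,
    "MSI VENTUS 2X OC WHITE": 619.99,
    "MSI VENTUS 3X OC": 629.99,
    "MSI Gaming X Slim": 649.99,
    "GIGABYTE GAMING OC": 649.99,
}

for name, price in cards.items():
    premium = price - MSRP
    pct = 100 * premium / MSRP
    print(f"{name:26s} ${price:7.2f}  +${premium:5.2f} ({pct:4.1f}%)")
```

The steepest premium, $50 on the two $649.99 cards, works out to roughly 8.3% over MSRP.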
86 Comments on GeForce RTX 40 SUPER Custom Model Pricing Leaks Out
I think they will only return to competing at the top end when there is an MCM solution where a single chip (GCD) can be scaled from entry-level to high-end, as happens in their CPU line. That would greatly reduce development costs.
The main reasons prices went up are inflation, production costs, shipping costs, and TSMC demanding a higher and higher cut.
If AMD wants to be the good guy, they can drop the 7900 XTX to $799 and the 7900 XT to $599; they would probably lose money by doing that, though. They don't care much about GPUs.
With the 4080 SUPER coming in at $1,000, the 7900 XTX should go sub-$800 ASAP. Even that might be too much as long as its feature set is far inferior.
Such a myth, huh? Radeon drivers' idle and multi-monitor power consumption has been consistently broken on every new GPU release since Vega, four generations running. Read any of W1zzard's launch-day reviews of these cards.
Take your historical revisionism and shove it.
www.techpowerup.com/review/asrock-radeon-rx-7900-xt-phantom-gaming-white/31.html
The 6900 XT is not really close to the 3090 Ti. The 6950 XT did not beat the 3090 Ti overall; it mostly delivered 3090-level performance, if you look at overall results.
www.techpowerup.com/review/amd-radeon-rx-6950-xt-reference-design/30.html
The 6950 XT performed around 5-6% better than the 6900 XT, stock for stock.
www.techpowerup.com/review/asrock-radeon-rx-7900-xt-phantom-gaming-white/37.html
However, the big problem with the 6800 and 6900 series was power spiking; look at the 20 ms spikes. This is what destroys PSUs over time or makes systems reboot. AMD fixed this with the 7000 series.
However, the 3090 Ti had crazy power peaks and used way too much power overall. It was one of the worst GPU purchases in recent memory, because the 4090 landed about six months later with almost twice the performance at much lower power draw, and with a $1,599 price tag instead of $1,999.
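Taking the commenter's round numbers at face value (roughly 2x the performance, $1,599 vs. $1,999 launch MSRP), the value gap is easy to quantify. These are the post's figures, not measured benchmark data:

```python
# Rough perf-per-dollar comparison using the commenter's claimed numbers:
# 4090 ~= 2x the 3090 Ti's performance, at $1,599 vs $1,999 launch MSRP.
rtx_3090_ti = {"perf": 1.0, "price": 1999}
rtx_4090 = {"perf": 2.0, "price": 1599}

value_3090_ti = rtx_3090_ti["perf"] / rtx_3090_ti["price"]
value_4090 = rtx_4090["perf"] / rtx_4090["price"]

print(f"4090 delivers {value_4090 / value_3090_ti:.1f}x the performance per dollar")
```

Under those assumptions the 4090 delivers about 2.5x the performance per dollar, which is why the 3090 Ti aged so badly.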
I think Las doesn't realize that AMD 'skipped' the high end occasionally (Sea Islands, Polaris, RDNA 1) but competed for the high end for the most part. AMD could 'skip' the high end again next generation, but market forces determine business, and rarely does someone leave a market segment. It does happen, though; Intel is facing that rare tough choice currently. You cannot win this argument, as a simple internet search confirms that no company is immune to driver problems and no company is clearly ahead when it comes to good or bad drivers. Here is one such forum thread among many:
computers/comments/1761ayr
Drivers have bugs. It's always been that way. What some people are confusing with drivers are driver features such as supersampling. Some companies implement better features in a superior way; Nvidia definitely has better features. But that doesn't mean AMD drivers are bad.
It's the same design, but the 6950 XT chip had better binning. A 4-6% difference, which some reviewers consider within a 5% margin of error, and depending on the game selection it could win or lose the battle for the top. But the main point is that the 3090 Ti cost $2,000 and still sold much more than the 6950 XT. Nvidia has "mindshare" equivalent to Apple's, to the point of surviving serious problems such as drivers killing GPUs, the infamous 3.5 GB of VRAM, and similar abominations.
6950XT was nothing but an overclocked 6900XT since chip is 100% identical
Nvidia did not have drivers killing any GPUs; those cards were defective from the beginning, and AMD cards died from Diablo 4 and other games too. More Nvidia cards died because more people use Nvidia.
AMD gimped tons of GPUs with PCIe lanes as well.
AMD released tons of bad GPUs, just like Nvidia, especially in the lower-end segment -> www.techspot.com/review/2398-amd-radeon-6500-xt/
But the founder edition is hardly available outside 'murica anyways, so for most of the world, the cheapest partner cards are the real price.
Ventus is the base model:
- Single BIOS.
- Basic VRM with the minimum acceptable design for stock speeds.
- The cheapest possible cooler they can make, with the minimum acceptable heatpipe count and only the essential PCB components contacted.
- No RGB LED.
- The lowest-clocked model in MSI's range, usually within 1-2% of Nvidia's reference board base clocks.
- Default power limits with no headroom.
- Shorter warranties than their premium models (in territories where it's legal to offer shorter warranties).
...and now, to make things worse than just the stupid price hike, the Ventus comes with the shitty 12VHPWR connector that's riddled with issues, so there's NO REASON to buy it over the FE. I have nothing personal against AMD (quite the opposite). It's just that their GPUs have been trailing Nvidia for too long, and that's the lipstick their marketing puts on the pig.
None of you can maintain any argumentative cohesion, because at that point you begin to despair of proving which brand is worse and drag the discussion into irrelevant topics. I completely agree that AMD shouldn't have released such horrible low-end GPUs.
However, I'm arguing the point that the problems Nvidia has are downplayed and ignored, not that AMD doesn't or didn't have problems. Mindshare is fascinating: it starts with offering products that have some real difference in quality over the competition, and this takes root in people. They recommend the brand to friends and family and speak well of it, which is the most effective marketing possible. After achieving this, at some point you may no longer offer any real difference; in fact your products may now have deficits and cost more, but you have gained a group of loyal followers who simply buy brand X or Y without even checking competing products.
In short, it's not about having the best products, it's about making people believe that you do. Something starts as rational and becomes emotional.
As a casual gamer, I've honestly never had serious problems with AMD drivers, but I'm not pointing fingers and calling anyone who says they have a fanboy; software will always have problems. Even though I've had serious problems with Nvidia in the past, I don't spend my time constantly saying their software is garbage.
I don't trust it. Not because I am an expert in the field, but because people who are (like Der8auer, who works as a consultant for CableMod actually designing cables they don't want to melt) have numerous criticisms of it. If the people whose job it is to make safe cables don't trust the connector, why should I? Unlike Nvidia, who have damage-control and negative-PR reasons to lie and cover up their mistake, CableMod and Der8auer have nothing to lose by being honest.
In contrast, we have Vega which released in 2017 with Wizz's review noting high power draw.
Then RDNA in 2019, with his review noting high multi-monitor/idle power draw and driver instability.
RDNA 2 review in 2020 noted that AMD had fixed everything except for media playback.
Then RDNA 3 in 2022 - back to high multi-monitor and media playback power consumption.
This isn't imaginary. This is a trend of AMD failing to get something incredibly simple and basic, correct. Consumers look at that and go "if this company can't get bloody power consumption right, what else can't they get right? What other crap is lurking that might bite me later down the line?" Then they look at NVIDIA's products, and say "shit, that's expensive, but I'd rather pay through my ass and not have to worry about the product causing me problems", and they buy NVIDIA.
With such a change to GPU hardware design it's not a surprise that some things have turned out worse than people hoped, like top end core frequencies/performance (which AMD touted and then didn't deliver) and low-demand power use, but hopefully for them it will be a case of design and iterate for the next generation.
Nvidia took no such chances with the 4000 series because they didn't need to: moving from Samsung's crap "8" to TSMC's 5 nm process meant they would clearly take the performance lead with a 600 mm² die as well as with smaller dies. We'll see what Blackwell brings, but moving to TSMC 3 nm will likely require no significant hardware innovation, as the increased density from the node shrink will be sufficient to compete with AMD, unless AMD can make GPU chiplets 2.0 work significantly better than 1.0.
Lying or omitting information about the characteristics of any product is not a small problem, no wonder this happened: "NVIDIA settled in a 2015 class-action lawsuit against it, for misrepresenting the amount of memory on GeForce GTX 970 graphics cards. The company has agreed to pay every buyer of the card USD $30 (per card), and also cover the legal fees of the class, amounting to $1.3 million. The company, however, did not specify how much money it has set aside for the payout, and whether it will compensate only those buyers who constitute the class (i.e. buyers in the U.S., since that's as far as the court's jurisdiction can reach), or the thousands of GTX 970 buyers worldwide." NVIDIA Settles Class-Action Lawsuit Over GTX 970 Memory | TechPowerUp
You assume a lot of things without knowing. When I abandoned Nvidia in the Maxwell generation (980), the driver had a problem with a specific game, Mass Effect, which prevented me from playing. I waited a year for a fix that didn't come, and to my surprise, googling out of pure curiosity, I discovered that to this date the problem has not been corrected.
I'm not like you, jumping to make excuses for Nvidia; I always criticize what deserves criticism, regardless of the flag. For me, Vega was a waste of money, just like all HBM GPUs aimed at consumers. They should have skipped it and focused all resources on bringing forward the launch of RDNA and preparing the architecture transition, leaving the software side in a better state at launch. Vega was only good as an iGPU.
If you don't want to have problems, it's better to buy a console, the chances of running into bugs that prevent you from playing are lower.
It all comes down to what is more important to you:
AMD: best bang for buck, better multi monitor support, better Linux support, better GPU software, better day1 drivers, fine wine driver progress
Nvidia: way better power efficiency, better frame generation, way more (non gaming) software features, expensive, better resale value
Stop the AMD vs. Nvidia BS.
Stop calling others derogatory names.
Discuss the thread civilly.
Gamers can be thankful that the lower-tier GPUs are not interesting for AI, or you would see the same thing happen as during the mining craze. 90-95% of PC gamers buy in the sub-$1,000 segment anyway.
EVGA stopped making GPUs because their designs got worse and worse over the years; in their final years in the GPU market they had several issues with VRMs, PCBs, and design in general. They were not really selling a lot of cards and could not afford to stay competitive, so they pulled the plug and focused on other areas. EVGA is not doing well right now; I would not be surprised if the company is sold or closes down in a few years. Selling OEM PSUs and cheap stuff like mice and keyboards is not going to work well. Their mice and keyboards are not even great, and their PSUs have also dropped in quality since their entry into that market with the first Super Flower designs.
Stopping GPU sales was most likely the first nail in the coffin for EVGA. I don't see them surviving long without it; it was what they were known for.
AIBs earn tons of money right now; I know for sure since I work in B2B in the hardware sector. Numbers are climbing, not dropping, and they will go up way more in '24 and '25. Intel has been building fabs for years to regain the lead; Pat Gelsinger is turning Intel around as fast as he can.
Intel 4 this year (Meteor Lake), Intel 20A in Q4 (Arrow Lake) and then 18A next year.
TSMC has hit a wall as well and is struggling to go below 3 nm right now (which Apple has priority on).
However, Apple wants chip production outside of Asia only, which is why they forced TSMC to build more fabs outside of Asia.
Apple would jump to Intel for sure once Intel regains the lead; both are US companies.
It makes perfect sense if you read the news and official statements from the last few years. Intel has always come back eventually.
Also, TSMC has been pushing up prices as well. They will be forced to cut prices when Intel regains the lead, or at least reaches parity on process nodes.