
12GB Confirmed to be GeForce RTX 4070 Standard Memory Size in MSI and GIGABYTE Regulatory Filings

Joined
May 11, 2018
Messages
1,287 (0.53/day)
Jensen Huang: "Moore's Law is dead … It's completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past." What he actually meant is that we shouldn't expect semiconductors to be as cheap as they've been in the past, although part of the issue NVIDIA is having is that their products have to be produced on cutting-edge nodes, which cost significantly more than mature ones.
 
Joined
Apr 12, 2013
Messages
7,563 (1.77/day)
The issue is their margins: JHH has made Nvidia the Apple of the "PC" world, and we all know where that leads us!
 
Joined
Aug 12, 2019
Messages
2,241 (1.15/day)
Location
LV-426
System Name Custom
Processor i9 9900k
Motherboard Gigabyte Z390 arous master
Cooling corsair h150i
Memory 4x8 3200mhz corsair
Video Card(s) Galax RTX 3090 EX Gamer White OC
Storage 500gb Samsung 970 Evo PLus
Display(s) MSi MAG341CQ
Case Lian Li Pc-011 Dynamic
Audio Device(s) Arctis Pro Wireless
Power Supply 850w Seasonic Focus Platinum
Mouse Logitech G403
Keyboard Logitech G110
I think the xx70 and xx60 are gonna have 192-bit,
and the xx50 will have 128-bit.
 
Joined
Dec 25, 2020
Messages
6,978 (4.80/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
12GB and >500 GB/s is not 60-series level, come on.
Shader count isn't either, and $300 for an x70 isn't realistic to begin with.

Let's refresh our memories a bit - a crippled 3.5GB GTX 970 already had an MSRP of $329.
Nine years ago.


But we all know this x70 won't release for $400-450; it'll be $550+ at least.

The problem is that the whole point of a new generation is for faster cards to replace the older ones at the same price point. If you want to go further down memory lane, though, let's go back to 2006: you needed the G80 with a full 384-bit interface to achieve 86.4 GB/s (the bandwidth of the Quadro FX 5600, the pro version of the 8800 GTX). Today, that raw memory bandwidth is outgunned by over 40%(!) by a very simple and inexpensive 64-bit GPU design with only two memory chips, such as the RX 6400 - and if that weren't enough, there's still the extreme bandwidth uplift afforded by that low-end product's large cache. Performance-wise, it's on a level GPU engineers only dreamed of back then.
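To put rough numbers on it (a quick sketch in Python; the RX 6400's 16 Gbps GDDR6 figure and the rumored 192-bit/21 Gbps 4070 config are from memory and leaks, so treat those as assumptions):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in GT/s
def bandwidth(bus_bits: int, gtps: float) -> float:
    return bus_bits / 8 * gtps

g80  = bandwidth(384, 1.8)   # 8800 GTX / Quadro FX 5600: 0.9 GHz GDDR3 -> 86.4 GB/s
rx64 = bandwidth(64, 16.0)   # RX 6400: two GDDR6 chips on a 64-bit bus -> 128.0 GB/s
x70  = bandwidth(192, 21.0)  # rumored 4070: 21 Gbps GDDR6X on 192-bit  -> 504.0 GB/s

print(f"RX 6400 vs G80: {rx64 / g80 - 1:.0%} more raw bandwidth")  # ~48%
```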

I think that calling it a 4060 is actually generous, because it'd be a 4060 at best if the market were healthy. This chart from back when Ada was revealed plots the execution units enabled per released product, across the architectures since Kepler, as a percentage of the full die:

[chart: execution units enabled per product as a % of the full die, Kepler through Ada]


(credit to whoever made and posted this, I saved it from some post here on TPU - mad useful btw)

As you can see, the 4090 is only 88% enabled despite being the top product, and there is an extreme gap between it and the 4080 that would fit practically the entire Ampere and Turing lineups. The 4070 Ti was still referred to as the 4080 12G here, but yes, its full AD104 configuration is only 42.5% of a full AD102. It's probably not accurate for the 4070 and below, as these were still far off when the chart was created, but it could be updated without major difficulty.
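If anyone wants to sanity-check or extend the chart, the math is just units enabled in the product divided by units in the full flagship die. A minimal sketch with a few Ada configs (SM counts as I remember them from TPU's database; the chart's 42.5% presumably uses a slightly different baseline than my straight SM ratio):

```python
# Share of the full AD102 die (144 SMs) enabled, per released Ada product
FULL_AD102_SMS = 144

ada_products = {
    "RTX 4090 (AD102)":        128,  # 128 of 144 SMs enabled -> ~88.9%
    "RTX 4080 (AD103)":         76,  # full AD103 is 80 SMs   -> ~52.8%
    "RTX 4070 Ti (full AD104)": 60,  # ~41.7% by straight SM count
}

for name, sms in ada_products.items():
    print(f"{name}: {sms / FULL_AD102_SMS:.1%} of a full AD102")
```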

All in all, this is a very, very lukewarm midranger that JHH is positioning as a performance-segment product, thanks to a lack of competition from AMD and the worst market conditions you could imagine - it's probably worse now than during the crypto boom, because at least back then they had an excuse to price gouge.
 
Joined
May 11, 2018
Messages
1,287 (0.53/day)
All in all, this is a very, very lukewarm midranger that JHH is positioning as a performance-segment product, thanks to a lack of competition from AMD and the worst market conditions you could imagine - it's probably worse now than during the crypto boom, because at least back then they had an excuse to price gouge.

And now they have a new one. Nvidia has fully embraced the idea that AI will create tremendous demand for graphics cards - on all levels, from large servers with professional accelerators to small home users with "cards previously called gaming".

By the next financial report, I fully expect they will reflect this even in segment naming. "Gaming" suddenly won't be adequate for a segment that sells accelerators to small AI servers and home AI users, so I expect they'll rename it to something that covers both gaming and AI acceleration, and they'll talk about nothing but AI, even if it's only a small portion of their sales. The potential for growth is almost endless.

Of course, AI won't bring much revenue from home users or small nodes - or at least it won't look as lucrative as home mining did - so I doubt we'll see very quick adoption. But whereas with mining Nvidia had to act as if miners were abusing their products and using them for something they weren't intended to do, with AI they can wholly embrace it!

So the next generation, when you try to buy a graphics card, it won't be a gaming card any more. It will be a Home Accelerator for AI and Gaming.
 
Joined
Nov 15, 2020
Messages
929 (0.62/day)
System Name 1. Glasshouse 2. Odin OneEye
Processor 1. Ryzen 9 5900X (manual PBO) 2. Ryzen 9 7900X
Motherboard 1. MSI x570 Tomahawk wifi 2. Gigabyte Aorus Extreme 670E
Cooling 1. Noctua NH D15 Chromax Black 2. Custom Loop 3x360mm (60mm) rads & T30 fans/Aquacomputer NEXT w/b
Memory 1. G Skill Neo 16GBx4 (3600MHz 16/16/16/36) 2. Kingston Fury 16GBx2 DDR5 CL36
Video Card(s) 1. Asus Strix Vega 64 2. Powercolor Liquid Devil 7900XTX
Storage 1. Corsair Force MP600 (1TB) & Sabrent Rocket 4 (2TB) 2. Kingston 3000 (1TB) and Hynix p41 (2TB)
Display(s) 1. Samsung U28E590 10bit 4K@60Hz 2. LG C2 42 inch 10bit 4K@120Hz
Case 1. Corsair Crystal 570X White 2. Cooler Master HAF 700 EVO
Audio Device(s) 1. Creative Speakers 2. Built in LG monitor speakers
Power Supply 1. Corsair RM850x 2. Superflower Titanium 1600W
Mouse 1. Microsoft IntelliMouse Pro (grey) 2. Microsoft IntelliMouse Pro (black)
Keyboard Leopold High End Mechanical
Software Windows 11
Money aside, 12GB is likely not enough for gaming. Modern games demand more, particularly if Nvidia is playing the trump card of ray tracing. Essentially anyone who buys a 4070 12 GB card can forget ray tracing.
 
Joined
Dec 14, 2011
Messages
1,080 (0.23/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Noctua NH-D15 G2
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS GTX 1650 TUF
Storage SAMSUNG 990 PRO 2TB
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Corsair K70 PRO - OPX Linear Switches
Software Microsoft Windows 11 - Enterprise (64-bit)
Money aside, 12GB is likely not enough for gaming. Modern games demand more, particularly if Nvidia is playing the trump card of ray tracing. Essentially anyone who buys a 4070 12 GB card can forget ray tracing.

Yes; 12GB of VRAM should have been the bare minimum today, especially when you're going to use ray tracing in your marketing material for GPUs. Memory compression has gotten a lot better, allowing for narrower bus configurations, but that memory capacity... it's atrocious. I wanted to sell my RTX 3070 Ti and add a few extra bucks for an RTX 4060 Ti with at least 12GB of VRAM, but thanks to Nvidia's idiocy once again, they only added a measly 8GB. Hard pass.

I will probably get an RTX 7070 Ti in the future - that is, if the price is within acceptable parameters at the time, and I, of course, am still on this greed-filled dustball.
 
Joined
Sep 17, 2014
Messages
22,642 (6.04/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I could go as high as $500. Anything above that, idgaf.
Then again, $500 for a custom model means you got the MSRP right.
....aaand here it is at 750 instead :D

But yeah, similar thoughts. Anything above 500 is running into some psychological barrier here. The one called common sense.
 
Joined
Jan 14, 2019
Messages
12,548 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
5888 CUDA cores with 12 GB VRAM? That sounds like everything the 3070 should have been. It would sell like hotcakes for $400... except it won't be $400 because it's Nvidia.
 
Joined
May 31, 2016
Messages
4,438 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
The problem is that the whole point of a new generation is for faster cards to replace the older ones at the same price point. If you want to go further down memory lane, though, let's go back to 2006: you needed the G80 with a full 384-bit interface to achieve 86.4 GB/s (the bandwidth of the Quadro FX 5600, the pro version of the 8800 GTX). Today, that raw memory bandwidth is outgunned by over 40%(!) by a very simple and inexpensive 64-bit GPU design with only two memory chips, such as the RX 6400 - and if that weren't enough, there's still the extreme bandwidth uplift afforded by that low-end product's large cache. Performance-wise, it's on a level GPU engineers only dreamed of back then.

I think that calling it a 4060 is actually generous, because it'd be a 4060 at best if the market were healthy. This chart from back when Ada was revealed plots the execution units enabled per released product, across the architectures since Kepler, as a percentage of the full die:

[chart: execution units enabled per product as a % of the full die, Kepler through Ada]

(credit to whoever made and posted this, I saved it from some post here on TPU - mad useful btw)

As you can see, the 4090 is only 88% enabled despite being the top product, and there is an extreme gap between it and the 4080 that would fit practically the entire Ampere and Turing lineups. The 4070 Ti was still referred to as the 4080 12G here, but yes, its full AD104 configuration is only 42.5% of a full AD102. It's probably not accurate for the 4070 and below, as these were still far off when the chart was created, but it could be updated without major difficulty.

All in all, this is a very, very lukewarm midranger that JHH is positioning as a performance-segment product, thanks to a lack of competition from AMD and the worst market conditions you could imagine - it's probably worse now than during the crypto boom, because at least back then they had an excuse to price gouge.
That graph shows exactly what tier this 4070 is. Heck, it even shows where the 4080 12GB stacks up compared to previous products - between the 3060 and the 3060 Ti. To be honest, it does not look appealing to me, with all the segmentation NV did this gen in terms of the % of the die used. The 4080 16GB looks like a 4070 to me, and the latter is at the performance level of a 3060.
Let's say it is an eye opener.
 
Joined
Dec 14, 2011
Messages
1,080 (0.23/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Noctua NH-D15 G2
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS GTX 1650 TUF
Storage SAMSUNG 990 PRO 2TB
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Corsair K70 PRO - OPX Linear Switches
Software Microsoft Windows 11 - Enterprise (64-bit)
That graph shows exactly what tier this 4070 is. Heck, it even shows where the 4080 12GB stacks up compared to previous products - between the 3060 and the 3060 Ti. To be honest, it does not look appealing to me, with all the segmentation NV did this gen in terms of the % of the die used. The 4080 16GB looks like a 4070 to me, and the latter is at the performance level of a 3060.
Let's say it is an eye opener.

Nvidia does this every now and then; it enables them to raise the prices of lower-tier cards without most customers noticing. Some of us do notice, and we despise them for it.
 
Joined
May 31, 2016
Messages
4,438 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Nvidia does this every now and then; it enables them to raise the prices of lower-tier cards without most customers noticing. Some of us do notice, and we despise them for it.
It is a company, so ripping people off is in their blood. Is it something to despise? Not really. It surely is laughable, though.
 
Joined
Dec 25, 2020
Messages
6,978 (4.80/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
That graph shows exactly what tier this 4070 is. Heck, it even shows where the 4080 12GB stacks up compared to previous products - between the 3060 and the 3060 Ti. To be honest, it does not look appealing to me, with all the segmentation NV did this gen in terms of the % of the die used. The 4080 16GB looks like a 4070 to me, and the latter is at the performance level of a 3060.
Let's say it is an eye opener.

The worst thing is that the graph still omits quite a few very important levers NVIDIA can pull to get even more out of Ada silicon. The 4090, for example, has 12% of its shaders disabled, but also 25% of its L2 cache slices, compared to a full AD102 processor. They'll also have 3rd-generation 24 Gbps GDDR6X modules on tap; current RTX 40 series cards use the same 2nd-generation 21 Gbps modules that were used on the RTX 3090 Ti. If you had to compare it in terms of quality to any previous lower-tier product built on the top silicon of its generation, it'd be a modern version of the infamous GTX 465. This matters because cache and memory bandwidth are proving to be the most crucial factors in keeping performance up in current-generation RT-heavy games.

This would allow an eventual full-AD102 Titan/4090 Ti card to significantly exceed the original 4090's performance - something the 3090 Ti was never able to do vs. the 3090, because its improvement lay in raising the power limit and halving the number of memory chips used (significantly lowering memory power consumption); the 3090 had 82 of 84 SMs enabled and the 3080 Ti 80 of 84, so there was never a significant gap in execution units between the three. The addition of 16 SMs (the 4090 has 128 of 144 enabled), faster memory, and the 24 MB of L2 that is disabled on the 4090 could potentially create a card that is at least 30% faster before the power limit is ever raised.
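Just to show where a figure like "at least 30%" could come from, here's a crude back-of-the-envelope sketch. It naively multiplies the SM and memory-speed ratios, which is an optimistic simplification - real scaling depends on the workload, and the L2 effect is even harder to model:

```python
# Naive scaling estimate: hypothetical full-AD102 card vs. the RTX 4090
sm_ratio  = 144 / 128    # all 144 SMs vs. the 4090's 128  -> 1.125
mem_ratio = 24.0 / 21.0  # 24 Gbps GDDR6X vs. 21 Gbps      -> ~1.143
l2_ratio  = 96 / 72      # full 96 MB L2 vs. 72 MB         -> ~1.333 (workload-dependent)

print(f"SMs alone:                +{sm_ratio - 1:.0%}")
print(f"Memory speed alone:       +{mem_ratio - 1:.0%}")
print(f"SMs x memory (best case): +{sm_ratio * mem_ratio - 1:.0%}")  # ~+29% before power limits
```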

GDDR6X optimization (using newer, faster modules) and low-level architectural tweaks can still be leveraged to bring improvements throughout the entire stack, and this is what infuriates me about the RTX 40 series: they have a brilliant architecture, yet carved out a product stack designed from the bottom up to entirely forgo generational performance-per-dollar improvements and to make the lower-budget options look so bad against the top tier that people feel compelled to buy the 4090 on the value proposition alone (this is just evil!) - and it looks like they've succeeded; the 4090 has outsold every other GPU from both them and AMD combined this generation.
 
Joined
Sep 10, 2018
Messages
6,958 (3.04/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
The only problem is that the whole point is that newer cards come faster to replace the older ones at the same price point. If you want to go further down memory lane, though, let's go back to 2006, you needed the G80 with a full 384-bit interface to achieve 86.4 GB/s (the bandwidth of the Quadro FX 5600/pro version of 8800 GTX). You'd find that raw memory bandwidth can be outgunned by a very simple and inexpensive 64-bit GPU design with only two chips installed today such as the RX 6400 by over 40%(!), and if that wasn't enough, there's still the extreme bandwidth uplift afforded by the large cache even on that low-end product. Performance then... it's on a level that GPU engineers only dreamed of back then.

I think that calling it a 4060 is actually generous, because it'd be a 4060 at best if the market was healthy. This chart from back when Ada was revealed is a relative of execution units enabled in the processor per released product over the architectures since Kepler, in percent relative to the full die possible:

View attachment 289544

(credit to whoever made and posted this, I saved it from some post here on TPU - mad useful btw)

As you can see: 4090 is only 88% enabled despite being the top product, there is an extreme gap between it that would fit practically the entire Ampere and Turing lineups between it and the 4080, 4070 Ti was still referred to as 4080 12G here, but yes, its full AD104 configuration is only 42.5% relative to a full AD102. It's probably not accurate to 4070 and below as these were very far when this chart was created, but it's quite possible to update it without major difficulty.

All in all, this is a very, very lukewarm midranger that JHH is positioning as a performance segment product, thanks to lack of competition from AMD, and the worst market conditions you could imagine - it's probably worse now than it was during the crypto boom because at least back then they had an excuse to price gouge.

It would have been more accurate to do this graph by SM count. CUDA core count is not a very accurate way to compare different architectures, especially since Ampere drastically changed the number per SM.
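To illustrate: the number of FP32 "CUDA cores" per SM has changed between generations, so raw core counts aren't apples-to-apples (the per-SM figures below are from the architecture whitepapers, as I recall them):

```python
# FP32 lanes ("CUDA cores") per SM by architecture. Turing moved INT32 to its own
# path (halving FP32 per SM); Ampere made that second path FP32-capable again.
fp32_per_sm = {"Pascal": 128, "Turing": 64, "Ampere": 128, "Ada": 128}

for arch, lanes in fp32_per_sm.items():
    print(f"{arch}: {lanes} FP32 lanes per SM")

# Example: the 2080 Ti (68 SMs, 4352 "cores") trades blows with the
# 3070 (46 SMs, 5888 "cores") - core counts alone would suggest otherwise.
```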

The 30 series was on the terrible Samsung 10nm+ they called 8nm, a big reason Nvidia was able to offer semi-decent pricing, although you could say the 3090 was way overpriced given how much cheaper that process was vs TSMC 4N. The 4080/4090 are vastly superior products vs high-end Ampere in almost every way; the issue is just pricing, especially at the 80 tier and lower. I like my 3080 Ti, but it's always felt like a meh product even though it offered 40% more performance over my 2080 Ti - although part of that is how terrible it feels at 4K, and especially 4K RT, vs my 4090 after just one generation.

At the end of the day, all that really matters is performance, and the last two flagship-vs-flagship jumps have been some of the largest increases of the past decade, with the 4090 being the largest increase in performance since the 1080 Ti released over half a decade ago - yet everyone is still crying because they are expensive. We've had three generations of terrible pricing in a row at this point, so anyone who thinks we are going back to 10-series pricing shouldn't hold their breath. Regardless of how much more cut down the 4080 is, I would still feel way better spending $1,200 on it vs the $1,400 I spent on the 3080 Ti in my secondary PC, even though that was nearly two years ago.

Anyone who bought a 3090 Ti near launch really got shafted, though - $2,000 for that is just comical, regardless of whether it uses the full die or not. Even at its eventual $1,100 MSRP less than 6 months after release, the 4080 is way better.

I do expect the 5000 series to be priced a little better, especially if AMD figures out how to make dual-GCD GPUs work for gaming. I'm still not holding my breath, though.

Regardless, this is a thread about the 4070, and unfortunately for most people that card is going to be a joke - but at least Nvidia didn't cheap out to 30-series levels and give it 8GB of VRAM again.
 
Joined
Dec 10, 2022
Messages
486 (0.66/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
Nvidia getting away with murder, what the F*** is the competition even DOING?

Milking too
It's not the competition's job to stop nVidia from getting away with murder; it's our job as consumers. The only way to do it is to buy a card that isn't made by nVidia. What do you expect their competition to do - buy an AMX from Brazil and bomb Santa Clara?

The competition is doing the only thing they can do: produce great video cards. The thing is, if everyone and their mother is buying nVidia, what difference does it make?

If you have a Radeon or an Arc GPU under the hood, then you can be proud, because it means you're doing what is needed to hold nVidia accountable. I've been doing the same thing since 2008, because I would feel like a pathetic loser if I complained about nVidia while supporting them at the same time.
Pricing their cards as high as the market will allow, and in the case of the 7900 XT, at least $100 too expensive lol.
At least is right. The RX 7900 XT shouldn't be over $600 USD.
I was pretty underwhelmed with the 7000 series to the point I'm not surprised at 4000 series pricing.
Seeing as the RTX 4080 and RTX 4090 both came out before RDNA3, I don't see how RDNA3 could have had any effect on nVidia's pricing. It was already in the stratosphere without anyone else's help.
I feel like AMD took at least a step back vs the 6000 series, which in general competed better with Nvidia's 90-tier card. Not that the performance is bad - it's actually pretty good - but the 4080 is one of the most underwhelming Nvidia cards from a price perspective, literally a 71% price increase vs its predecessor. It's kinda sad, because even at its ridiculous $1,200 MSRP, Nvidia left AMD with a huge window to obliterate the 4080/4070 Ti, and at best they are matching them.
It's true, but the thing is that Radeons have historically obliterated nVidia from a value standpoint, and people still bought nVidia. All that the lower prices did was reduce AMD's graphics revenue. Can you really blame them for not bothering with consumers who turned their backs on them over and over again? I'm honestly surprised that it took this long. Their attitude is probably "If you sheep want to pay too much for a video card, by all means, we'll get on that bus!" - an attitude that consumers have earned by voting with their wallets over the years and choosing the most overpriced, worst-value cards on the market. If consumers had rewarded AMD for their better offerings, we wouldn't be in this mess to begin with. Everyone with a GeForce card under the hood has helped the current situation form and has nobody but themselves to blame. At least people with Radeons (and Arcs, for that matter) did their part to try to prevent it.
I really hope at the XX70 tier and lower the 7000 series is much more impressive where RT matters a lot less.
At this point, it doesn't matter what AMD does. It only matters whether people become willing to stop paying through the nose for nVidia cards. I used to think like you do - that all AMD has to do is make Radeons far more attractive than GeForce. AMD did that very thing for years, but people are stupid and wanted the brand that had "The fastest card in the world" (like that even matters when your budget is less than $300). Consider that the RTX 3050, the worst-value video card on the market today, is actually selling well!

If people are willing to pay through the nose for a card as weak as an RTX 3050, then all is already lost.
 
Joined
Sep 10, 2018
Messages
6,958 (3.04/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
At least is right. The RX 7900 XT shouldn't be over $600 USD.

Seeing as the RTX 4080 and RTX 4090 both came out before RDNA3, I don't see how RDNA3 could have had any effect on nVidia's pricing. It was already in the stratosphere without anyone else's help.

I don't usually like to get into what something should cost, but I definitely agree the 7900 XT would be way more exciting under $700. The market will usually dictate what something should cost; the 3090 Ti is a good example of that, dropping $900 off its MSRP in 6 months.

Both these GPU makers know long in advance what the competition's performance targets are; it isn't by chance that RDNA3 performs almost the same as the 4070 Ti/4080 - that was likely AMD's target all along. Ada and RDNA3 performance targets were likely mostly finalized 2-3 years ago, although looking at AMD's own presentation, they missed their mark by quite a bit.
 
Joined
Dec 10, 2022
Messages
486 (0.66/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
I don't usually like to get into what something should cost, but I definitely agree the 7900 XT would be way more exciting under $700. The market will usually dictate what something should cost; the 3090 Ti is a good example of that, dropping $900 off its MSRP in 6 months.
Honestly, I don't think it would make any difference at this point. Too many people are programmed to believe that no GPU is too expensive if it's in a green box and no GPU is cheap enough if it's in a red box. As I said earlier, people are actually buying the RTX 3050.
Both these GPU makers know long in advance what the competition's performance targets are; it isn't by chance that RDNA3 performs almost the same as the 4070 Ti/4080 - that was likely AMD's target all along. Ada and RDNA3 performance targets were likely mostly finalized 2-3 years ago, although looking at AMD's own presentation, they missed their mark by quite a bit.
You could be right, but nVidia has been acting as if Radeon doesn't exist. Intel acted the same way when they had total domination of the CPU market: they competed against themselves and refused to acknowledge that AMD even existed.

That all changed when Zen kicked Intel in the cojones. The last time that nVidia was kicked in the cojones was when ATi released the HD 5000-series. It's been a very long time...
 
Joined
Jan 29, 2021
Messages
1,872 (1.32/day)
Location
Alaska USA
Money aside, 12GB is likely not enough for gaming. Modern games demand more, particularly if Nvidia is playing the trump card of ray tracing. Essentially anyone who buys a 4070 12 GB card can forget ray tracing.
Says who?

 
Joined
Nov 15, 2020
Messages
929 (0.62/day)
System Name 1. Glasshouse 2. Odin OneEye
Processor 1. Ryzen 9 5900X (manual PBO) 2. Ryzen 9 7900X
Motherboard 1. MSI x570 Tomahawk wifi 2. Gigabyte Aorus Extreme 670E
Cooling 1. Noctua NH D15 Chromax Black 2. Custom Loop 3x360mm (60mm) rads & T30 fans/Aquacomputer NEXT w/b
Memory 1. G Skill Neo 16GBx4 (3600MHz 16/16/16/36) 2. Kingston Fury 16GBx2 DDR5 CL36
Video Card(s) 1. Asus Strix Vega 64 2. Powercolor Liquid Devil 7900XTX
Storage 1. Corsair Force MP600 (1TB) & Sabrent Rocket 4 (2TB) 2. Kingston 3000 (1TB) and Hynix p41 (2TB)
Display(s) 1. Samsung U28E590 10bit 4K@60Hz 2. LG C2 42 inch 10bit 4K@120Hz
Case 1. Corsair Crystal 570X White 2. Cooler Master HAF 700 EVO
Audio Device(s) 1. Creative Speakers 2. Built in LG monitor speakers
Power Supply 1. Corsair RM850x 2. Superflower Titanium 1600W
Mouse 1. Microsoft IntelliMouse Pro (grey) 2. Microsoft IntelliMouse Pro (black)
Keyboard Leopold High End Mechanical
Software Windows 11
The guy admits it only happens with that game and only when running RT with shadows turned on high.
Ok, put your head in the sand.
 
Joined
May 31, 2016
Messages
4,438 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
The guy admits it only happens with that game and only when running RT with shadows turned on high.
The NV 4070 Ti has remarkable RT performance, but not in this game, today. And tomorrow?
There are probably more games with the same issue, not just this one.
 
Joined
Sep 17, 2014
Messages
22,642 (6.04/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
The guy admits it only happens with that game and only when running RT with shadows turned on high.
You're living in denial. Nvidia's lack of VRAM is a fact, unless you're content always keeping up with their latest and greatest. Since Turing, VRAM relative to core power has been more than halved, and you can be damn sure it has affected how fast cards turn obsolete.

Nvidia also has several cards in recent history that had issues in the VRAM department. The GTX 970 and GTX 660, for example, with their asymmetrical buses, fell off in performance faster than their VRAM capacity would indicate - both had 0.5 GB wired to lower bandwidth. On the other end of the spectrum, Nvidia's best and most loved cards have always been the ones that sport more VRAM than anything else in the stack, the 980 Ti (6GB) and 1080 Ti (11GB) being the best examples.

Today Nvidia is 'fixing' very low VRAM amounts with a bunch of cache, and it's already showing a lot of variance between games in how that performs.
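For anyone who missed the 970 saga, the numbers (per the post-launch disclosures, as I remember them) looked roughly like this:

```python
# GTX 970: 4 GB on the box, but the memory is split into two segments
fast_gb, fast_bw = 3.5, 196.0  # GB, GB/s - 7 of 8 memory controllers at full speed
slow_gb, slow_bw = 0.5, 28.0   # GB, GB/s - last partition behind a shared port

print(f"First {fast_gb} GB: {fast_bw:.0f} GB/s")
print(f"Last  {slow_gb} GB: {slow_bw:.0f} GB/s ({slow_bw / fast_bw:.0%} of full speed)")
# Allocate past 3.5 GB and frametimes spike: anything landing in the slow
# segment runs at a seventh of the card's normal bandwidth.
```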
 
Joined
Mar 9, 2008
Messages
1,177 (0.19/day)
System Name KOV
Processor AMD 5900x
Motherboard Asus Crosshair Hero VIII
Cooling H100i Cappellix White
Memory Corsair 3600kHz 32GB
Video Card(s) RX 7900XT
Storage Samsung 970 evo 500gb, Corsair 500gb ssd, 500GB 840 pro & 1TB samsung 980pro. M2.SSD 960 evo 250GB
Display(s) ASUS TUF Gaming VG32VQ1B 165mhz, Dell S2721DGFA 27 Inch QHD 165mhz
Case Corsair 680x
Audio Device(s) ON Board and Hyper X cloud flight wireless headset and Bose speakers
Power Supply AX1200i Corsair
Mouse Logitech G502x plus lightspeed
Keyboard Logitech G915 TKL lightspeed and G13 gamepad
Software windows 11
Nvidia will set the price at whatever they want, as unfortunately they know people will pay it.
 