
Next-Gen GPUs: What Matters Most to You?


  • Raster Performance

    Votes: 6,487 27.0%
  • RT Performance

    Votes: 2,490 10.4%
  • Energy Efficiency

    Votes: 3,971 16.5%
  • Upscaling & Frame Gen

    Votes: 662 2.8%
  • VRAM

    Votes: 1,742 7.3%
  • Pricing

    Votes: 8,667 36.1%

  • Total voters
    24,019
  • Poll closed.
Joined
Dec 12, 2016
Messages
1,948 (0.66/day)
Barely anyone knows what RT is? You should have a higher opinion of your fellow tech enthusiasts here. Most members here know what RT is, and they are expressing what they would like to see in the next gen. If your accusation of vote spamming is that a large number of alt accounts are being used here to pad the votes, then I highly doubt that.

Besides that, if RT is a trivial concern then why are AMD and Intel committed to improving RT performance considerably in their next generation GPUs?
Where does it limit voting in the poll to just 'members'?
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Where does it limit voting in the poll to just 'members'?

It doesn't but I don't buy into Nvidia conspiracies to that extent. Why would Nvidia/Nvidia fans bother with padding the vote here about RT? Nvidia already has almost 90% of the market anyway. Now, as to the second part of my comment. Why would AMD and Intel be committed to considerably improving RT performance in their next generations if barely anyone knows what RT is or has any desire for it? Are they both being incompetent with their R&D funds? If that is what you think then I disagree. They are reacting to market demands to at least try to stay relevant.
 
Joined
Apr 2, 2011
Messages
2,848 (0.57/day)
So...I think this is a false choice. I don't want one thing...or I'd buy only the extreme end of cards. I.e., if pricing were my only concern I'd buy an entry-level GPU; if performance were my one metric I'd buy a halo product. Neither of these represents reality, where I'm more likely to choose (from the Nvidia side) an xx60, xx70, or xx80 depending upon the raw performance and associated price point as a function of one another. I'm unlikely to ever touch an xx50 or xx90 because they are either competing with iGPUs or cost so much for tiny improvements that it's silly to pony up the cost of a GPU and a CPU and only get a GPU.

I chose price because it's the closest way to describe the balance of price to performance...which neither Nvidia nor AMD seem to care about anymore. The former because relabeling anything with "AI" makes it 10x more valuable, while the latter has figured out halo products are only good for advertising that you have the fastest thing...and it's a losing game between a duopoly to constantly strive for being the best with loss-leading (or, most likely, minimal-profit-margin) products. It's better to sell 10 4080s than a single 4090 because many more people can afford the 4080...and the market for the 4090 is looking to squeeze value as they purchase in bulk...eating into that profit margin.


This really seems like a broader discussion about how AMD has announced their next gen has no halo products, Nvidia is only releasing the 5090 this year while plowing more into AI accelerator chips, and we as consumers are in another dark period...hopefully we'll soon see relief on the price-gouging front. Lord knows when the AI craze dries up there are going to be a lot of unhappy consumers of core GPU products...who will hopefully vote with their wallets to penalize the companies who are at best fair-weather friends to them.
 
Joined
Jul 13, 2016
Messages
3,329 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
It doesn't but I don't buy into Nvidia conspiracies to that extent. Why would Nvidia/Nvidia fans bother with padding the vote here about RT? Nvidia already has almost 90% of the market anyway. Now, as to the second part of my comment. Why would AMD and Intel be committed to considerably improving RT performance in their next generations if barely anyone knows what RT is or has any desire for it? Are they both being incompetent with their R&D funds? If that is what you think then I disagree. They are reacting to market demands to at least try to stay relevant.

The person who said it was Nvidia was jesting. Really, anyone can bot-spam nowadays thanks to AI. The idea that it's a random person with random reasons is very feasible in today's world.
 
Joined
Feb 24, 2023
Messages
3,126 (4.69/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Mind you, it would be bad if video cards did start requiring you to run upscaling in order to fit the game into their VRAM budget.
We run outta compute power way earlier in most gaming scenarios anyway. One can search for the ideal balance all they want; the truth remains: bang per buck is to be improved. This means addressing the die's capabilities, VRAM amounts, and memory bandwidth, WITHOUT making it more expensive every generation. Why would NV do that if Intel and AMD trail like 7 generations behind?
you are speaking from a customer perspective without any empathy.
They are getting paid billions for doing the job they never finish. Why should I have empathy for thieves/underperformers?
There are a ton of hobbyists using these.
Well drop bombs onto AMD HQ then. Make them make the Greens sweat.
I don't understand the persistence to blame devs
Because we have both increasing amounts of VRAM and increasing amounts of not exactly well made titles. Quality falls decade by decade. How did their dads manage to produce state-of-the-art games with only 32 megabytes of VRAM to spare on a median consumer GPU? Why do they provide games that fail to look better than some average late-2000s title whilst also consuming a dozen gigabytes at FHD? Why not blame them? NV are evil, true, but they are not the only ones inflicting excruciating pain upon the progress.
 
Joined
Jul 13, 2016
Messages
3,329 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
We run outta compute power way earlier in most gaming scenarios anyway. One can search for the ideal balance all they want; the truth remains: bang per buck is to be improved. This means addressing the die's capabilities, VRAM amounts, and memory bandwidth, WITHOUT making it more expensive every generation. Why would NV do that if Intel and AMD trail like 7 generations behind?

HWUB's video on the topic demonstrates that even with a small bus, more VRAM allows you to increase texture details without running into performance issues. A larger bus would help, but ultimately storing more data closer to the GPU cores is in and of itself extremely beneficial.

They are getting paid billions for doing the job they never finish. Why should I have empathy for thieves/underperformers?

I can assure you the vast majority of video game devs are not getting paid billions, or even what their time is worth.

Not all devs are "thieves / underperformers"; look at those at Larian Studios, Panthea, etc.

You are grouping all devs into a single box and using that as an excuse to say they haven't earned higher-VRAM cards. It doesn't make any sense when you have a very literal example of a company, Nvidia, that is not giving us that VRAM in order to save $30 and fatten their 78% margins.

Well drop bombs onto AMD HQ then. Make them make the Greens sweat.

All depends on whether the software ecosystem starts supporting AMD cards more. Current AI software is mostly made for Nvidia cards, although support has been getting better over the last year.

Because we have both increasing amounts of VRAM and increasing amounts of not exactly well made titles.

If you look at VRAM size on video cards over the years at a given price point adjusted for inflation, you'd see that we are in the largest stagnation of VRAM size in the industry's history.
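That stagnation claim can be sanity-checked with simple arithmetic. A minimal sketch in Python; the tier labels, launch prices, and inflation factors below are illustrative placeholders I made up, not actual market data:

```python
# Illustrative check of inflation-adjusted VRAM-per-dollar at a fixed
# market tier over time. All prices and inflation factors are invented
# placeholders, not real figures.
cards = [
    # (label, VRAM in GB, launch MSRP in USD, inflation factor to today)
    ("midrange 2016", 6, 250, 1.30),
    ("midrange 2019", 6, 350, 1.22),
    ("midrange 2023", 8, 300, 1.05),
]

for label, vram_gb, msrp, infl in cards:
    real_price = msrp * infl                 # launch price in today's dollars
    gb_per_100 = vram_gb / real_price * 100  # GB per $100, inflation-adjusted
    print(f"{label}: {gb_per_100:.2f} GB per $100")
```

If the GB-per-$100 figure comes out flat or falling across generations once you plug in real launch prices, that is exactly the stagnation being described.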

Quality falls decade by decade. How did their dads manage to produce state-of-the-art games with only 32 megabytes of VRAM to spare on a median consumer GPU? Why do they provide games that fail to look better than some average late-2000s title whilst also consuming a dozen gigabytes at FHD? Why not blame them? NV are evil, true, but they are not the only ones inflicting excruciating pain upon the progress.

Games absolutely look better today than late-2000s titles. I still have a ton of classic games installed, including Dragon Age: Origins from 2009, and that game's graphics do not hold a candle to indie titles nowadays, let alone games like Alan Wake. You can tell the models are very low-polygon as well, and the interface and presentation are very dated. There are probably more polygons in a single character model today than an entire scene had back in 2009.

Quality of modern games is bad if you only purchase AAA games from big publishers like Ubisoft or EA, but if that's your measure of the industry as a whole, it's both inaccurate and you'd deserve the quality you get. There are plenty of good AA and indie developers, and more games are releasing today than at any time in the past. People seem to miss so many good games by chasing AAA.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
You are grouping all devs into a single box and using that as an excuse to say they haven't earned higher-VRAM cards. It doesn't make any sense when you have a very literal example of a company, Nvidia, that is not giving us that VRAM in order to save $30 and fatten their 78% margins.

Nvidia doesn't sell VRAM. How do they save $30? I don't think you understand the process of manufacturing a video card. Nvidia has TSMC manufacture the GPU and sells them to card manufacturers who buy the components and VRAM, assemble the card, ship it through distributors and then retailers to the end customer. Nvidia doesn't make anything from the VRAM.
 
Joined
Jan 2, 2024
Messages
623 (1.75/day)
Location
Seattle
System Name DevKit
Processor AMD Ryzen 5 3600 ↗4.0GHz
Motherboard Asus TUF Gaming X570-Plus WiFi
Cooling Koolance CPU-300-H06, Koolance GPU-180-L06, SC800 Pump
Memory 4x16GB Ballistix 3200MT/s ↗3800
Video Card(s) PowerColor RX 580 Red Devil 8GB ↗1380MHz ↘1105mV, PowerColor RX 7900 XT Hellhound 20GB
Storage 240GB Corsair MP510, 120GB KingDian S280
Display(s) Nixeus VUE-24 (1080p144)
Case Koolance PC2-601BLW + Koolance EHX1020CUV Radiator Kit
Audio Device(s) Oculus CV-1
Power Supply Antec Earthwatts EA-750 Semi-Modular
Mouse Easterntimes Tech X-08, Zelotes C-12
Keyboard Logitech 106-key, Romoral 15-Key Macro, Royal Kludge RK84
VR HMD Oculus CV-1
Software Windows 10 Pro Workstation, VMware Workstation 16 Pro, MS SQL Server 2016, Fan Control v120, Blender
Benchmark Scores Cinebench R15: 1590cb Cinebench R20: 3530cb (7.83x451cb) CPU-Z 17.01.64: 481.2/3896.8 VRMark: 8009
Nvidia doesn't make anything from the VRAM.
They make competition and enemies out of their VRAM choices.
All of them.
Best answer.

If it's good in pure raster, I'm guessing it's another FP64 monster, and that's basically what I need out of a card.
When it's good in RT, it should be excellent in AI frame gen and game-specific framerates, along with the frames needed in creator apps.
If it can do both of these effectively, at a good price and with enough VRAM, it earns a sale.
The cheap "efficient" junk that's produced to keep old compact stuff alive gets my attention, and that's all it should get.

The entire reason we're talking about this is the weird schisms in the market between what gamers want and what the datacenter is willing to buy. One's a tight purse while the other is rapid-firing blank cheques. Not the same.
 
Joined
Feb 24, 2023
Messages
3,126 (4.69/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
HWUB's video on the topic demonstrates that even with a small bus, more VRAM allows you to increase texture details without running into performance issues. A larger bus would help, but ultimately storing more data closer to the GPU cores is in and of itself extremely beneficial.
Stop it. I said in my very first post that VRAM per $ must be increased. Still, this problem is seriously overblown: only a handful of games are really affected.
Current AI software is mostly made for Nvidia cards
Maybe, just MAYBE, because AMD never supported AI (natively, at least) up until the latest gen, and things didn't go well for RDNA3 in this department on the hardware side either? There's no CUDA support, no Windows drivers that just work for 99+ percent end users, no nothing. Do I need to remind you that FSR is a joke, RT performance in AMD GPUs almost doesn't exist, and their professional workload capacity is also completely in the dark for a good half of the algos (at least as of '23)? AMD have the money to solve the problem but they definitely lack cojones. Or something else.
If you look at VRAM size on video cards over the years at a given price point adjusted for inflation, you'd see that we are in the largest stagnation of VRAM size in the industry's history.
We're also in the biggest stagnation overall because of that pandemic, because of mining hysteria, because of corporate greed, yadda yadda.

Once more: I am okay with "only" getting 12 GB on a 4070 Ti level GPU IF said GPU is under 5 Franklins (at launch, in '22, not now). At 8 Benjies, it's a major problem. The 4060, an 8 GB GPU, could've been sold for 190 bucks or so, also becoming a completely flawless GPU from the VRAM-per-$ standpoint. Some games don't fare well in terms of details? Well, that GPU cost you 190 dollars, what do you expect. For 300, you're entitled to more than that.
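The VRAM-per-dollar point above is just division; a quick sketch using the hypothetical price points from this post:

```python
# GB of VRAM per dollar at the price points discussed above.
def gb_per_dollar(vram_gb: float, price_usd: float) -> float:
    return vram_gb / price_usd

# 8 GB card at the wished-for $190 vs. a ~$300 street price
print(f"8 GB @ $190: {gb_per_dollar(8, 190):.4f} GB/$")
print(f"8 GB @ $300: {gb_per_dollar(8, 300):.4f} GB/$")
# 12 GB card at $500 ("5 Franklins") vs. $800 ("8 Benjies")
print(f"12 GB @ $500: {gb_per_dollar(12, 500):.4f} GB/$")
print(f"12 GB @ $800: {gb_per_dollar(12, 800):.4f} GB/$")
```

At $190 the 8 GB card delivers roughly 0.042 GB per dollar, more than double the 12 GB card at $800 (0.015 GB/$), which is the point being made: pricing, not raw capacity, is the problem.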
 
Joined
Oct 30, 2022
Messages
239 (0.30/day)
Location
Australia
System Name Blytzen
Processor Ryzen 7 7800X3D
Motherboard ASRock B650E Taichi Lite
Cooling Deepcool LS520 (240mm)
Memory G.Skill Trident Z5 Neo RGB 64 GB (2 x 32 GB) DDR5-6000 CL30
Video Card(s) Powercolor 6800XT Red Dragon (16 gig)
Storage 2TB Crucial P5 Plus SSD, 80TB spinning rust in a NAS
Display(s) MSI MPG321URX QD-OLED (32", 4k, 240hz), Samsung 32" 4k
Case Coolermaster HAF 500
Audio Device(s) Logitech G733 and a Z5500 running in a 2.1 config (I yeeted the mid and 2 satellites)
Power Supply Corsair HX850
Mouse Logitech G502X lightspeed
Keyboard Logitech G915 TKL tactile
Benchmark Scores Squats and calf raises
For me an iterative increase is what I want.

Pretend numbers to follow
Rather than 30% more RT, 30% more raster, or 30% better power usage, I'd like 10% more raster, 10% better RT, and 10% better power consumption.

Alternatively, a hopeful tier shift (as an AMD user), so a 4090 becomes a 5080 and a 7900XTX becomes an 8800XT.

I know AMD could very well completely miss this mark, but if Nvidia are still getting good generational uplifts then not all hope is lost (just most of it, due to Nvidia's ability to charge kinda whatever they want, more so if AMD don't put up competition to a 5080).
 
Joined
Jul 1, 2011
Messages
364 (0.07/day)
System Name Matar Extreme PC.
Processor Intel Core i9-12900KS 5.2GHZ All P-Cores ,4.2GHZ All E-Cores & Ring 4.2GhZ bus speed 100.27
Motherboard NZXT N5 Z690 Wi-Fi 6E
Cooling CoolerMaster ML240L V2 AIO with MX6
Memory 4x16 64GB DDR4 3600MHZ CL15-19-19-36-55 G.SKILL Trident Z NEO
Video Card(s) Nvidia ZOTAC RTX 3080 Ti Trinity + overclocked 100 core 1000 mem. Re-pasted MX6
Storage WD black 1GB Nvme OS + 1TB 970 Nvme Samsung & 4TB WD Blk 256MB cache 7200RPM
Display(s) Lenovo 34" Ultra Wide 3440x1440 144hz 1ms G-Sync
Case NZXT H510 Black with Cooler Master RGB Fans
Audio Device(s) Internal , EIFER speakers & EasySMX Wireless Gaming Headset
Power Supply Aurora R9 850Watts 80+ Gold, I Modded cables for it.
Mouse Onn RGB Gaming Mouse & Logitech G923 & shifter & E-Break Sim setup.
Keyboard GOFREETECH RGB Gaming Keyboard, & Xbox 1 X Controller & T-Flight Hotas Joystick
VR HMD Oculus Rift S
Software Windows 10 Home 22H2
Benchmark Scores https://www.youtube.com/user/matttttar/videos
I will vote for option #7: Not Upgrading or Not Interested
 
Joined
Nov 11, 2016
Messages
3,458 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Can't believe RT performance has the highest vote count.

Let's see if Nvidia can increase their RT perf by 2x, and AMD increase theirs by 3x for next-gen
 
Joined
Jul 26, 2024
Messages
195 (1.31/day)
RT performance uplift, but still, the cost will determine if it's worth replacing a 4070S. For rasterization only, the 4070S is gonna be fine for a good while.
 
Joined
Sep 8, 2020
Messages
219 (0.14/day)
System Name Home
Processor 5950x
Motherboard Asrock Taichi x370
Cooling Thermalright True Spirit 140
Memory Patriot 32gb DDR4 3200mhz
Video Card(s) Sapphire Radeon RX 6700 10gb
Storage Too many to count
Display(s) U2518D+u2417h
Case Chieftec
Audio Device(s) onboard
Power Supply seasonic prime 1000W
Mouse Razer Viper
Keyboard Logitech
Software Windows 10
Prices can only go up, and next-gen GPUs, whether they come from AMD or Nvidia, will be very expensive.
I know people complain about how expensive PC parts are now, but look at the price of energy and the money-printing machine working 24/7 in the US and EU; inflation is so high it's a miracle prices are not double what they were before the "bug".
Midrange next-gen GPUs at around $500 will be a challenge for AMD and Nvidia; they could do it with a 128-bit bus, a severely cut-down die, things like that.
 
Joined
Sep 17, 2014
Messages
22,666 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Raw performance is everything, and featuresets that work everywhere will remain the only real development. Proprietary isn't gonna fly, even with 80% Nvidia marketshare or more. It'll last as long as it'll last, but it'll never carry an industry forward as a whole.
 
Joined
Sep 8, 2020
Messages
219 (0.14/day)
System Name Home
Processor 5950x
Motherboard Asrock Taichi x370
Cooling Thermalright True Spirit 140
Memory Patriot 32gb DDR4 3200mhz
Video Card(s) Sapphire Radeon RX 6700 10gb
Storage Too many to count
Display(s) U2518D+u2417h
Case Chieftec
Audio Device(s) onboard
Power Supply seasonic prime 1000W
Mouse Razer Viper
Keyboard Logitech
Software Windows 10
When you have that kind of market share you are the industry, and you bend everyone to your will until the government gets involved and the party is over.
 
Joined
Feb 24, 2023
Messages
3,126 (4.69/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Nvidia can increase their RT perf by 2x
Possible but extremely unlikely to happen due to business reasons. I'd say 5% more per dollar will already be a win.
AMD increase theirs by 3x
Perhaps 3 gens from now... not sure, maybe even more. We're now apparently getting a rinse'n'repeat of GCN, when AMD launched the HD 7970 four times in a row (the 7970 itself, the 7970 GHz Edition, the R9 280X, the R9 380X). They already abandoned the high end. Doesn't bring any hope they won't abandon RT development, either. Intel are more likely to catch up at this point despite the colossal troubles they're facing at the moment.
look at the price of energy
Didn't actually go up if we're talking money. Only ecology-wise. Still, NV could sell 4090s for as low as three hundred bucks and still be profitable. Just not as profitable as they are now. All these pandemics, shortages, "expensive" silicon nodes etc. are just excuses.

Raw performance is everything
Becomes moot once something just doesn't run because your GPU lacks some cutting-edge tech, while the "snails" made by your competition can actually make use of that software. Gorillas are much stronger than humans but they don't rule. Don't ask me why.
 
Joined
Jul 13, 2016
Messages
3,329 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Nvidia doesn't sell VRAM. How do they save $30? I don't think you understand the process of manufacturing a video card. Nvidia has TSMC manufacture the GPU and sells them to card manufacturers who buy the components and VRAM, assemble the card, ship it through distributors and then retailers to the end customer. Nvidia doesn't make anything from the VRAM.

No duh, AIB partners are the ones that add the physical memory to the cards, but it's Nvidia that decides how much memory can be paired with the GPU in the first place. What I said is 100% correct: Nvidia is the one deciding how much VRAM you get, not board partners.

Nvidia is the one deciding the MSRP of these products, and they do so well knowing the BOM costs that AIB partners will have. How does more VRAM factor into Nvidia's bottom line? Releasing an SKU with more VRAM could very well cut into how much Nvidia might make from the sale of the GPU core itself if they decide that a GPU must hit a certain MSRP. In the case of a $700 GPU, Nvidia might lower the cost of the GPU core to the AIB a bit to fit the cost of the additional VRAM under a set price.
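The mechanism being described is easy to illustrate with a hypothetical split of a $700 MSRP. Every dollar figure below is invented for illustration, not a real BOM:

```python
# Hypothetical: at a fixed MSRP, the cost of extra VRAM has to come out of
# someone's cut. All figures below are invented for illustration.
MSRP = 700
AIB_COSTS = 150   # board, cooler, assembly, logistics (assumed)
AIB_MARGIN = 50   # margin the AIB expects to keep (assumed)

def core_price_to_aib(vram_cost: int) -> int:
    """What the GPU vendor can charge the AIB for the core if the AIB
    is to keep its margin at the fixed MSRP."""
    return MSRP - AIB_COSTS - AIB_MARGIN - vram_cost

base = core_price_to_aib(vram_cost=40)    # e.g. a smaller VRAM config
bigger = core_price_to_aib(vram_cost=70)  # e.g. a larger VRAM config
print(base, bigger, base - bigger)  # extra VRAM costs the GPU vendor
                                    # $30 of core revenue per card
```

Under these assumptions the higher-VRAM SKU hands the GPU vendor $30 less per core sold, which is how "saving $30" and "fattening margins" can be the same decision even though the vendor never buys the memory itself.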

You can't call someone else ignorant while your own argument is premised on ignorance of the fact that Nvidia ultimately has control over the end product, save for maybe the cooler (though even then Nvidia likely exercises control, given all the 4000-series coolers are pretty beefy; most AIBs would cheap out). You are missing the forest for the trees.

When you have that kind of market share you are the industry, and you bend everyone to your will until the government gets involved and the party is over.

Yep, the Bell System had an 85% market share when it was broken up. Nvidia has an 88% market share.

Stop it. I said in my very first post that VRAM per $ must be increased. Still, this problem is seriously overblown: only a handful of games are really affected.

You aren't wrong, but your argument is analogous to the argument that 4 cores were enough for games back before and around the time Zen 1 launched.

It's true, but it ignores the fact that devs have no choice but to optimize for what's available. Of course only a few games are hindered; you first need to make the hardware available and affordable before devs will push to use it.

Maybe, just MAYBE, because AMD never supported AI (natively, at least) up until the latest gen, and things didn't go well for RDNA3 in this department on the hardware side either?

6000 and 7000 series support AI. So that's one less gen than Nvidia.

There's no CUDA support,

Well yeah, CUDA is Nvidia-only. Impossible for AMD cards to support it unless they are using something like a translation layer on top of ROCm.

no Windows drivers that just work for 99+ percent end users

AMD drivers are pretty good as of late. Steve on HWUB even says he prefers AMD drivers.

That said support in AI applications could be better but that's more due to the aforementioned lack of CUDA support and Nvidia's 1 generation headstart.

, no nothing. Do I need to remind you that FSR is a joke, RT performance in AMD GPUs almost doesn't exist, and their professional workload capacity is also completely in the dark for a good half of the algos (at least as of '23)? AMD have the money to solve the problem but they definitely lack cojones. Or something else.

The rest of this is irrelevant to AI. Calling FSR a joke is definitely overblown as well. Some of this is a rehash of a prior point; we already know AMD lacks CUDA. CUDA IMO is anti-competitive given it completely removes competition from purchasing decisions. Most of the professional market hasn't had a choice in which vendor to get; they have to get Nvidia.
 
Joined
May 29, 2017
Messages
383 (0.14/day)
Location
Latvia
Processor AMD Ryzen™ 7 5700X
Motherboard ASRock B450M Pro4-F R2.0
Cooling Arctic Freezer A35
Memory Lexar Thor 32GB 3733Mhz CL16
Video Card(s) PURE AMD Radeon™ RX 7800 XT 16GB
Storage Lexar NM790 2TB + Lexar NS100 2TB
Display(s) HP X34 UltraWide IPS 165Hz
Case Zalman i3 Neo + Arctic P12
Audio Device(s) Airpulse A100 + Edifier T5
Power Supply Sharkoon Rebel P20 750W
Mouse Cooler Master MM730
Keyboard Krux Atax PRO Gateron Yellow
Software Windows 11 Pro
At this stage I would say pricing is the most logical thing. Raw raster performance without knowing the actual price doesn't say a lot. Better RT performance is always good, but not as important as raster performance.
 
Joined
Feb 24, 2023
Messages
3,126 (4.69/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
The rest of this is irrelevant to AI.
It is, you just don't see how. More success in other fields = more user base = more opportunities to work out how to treat AI workloads. It's just... AMD shoot at the wrong targets whenever they shoot, and I gotta say they shoot way too rarely.
Calling FSR a joke is definitely overblown as well
Perhaps "a joke" is a little bit too rude, but when DLSS 2 from forever ago provides better image quality, more image stability, and a smidge more performance than the latest FSR revision, it's hard to find an appropriate term. Not to mention that NV's frame generation was better on day 0 than FSR FG is right now, about a couple of years later.
6000 and 7000 series support AI.
6000 series... I dunno. It's really poor; I'm talking from first-hand experience with a 6700 XT. Even the RX 7600 does a significantly better job in this department. Everything seems to work via some bridge, workaround, crutch of some sort. If AI were my main concern I'd never ever consider an AMD GPU, still.
AMD drivers are pretty good as of late.
Nah, they should be reworked. Way too picky about stuff; I often get crashes because of how AMD software handles vsync. AMD drivers are more or less okay if you don't do anything "extraordinary" on your machine. Once you start experimenting you start cussing left, right, and centre because of crashes you can't even fix.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
No duh, AIB partners are the ones that add the physical memory to the cards, but it's Nvidia that decides how much memory can be paired with the GPU in the first place. What I said is 100% correct: Nvidia is the one deciding how much VRAM you get, not board partners.

Nvidia is the one deciding the MSRP of these products, and they do so knowing full well the BOM costs that AIB partners will have. How does more VRAM factor into Nvidia's bottom line? Releasing an SKU with more VRAM could very well cut into how much Nvidia makes from the sale of the GPU core itself if they decide that a GPU must hit a certain MSRP. In the case of a $700 GPU, Nvidia might lower the cost of the GPU core to the AIB a bit to fit the cost of the additional VRAM under a set price.

So Nvidia should change the specs to higher VRAM and lower the price of the GPU to allow card manufacturers to add more VRAM than is needed, for no reason at all? And by doing that they would sell more GPUs? Do you really believe the things you are saying? For one thing, Nvidia can only set MSRP. They can't strictly enforce it with AIBs, obviously. Even if they could, retailers would still set prices at whatever the market will bear, no matter what anyone else does along the way. The reason that Nvidia, the manufacturers, and the retailers are able to overcharge is that they offer cards that the vast majority want, and buyers pay more for them than they should have to.

I'm hoping that will change somewhat with AMD's next gen if they do increase RT performance as they have said, but if that happens, it will be quite funny to watch all of the backtracking by AMD fans about how worthless RT tech is.
 
Joined
Jul 13, 2016
Messages
3,329 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
So Nvidia should change the specs to higher VRAM and lower the price of the GPU to allow card manufacturers to add more VRAM than is needed, for no reason at all? And by doing that they would sell more GPUs? Do you really believe the things you are saying?

They wouldn't necessarily have to lower the price of the GPU die itself although they easily could given their 78% margin. I gave you what's called an example, not the only way they could do it. Nvidia could and has provided rebates in the past towards MSRP. Again, not the only way of doing things. Just another option.

Saying there's no reason at all for additional VRAM is obviously a false statement, particularly in regards to 8GB cards. Even just in the gaming space there is immediate benefit, let alone other fields and the future.

I find it pretty wild that you think having more VRAM on 8GB cards is a crazy concept.

For one thing Nvidia can only set MSRP

This is patently false. Nvidia absolutely has a say over box design, cooler design, the BIOS (which has to be verified by Nvidia), clock limits, how much memory a card has, what density memory a card can use (by virtue of a GPU needing support for it), etc. Pretty much the only things the AIB decides nowadays are the PCB design (though not in all cases) and the cooling solution (within Nvidia's guidelines). Things are very locked down nowadays.

FYI being able to set MSRP is huge.

Even if they could retailers would still set prices at whatever the market will bear no matter what anyone else does along the way. The reason that Nvidia and the manufacturers and the retailers are able to overcharge is because they offer cards that the vast majority want and they pay more for them than they should have to.

You are correct that cards are priced at what the market will bear, but you failed to mention that the reason Nvidia and AIBs are able to overcharge is that they have markets outside of gaming that prop up their profits. It's the same thing as when crypto was around: prices went up due to crypto demand and came back down after that demand plummeted.

It's not that gamers want to pay these prices and I guarantee you what the gaming market would bear is absolutely lower than what the current market actually demands.

I'm hoping that will change some with AMD's next gen if they do increase RT performance as they have said but if that happens it will be quite funny all of the backtracking about how worthless the RT tech is by AMD fans.

To be honest, I'm not even sure AMD's next-gen hardware is even relevant. Ultimately it's Nvidia's software ecosystem lock-in that's the problem, and with Nvidia controlling the implementation of tech in the vast majority of games and already in control of the AI and professional markets, I don't see that changing unless either AMD and Intel team up, the government takes action, or AMD drastically cuts prices to be a ridiculous bargain (which I doubt, given their GPU division's incompetence).
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
They wouldn't necessarily have to lower the price of the GPU die itself although they easily could given their 78% margin. I gave you what's called an example, not the only way they could do it. Nvidia could and has provided rebates in the past towards MSRP. Again, not the only way of doing things. Just another option.

Saying there's no reason at all for additional VRAM is obviously a false statement, particularly in regards to 8GB cards. Even just in the gaming space there is immediate benefit, let alone other fields and the future.

I find it pretty wild that you think having more VRAM on 8GB cards is a crazy concept.



This is patently false. Nvidia absolutely has a say over box design, cooler design, the BIOS (which has to be verified by Nvidia), clock limits, how much memory a card has, what density memory a card can use (by virtue of a GPU needing support for it), etc. Pretty much the only things the AIB decides nowadays are the PCB design (though not in all cases) and the cooling solution (within Nvidia's guidelines). Things are very locked down nowadays.

FYI being able to set MSRP is huge.



You are correct that cards are priced at what the market will bear, but you failed to mention that the reason Nvidia and AIBs are able to overcharge is that they have markets outside of gaming that prop up their profits. It's the same thing as when crypto was around: prices went up due to crypto demand and came back down after that demand plummeted.

It's not that gamers want to pay these prices and I guarantee you what the gaming market would bear is absolutely lower than what the current market actually demands.



To be honest, I'm not even sure AMD's next-gen hardware is even relevant. Ultimately it's Nvidia's software ecosystem lock-in that's the problem, and with Nvidia controlling the implementation of tech in the vast majority of games and already in control of the AI and professional markets, I don't see that changing unless either AMD and Intel team up, the government takes action, or AMD drastically cuts prices to be a ridiculous bargain (which I doubt, given their GPU division's incompetence).

Take a look at the latest review of a game here on VRAM use, which is just one more example of the 'need moar VRAM' hysteria being nonsense. An entry-level gamer at 1080p with 8 GB isn't going to be using ultra settings to begin with, and a midrange gamer is fine with 12 GB even on higher settings at 1440p.
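For a rough sense of scale (my own back-of-the-envelope numbers, not from the review): the render targets themselves are a tiny fraction of VRAM, so resolution alone moves the needle far less than texture quality settings do. A quick sketch in Python:

```python
# Back-of-the-envelope: size of one uncompressed RGBA8 render target at
# common resolutions. Real games allocate several such buffers (G-buffer,
# depth, post-processing), but even a dozen of them is small next to the
# multi-gigabyte texture/asset pools that high settings stream in.

def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """One width x height surface at 4 bytes per pixel, in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: {framebuffer_mib(w, h):.1f} MiB")
# 1080p comes out to ~8 MiB, 4K to ~32 MiB per surface
```

So stepping from 1080p to 4K costs tens of megabytes per render target; the gigabyte-scale differences reviews measure come from asset quality, which is why settings matter more than resolution for an 8 GB card.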



When people argue opinions against facts, most people go with the facts, and that mostly explains Nvidia's dominance of the consumer market. Yes, part of it is mindshare, but there's nothing that AMD or Intel can do about that except compete by actually competing, not by slapping unnecessary gobs of VRAM on their spec sheets. That is why their commitment to much-improved RT performance in the next gen is smart.
 
Joined
May 7, 2023
Messages
680 (1.14/day)
Processor Ryzen 5700x
Motherboard Gigabyte Auros Elite AX V2
Cooling Thermalright Peerless Assassin SE White
Memory TeamGroup T-Force Delta RGB 32GB 3600Mhz
Video Card(s) PowerColor Red Dragon Rx 6800
Storage Fanxiang S660 1TB, Fanxiang S500 Pro 1TB, BraveEagle 240GB SSD, 2TB Seagate HDD
Case Corsair 4000D White
Power Supply Corsair RM750x SHIFT
I'm hoping that will change somewhat with AMD's next gen if they do increase RT performance as they have said, but if that happens, it will be quite funny to watch all of the backtracking by AMD fans about how worthless RT tech is.
It's not that it's worthless, but it is not a necessity to play games and enjoy high-quality graphics; it's a nice-to-have. Obviously, if AMD improves it 2x or whatever over their current lineup, then it will be nice to implement, though you confuse worthless with not necessary. It's a stick that NV uses to beat AMD with when most AMD owners don't care about RT in games, at this point in time anyway. Heck, my 6800 won't run any game with RT turned on as it CTDs instantly :laugh: (started a thread about this months ago and never found out what the issue was :confused:). If RT were implemented in some games without being able to toggle it off, then it would be more of an issue, though I can't see that happening as they would be limiting not only AMD customers but lower-spec gamers as well.
 