
Next-Gen GPUs: What Matters Most to You?

  • Raster Performance

    Votes: 6,487 27.0%
  • RT Performance

    Votes: 2,490 10.4%
  • Energy Efficiency

    Votes: 3,971 16.5%
  • Upscaling & Frame Gen

    Votes: 662 2.8%
  • VRAM

    Votes: 1,742 7.3%
  • Pricing

    Votes: 8,667 36.1%

  • Total voters
    24,019
  • Poll closed.
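For anyone sanity-checking the tallies, each percentage is just that option's votes divided by the total. A quick sketch using the numbers shown in the poll above:

```python
# Poll tallies as shown above; percentages are each option's share of total votes.
votes = {
    "Raster Performance": 6487,
    "RT Performance": 2490,
    "Energy Efficiency": 3971,
    "Upscaling & Frame Gen": 662,
    "VRAM": 1742,
    "Pricing": 8667,
}

total = sum(votes.values())  # 24,019, matching the "Total voters" figure
for option, n in votes.items():
    print(f"{option}: {n} votes, {n / total * 100:.1f}%")
```

The computed shares match the poll's displayed figures (e.g. Pricing at 36.1%, Raster at 27.0%).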
Joined
Jan 14, 2019
Messages
12,572 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I'm also speaking from slightly dated experience; that was the latest version... as of Dec '23. However, on my 27" 4K display, I failed to see why I should use native instead of DLSS P. Image quality was just a smidge worse, but I had something like 90 percent more FPS.
That's because you're on 27" 4K. The situation is way different when you're on something like a 24" 1080p, which is more like what budget gamers use.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Raster performance leading, yet no one buys AMD...

Better raster performance is the most important thing for anyone gaming, whether on Nvidia, AMD, or Intel GPUs. Even games that use RT only do so with a mixture of RT and rasterization; RT is only implemented in small ways for now and in the near future. Also, gamers buying entry-level GPUs aren't interested in RT to begin with, though better raster performance is definitely a plus. The vast majority of gamers buy entry-level through midrange.
 
Joined
May 29, 2017
Messages
383 (0.14/day)
Location
Latvia
Processor AMD Ryzen™ 7 5700X
Motherboard ASRock B450M Pro4-F R2.0
Cooling Arctic Freezer A35
Memory Lexar Thor 32GB 3733Mhz CL16
Video Card(s) PURE AMD Radeon™ RX 7800 XT 16GB
Storage Lexar NM790 2TB + Lexar NS100 2TB
Display(s) HP X34 UltraWide IPS 165Hz
Case Zalman i3 Neo + Arctic P12
Audio Device(s) Airpulse A100 + Edifier T5
Power Supply Sharkoon Rebel P20 750W
Mouse Cooler Master MM730
Keyboard Krux Atax PRO Gateron Yellow
Software Windows 11 Pro
Raster performance leading, yet no one buys AMD...
People aren't too smart; that's a big problem right here... pricing at only 14% of voters means almost no one cares about next-gen prices. :) So Nvidia can easily set higher prices again and there will be no resistance.
 
People aren't too smart; that's a big problem right here... pricing at only 14% of voters means almost no one cares about next-gen prices. :) So Nvidia can easily set higher prices again and there will be no resistance.
A bad economy makes people poorer. Being poor makes people depressed. To compensate depression, people spend more on luxuries like alcohol, cigarettes, or GPUs. It's psychological.

Besides, we're a tech forum. You can't fault people for having a hobby, however expensive it may be. :)
 
Joined
Jun 22, 2012
Messages
302 (0.07/day)
Processor Intel i7-12700K
Motherboard MSI PRO Z690-A WIFI
Cooling Noctua NH-D15S
Memory Corsair Vengeance 4x16 GB (64GB) DDR4-3600 C18
Video Card(s) MSI GeForce RTX 3090 GAMING X TRIO 24G
Storage Samsung 980 Pro 1TB, SK hynix Platinum P41 2TB
Case Fractal Define C
Power Supply Corsair RM850x
Mouse Logitech G203
Software openSUSE Tumbleweed
In any case, thinking about the poll choices at a general level:

Energy efficiency: Guaranteed to improve for the same performance level, so it's a moot point.
Performance: Guaranteed to be the same or better.
Upscaling/Frame gen: Tied to general performance improvements and software support.
VRAM: Tied to what Nvidia will allow users to have in order not to cannibalize their datacenter lineup (many at a professional level—think universities or small startups—ended up using a 3090/4090, and sometimes a 4080, for their compute+VRAM needs). Depends on commercial factors.
Pricing: Of course, it's going to be the same or more expensive if the GPUs are better in some capacity. Depends on commercial factors.

I wouldn't say the poll is poorly designed, but the only things that truly matter, if you think about it, are VRAM and pricing, because the other points will see the usual incremental technological improvements that the professional market already needs anyway. Efficiency in particular is really important given how much power datacenters need, but it shouldn't be confused with "maximum power required": if the useful work per joule improves, then the GPUs will be more efficient (and we also already know that for gaming cards, Nvidia and AIB partners tend to milk the last 5-10% of performance for around 50% higher power consumption).
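That efficiency point is easy to put in numbers. A minimal sketch with hypothetical figures (not measured data), treating efficiency as average FPS divided by board power, to show how chasing the last few percent of performance wrecks perf/W:

```python
# Hypothetical operating points for the same GPU: stock vs. a factory OC
# that trades roughly 8% more FPS for roughly 50% more board power.
points = {
    "stock":      {"fps": 100, "watts": 220},
    "factory_oc": {"fps": 108, "watts": 330},
}

for name, p in points.items():
    # FPS / W is frames-per-second per joule-per-second, i.e. frames per joule.
    eff = p["fps"] / p["watts"]
    print(f"{name}: {p['fps']} FPS at {p['watts']} W -> {eff:.3f} FPS/W")
```

With these made-up numbers, the OC point delivers about 0.33 FPS/W versus roughly 0.45 FPS/W at stock: more useful work per joule at stock, despite the lower peak framerate.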
 
Joined
Apr 22, 2024
Messages
192 (0.79/day)
System Name Main Workstation
Processor AMD Ryzen Threadripper 2970WX PBO 3,6-4,2Ghz
Motherboard Gigabyte x399 Aorus Pro (ReBAR patched)
Cooling Alphacool Eisbaer Aurora Pro 420
Memory 8x16GB (128) Kingston Fury RGB @3266Mts 16-18-18-36-74 2T
Video Card(s) RTX 3090 ROG Gaming OC @1500Mhz (1965Boost)
Storage Lexar NM790 2TB, Corsair Force MP510 960Gb, Samsung 860 SATA 2TB, 2TB WD Green 7200rpm
Case Phanteks Ethoo Pro 2 TG
Power Supply EVGA Super Nova 1000GT
What matters to me is that ngreedia doesn't scam potential buyers of next-gen cards with cut-down eunuchs instead of normal GPUs. I bought an RTX 4070 Ti for the equivalent of $800 in my country, which is twice the median monthly salary around here, and got a card that will probably die in most new games a couple of years down the line because this schmuck decided to put 12 GB on a 192-bit bus. And then, less than a year later, the same sucker launches a normal card with a sufficient 16 GB and a 256-bit bus and calls it a Super. But I don't have another 800-1000 bucks, so shame on me.
But what the hell am I talking about? Of course they will scam us every gen, because they will never forgive themselves for the Pascal architecture and how they failed to line their pockets, since for most gamers the 10-series was sufficient to play any game up until UE5, maybe.
You always have a choice not to buy anything you don't consider worth buying.
 
Joined
Dec 12, 2016
Messages
1,948 (0.66/day)
This poll is moving all over the place. At the current 16k votes, RT and SS/FG together are at 10%, which is a more reasonable result. The vast majority of buyers with little to no knowledge of these features don't care about them or have any idea what they do. I am still surprised pricing is not getting over 50% of the vote.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
All of them.
This right here, why can't I want improved Raster, RT, VRAM, efficiency and Upscaling all at once? I'll be chasing an improvement in all areas when I upgrade from a 3080.
Feature sets that work everywhere will remain the only real development. Proprietary isn't gonna fly, even with 80% Nvidia market share or more. It'll last as long as it lasts, but it'll never carry an industry forward as a whole.
Hard disagree. I'd say that Nvidia has moved the industry forward as a whole technologically by pioneering RTRT in games and ML upscaling. Even if/when, 5-10-15+ years from now, their proprietary feature set isn't the enduring standard, we wouldn't have any open/enduring standards if it wasn't for them pushing the bar forward, or at best, we'd have them later.

It's also amusing to see upscaling continue to be called a gimmick (hint - it's categorically not), but I can see why people that can't/don't use DLSS might say that, I've certainly enabled FSR and laughed at the results before.
 
This right here, why can't I want improved Raster, RT, VRAM, efficiency and Upscaling all at once? I'll be chasing an improvement in all areas when I upgrade from a 3080.
I agree with you there...

Hard disagree, I'd say that Nvidia has moved the industry forward...
...but hard disagree there. The only thing Nvidia moved forward is their brand recognition and their wallet. Proprietary technologies never help any industry as a whole, only the company that makes them.

It's also amusing to see upscaling continue to be called a gimmick (hint - it's categorically not), but I can see why people that can't/don't use DLSS might say that, I've certainly enabled FSR and laughed at the results before.
It depends on where you're coming from. If you have a 4K TV that you're sitting 3 metres away from, and you've got a relatively low-end HTPC attached to it, then it's great. But if you have a high-end PC and you only game at 1440p or below, then chances are you don't need it.
 

wolf

Better Than Native
hard disagree there.
Agree to disagree, then? The way I see it, Nvidia has absolutely moved the industry forward in at least two areas since they launched RTX (the feature set), but I can accept that not everyone agrees, despite it seeming obvious to me.
It depends on where you're coming from. If you have a 4K TV that you're sitting 3 metres away from, and you've got a relatively low-end HTPC attached to it, then it's great. But if you have a high-end PC and you only game at 1440p or below, then chances are you don't need it.
So if it has great use cases even in your books, wouldn't that make it not a gimmick? I see it as a very useful feature with broad-spanning utility. And just like optimized settings, it's another tool in the box to tweak IQ and FPS to the taste of the user.
 
Agree to disagree then? the way I see it, Nvidia has absolutely moved the industry forward in at least two areas since they launched RTX (feature set), but I can accept not everyone agrees with that, despite it seeming obvious to me.
OK, let's settle with that. You say Nvidia moved the industry forward by developing technologies that AMD and Intel also made equivalents of. I say they only really helped their own pockets by developing a proprietary technology that locked the competition out of the game (DLSS) and put them at a disadvantage. We may both be right in our own ways.

So if it has great use cases even in your books, wouldn't that make it not a gimmick? I see it as a very useful feature with broad spanning utility. And just like optimized settings, it's another tool in the box to tweak IQ and FPS to the taste of the user.
Yeah, it's got some great uses. I finished Hogwarts Legacy on a 6500 XT thanks to FSR. It wasn't pretty, but it was certainly usable. I just wish some developers didn't use upscaling as an excuse to shove mediocre graphics down our throats that only run great on a 4090 without it.
 

wolf

Better Than Native
Nvidia moved the industry forward by developing technologies that AMD and Intel also made equivalents of.
Which happened first, though? We'll never know the answer to this hypothetical, but would AMD or Intel have pursued RTRT or GPU upscaling if Nvidia hadn't trailblazed them both? It certainly feels reactionary from both of them, but hey, they might have gotten there themselves eventually; we'll never know. Chicken and egg, really, but I believe we have Nvidia to thank for RTRT in games (and on consoles), and for FSR's and XeSS's existence. Given where we are today, I'd even say DLSS is to thank for FSR and XeSS not being locked out of games.
I just wish some developers didn't use upscaling as an excuse to shove mediocre graphics down our throats that only run great on a 4090 without it.
I also wish some developers did better and didn't need to count on upscaling to get poorly performing games running at acceptable framerates. It should remain a bonus to improve performance, not a requirement for bare-minimum FPS levels.
 
Which happened first, though? We'll never know the answer to this hypothetical, but would AMD or Intel have pursued RTRT or GPU upscaling if Nvidia hadn't trailblazed them both? It certainly feels reactionary from both of them, but hey, they might have gotten there themselves eventually; we'll never know. Chicken and egg, really, but I believe we have Nvidia to thank for RTRT in games (and on consoles), and for FSR's and XeSS's existence. Given where we are today, I'd even say DLSS is to thank for FSR and XeSS not being locked out of games.
Like I said, it depends on which way you look at it. You can say that Nvidia created something revolutionary with RTRT and DLSS that AMD and Intel followed with their own standards because they also wanted a piece of the cake. The way I look at it, though, is Nvidia created DLSS and made it fully dependent on their in-house designed tensor cores to lock the competition out of the game and gain an unfair advantage. They're a for-profit company, so I don't have any illusions of them having the slightest intention to move anything forward other than their own bank accounts. If both AMD and Intel succeeded in creating versions of their own standards that run on any hardware, then what prevented Nvidia from doing so other than greed?

I also wish some developers did better and didn't need to count on upscaling to get poorly performing games running at acceptable framerates. It should remain a bonus to improve performance, not a requirement for bare-minimum FPS levels.
Exactly my thoughts. Whenever a developer showcases a game saying "look how great it runs with DLSS Q", I just think "show it without DLSS first, and only then will I decide whether upscaling is necessary".
 
Joined
Sep 17, 2014
Messages
22,666 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
If both AMD and Intel succeeded in creating versions of their own standards that run on any hardware, then what prevented Nvidia from doing so other than greed?
This.

Similar things occurred with FreeSync, except there Nvidia clearly lost the battle because it's much easier to 'finish' the development of that feature. But this has been happening throughout history: new technology simply becomes possible, and various companies will chase it. Someone will always be first with it, but the industry only moves forward once it becomes a standard feature.

Right now, upscaling is partly vendor-agnostic, and DLSS's 'unique selling point' space is rapidly dwindling, as per the G-Sync/FreeSync situation. We will arrive at a point where it's totally irrelevant. Already it relies on constant updates and new options to keep the feature 'unique', and Nvidia directly makes you pay for them by not offering a lot of it on past-gen cards. Now, of course, they can, simply because DLSS still commands unique selling points; I think there's no denying that. The question is whether you should or want to pay for them, because you do.

It's not difficult to see the similarities here; they are striking, and time will prove this point. Nvidia isn't a magician; they just pre-empt industry developments and make you pay for it. VRR would have come regardless of Nvidia; RT much the same; look at Nanite, or Crytek's implementation in Neon Noir. There is no question in my mind that all of that would have been created whether or not Nvidia existed. Thinking otherwise is, I think, incredibly naive and narrow-minded. Humans simply arrive at certain development stages in all things. It's like the invention of fire or writing: different cultures in totally disconnected places discovered those things without 'stealing' any ideas from each other. Developments simply become logical at some point in time. Pre-empting them can be a strategy to corner the market, but it will never last. You can draw that parallel in almost everything: the iPod, the iPhone... electric vehicles... the internet... first- vs second- and third-world countries... Facebook.

Even RT wasn't new when Nvidia introduced it to the real time GPU processing arena. They just accelerated it differently. That's all RTX was, and still is.

we'd have them later.
Agreed. A lot of chicken-and-egg things have moved into a vibrant industry thanks to Nvidia, or at least 'faster' than without them. The underlying question, though, is whether any gamer really benefits; I think that depends on your perspective. Part of the move to RTX has been 'less hardware for your money', because you're now partly buying feature sets. Nvidia wants to be a software company, so you're paying for software. It doesn't really give you better games. Au contraire, even: apart from a few poster children, the overall quality of graphics-showcase AAA titles has been abysmal for the last five to seven years.

If you are primarily interested in the prettiest pictures in your gaming, you pixel-peep a lot, and you are truly interested in the tech development, I think Nvidia has something on offer. But if you're primarily interested in gaming for its mechanics, its gameplay, the game itself, with whatever it offers graphically being secondary... Nvidia's move has just made your gaming more expensive. Substantially, I might add.
 
Joined
Apr 16, 2010
Messages
3,609 (0.67/day)
Location
Portugal
System Name LenovoⓇ ThinkPad™ T430
Processor IntelⓇ Core™ i5-3210M processor (2 cores, 2.50GHz, 3MB cache), Intel Turbo Boost™ 2.0 (3.10GHz), HT™
Motherboard Lenovo 2344 (Mobile Intel QM77 Express Chipset)
Cooling Single-pipe heatsink + Delta fan
Memory 2x 8GB KingstonⓇ HyperX™ Impact 2133MHz DDR3L SO-DIMM
Video Card(s) Intel HD Graphics™ 4000 (GPU clk: 1100MHz, vRAM clk: 1066MHz)
Storage SamsungⓇ 860 EVO mSATA (250GB) + 850 EVO (500GB) SATA
Display(s) 14.0" (355mm) HD (1366x768) color, anti-glare, LED backlight, 200 nits, 16:9 aspect ratio, 300:1 co
Case ThinkPad Roll Cage (one-piece magnesium frame)
Audio Device(s) HD Audio, RealtekⓇ ALC3202 codec, DolbyⓇ Advanced Audio™ v2 / stereo speakers, 1W x 2
Power Supply ThinkPad 65W AC Adapter + ThinkPad Battery 70++ (9-cell)
Mouse TrackPointⓇ pointing device + UltraNav™, wide touchpad below keyboard + ThinkLight™
Keyboard 6-row, 84-key, ThinkVantage button, spill-resistant, multimedia Fn keys, LED backlight (PT Layout)
Software MicrosoftⓇ WindowsⓇ 10 x86-64 (22H2)
Surprised 'Pricing' isn't in the lead when the global economy is edging toward a recession; unless the most common priorities have shifted, and now it doesn't matter how much a GPU costs, people (in general) will buy it anyway.
Honestly, for three decades (until 2017, thanks Bitcoin), it was the price/performance ratio that dictated the winner in the various market segments.
 
Surprised 'Pricing' isn't in the lead when the global economy is edging toward a recession; unless the most common priorities have shifted, and now it doesn't matter how much a GPU costs, people (in general) will buy it anyway.
Honestly, for three decades (until 2017, thanks Bitcoin), it was the price/performance ratio that dictated the winner in the various market segments.
I agree, but I think it's because:
A bad economy makes people poorer. Being poor makes people depressed. To compensate depression, people spend more on luxuries like alcohol, cigarettes, or GPUs. It's psychological.

Besides, we're a tech forum. You can't fault people for having a hobby, however expensive it may be. :)
 

wolf

Better Than Native
Like I said, it depends on which way you look at it.
Always.
You can say that Nvidia created something revolutionary with RTRT and DLSS that AMD and Intel followed with their own standards because they also wanted a piece of the cake. The way I look at it, though, is Nvidia created DLSS and made it fully dependent on their in-house designed tensor cores to lock the competition out of the game and gain an unfair advantage.
I don't see it through such a negative lens at all. Nvidia created a product/feature that didn't exist, and in the corporate world we live in, you'd be downright foolish not to at least try to capitalise on that while you can, if the market conditions and your position within it allow it.
They're a for-profit company, so I don't have any illusions of them having the slightest intention to move anything forward other than their own bank accounts. If both AMD and Intel succeeded in creating versions of their own standards that run on any hardware, then what prevented Nvidia from doing so other than greed?
AMD and Intel are for-profit too, last time I checked, and I'd wager heavily on them trying to capitalise on pioneering a new-to-market feature or product of the same nature if they were in an equally dominant market position, be it monetarily or as another ploy to have you point your wallet at them instead of the competition. They're all greedy, and if anyone thinks they're not, I've got a bridge to sell them; I've certainly seen enough from both AMD and Intel to believe this to be true.
I'm glad you agree that "Nvidia has moved the industry forward as a whole technologically by pioneering RTRT in games and ML upscaling. Even if/when, 5-10-15+ years from now, their proprietary feature set isn't the enduring standard, we wouldn't have any open/enduring standards if it wasn't for them pushing the bar forward, or at best, we'd have them later."
 
I don't see it through such a negative lens at all. Nvidia created a product / feature that didn't exist, and in the world we live in of corporations, you'd be downright foolish to not at least try to capitalise on that while you can if the market conditions and your position within it allows it.
Sure, from Nvidia's point of view, this is an entirely positive thing. They created something hugely popular that the competition will never even have a chance to access because it requires their own hardware to work. If I was Mr Jensen, I'd be laughing all day and night. But the thing is that I'm not. I'm just an ordinary home user, and I benefit far more from open standards than from proprietary tech that limits my buying choice to one single brand.

AMD and Intel are for profit too last time I checked, and I'd wager heavily on them trying to capitalise on pioneering a new to market feature / product of the same nature, if they were in an equally market dominant position. Be it monetarily or another ploy to have you point your wallet at them instead of the competition. They're all greedy, and if anyone thinks they're not I've got a bridge to sell them, I've certainly seen enough from both AMD and Intel to believe this to be true.
I didn't say that was not the case (even though I have never seen a single proprietary feature on an AMD GPU besides TrueAudio, which never gained any traction, but that's beside the point). I'm not defending any company here. But like I said, I, a home user, have to decide what's best for me (not for them) and vote with my wallet. Open standards promote freedom of choice, which is exactly what I want. Relying on closed features limits your choice to a single brand, the acceptance of which is the precursor to a monopoly. We're already seeing signs of what such a monopolistic situation brings with it, in terms of GPU prices relative to their value.
 
Holy crap, the number of votes just jumped to 24k and now energy efficiency is over 50%! RT continues to drop. Crazy poll results. If this is really the majority preference, the 600W 5090 is going to go over like a lead balloon.
 
Joined
Apr 2, 2011
Messages
2,849 (0.57/day)
What's to be confused about? When someone posts that barely anyone has heard about RT or wants it, it's pretty clear what they are saying, and they then went on to accuse the poll of being spammed with RT votes. That's why I posted what I did several comments back, before the post you are replying to.

The bottom line is that you don't have to use RT, so there is no stick for Nvidia to beat you over the head with to begin with, and it is OK for a lot of gamers to want more RT progress as well.

The vote was spammed. Raster and price were a good margin ahead of RT, and within the space of roughly 12 hours there were thousands of votes for RT. It is a stick for Nvidia, just the same as all their proprietary technologies that AMD is behind on, used to justify the ridiculous prices they are charging for GPUs these days. Honestly, £/$1,500-2,000 for a top-end card and 1,200 for the next rung down is extortionate. People can cite inflation, COVID, energy prices, and so on, but Nvidia are making more money per SKU than they ever have, by a huge margin; it's just a "tough luck, this is the price, don't like it, don't buy it" mentality. Don't get me wrong, AMD are just as greedy and price theirs accordingly: "we don't have x, y, z features but are close enough in raster, so we will just go $100-200 cheaper." And yes, you may have had previous comments alluding to this; I didn't quote them. I quoted one and replied to it. We don't have to go back through everything you have ever posted for you to make your point, and my previous reply to you is still valid, IMO.

You...do realize that this is a website for nerds? Nerds who basically all want to demonstrate that they have the best of the best, and one means to do so is by having the latest game and the latest bleeding edge tech.

I ask and answer this because it's like going to a bacon lovers' conference and asking whether the link between eating bacon and higher cholesterol is overblown. Your sampling pool is extremely biased... and one of the things that currently differentiates AMD from Nvidia is RT performance. So instead of measuring something both cards can do (raster, or price), you artificially remove discussion of the opposition (and demonstrate you "made the right choice") by choosing a metric in which it isn't competing.


This accounts for "spamming the vote." It accounts for most discrepancies between "have" and "have not" logic. It also removes the consideration that respondents don't adequately represent the statistical spread of purchased components, given that from the start we know the pool of respondents is more likely than not made up of Nvidia buyers, buyers of mid-to-high-end cards, and people willing to jump through hoops to test new tech.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
@AusWolf I love open standards too, even more so, but I'm also OK with the path to them potentially being paved by an innovative, trailblazing closed standard. From my chair, we're all benefitting in the long term from that having happened in this space in the recent past. Now, if an open standard can topple DLSS (or even just cure the artefacts that ruin my immersion the most), I will absolutely cheer for it and welcome it with open arms. I've used FSR on non-RTX hardware to positive effect even in its current state; hell, AMD's FG has completely mooted Nvidia's FG as a selling-point feature for me, yet we wouldn't have it without DLSS FG.

And with that perhaps it seems like we're right back to agree to disagree, yet we both open this can of worms relatively often don't we :peace::roll:
 
Joined
Jan 1, 2019
Messages
512 (0.23/day)
Given that I use laptops, the discrete GPU is more interesting to me. My Dell XPS is an older machine with a GTX 1050 Ti 4GB. The machine has a 4K LCD, so it's more interesting to see what works at 3840x2160.
 
Joined
Sep 17, 2014
Messages
22,666 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I'm glad you agree that "Nvidia has moved the industry forward as a whole technologically by pioneering RTRT in games and ML Upscaling. Even if/when 5-10-15+ years from now their proprietary feature set isn't the enduring standard, we wouldn't have any open/enduring standards if it wasn't for them pushing the bar forward, or at the very best, we'd have them later."
I only agree that we would have them later. Look at Intel XeSS: they would have gone through with that anyway, and the DX12 Ultimate standard was already a unified box of tricks. Similarly, AMD just needs FSR on consoles. That's not Nvidia's achievement in any possible way; in fact, Nvidia told us the lie that it would be impossible to upscale on anything pre-Turing. The only thing Nvidia achieved is pushing its own accelerator hardware for it. They weren't, and aren't, in it to make gaming better; they're in it to sell GPUs, and whatever games come out of it is secondary. And it shows: not a single game is truly influenced by RT in any way, mechanically or in gameplay. Nothing has really been added in innovative power for gaming. It's just a prettier picture at immense performance cost, again something Nvidia is keen to promote. So I guess that's a bar you could say has moved forward. Gaming got more expensive. Yay.

Even Nvidia's upscaler is strategically placed to make sure you don't really gain anything from it; you'll still upgrade, because now you need iteration 3, or 4. The industry isn't gaining anything from it either, on the contrary: developers are now forced to implement up to three upscalers, and there's more segmentation in hardware capabilities on top of it, the latter thanks only to Nvidia. Nvidia also diverts dev time away from the game itself and towards the RT implementation, even if it contributes to that work as well. And all that for historically very minor graphical changes. Whether it's different grass or hair physics (Hair/TurfFX) or RT, it's all more of the same, and none of it has really stuck just yet. And if it DOES stick, they're ready to axe it themselves the moment it stops earning money: see PhysX.

But let's look back a bit further, for a relative comparison, at what AMD has done on the API front. Now that's a real industry push. They enabled and accelerated the adoption of the APIs we have now, and those APIs have directly contributed to more complex games, better threading, and largely fixing the CPU bottleneck in gaming performance. In turn, those APIs even enabled RTX. And us consumers? We just stood there and watched it happen. GPUs didn't get more expensive for it; they just got a lot better at gaming overall, suffered less bottlenecking, and work more smoothly on any desktop system. We gained tiled rendering and various other improvements out of it. Real ones, which enable the far more complex environments we now see in every game and engine.
 
Last edited:

wolf

Better Than Native
I only agree that we would have them later.
No worries. I'm saying that "later" was an optimistic best-case scenario given what we know, so we still got some open features sooner, but I'll leave it there for that topic. As for the rest, I have varying levels of agreement and disagreement, but I think I've hogged enough thread space making my point.

Very much looking forward to what the next 6-9 months bring. There's no way I'm spending $1500+ USD / $2500+ AUD on a video card, so I'm very interested in the RDNA4 vs midrange Blackwell battle. $700 USD tops, so here's hoping for an upgrade in all key areas, which I think will be realistic in that segment.
 
Top