
Hypothetical questions involving AMD Rumors and Nvidia's featureset

Nvidia's featureset is worth how much more than AMD assuming performance is the same?

  • Nvidia's featureset is worth 5% more

  • Nvidia's featureset is worth 10% more

  • Nvidia's featureset is worth 15% more

  • Nvidia's featureset is worth 20% more

  • Nvidia's featureset is worth 25% more

  • Nvidia's featureset is worth greater than 25% more


Joined
Jul 20, 2020
Messages
839 (0.60/day)
System Name Gamey #1 / #2
Processor Ryzen 7 5800X3D / Core i7-9700F
Motherboard Asrock B450M P4 / Asrock B360M P4
Cooling IDCool SE-226-XT / CM Hyper 212
Memory 32GB 3200 CL16 / 32GB 2666 CL14
Video Card(s) PC 6800 XT / Soyo RTX 2060 Super
Storage 4TB Team MP34 / 512G Tosh RD400+2TB WD3Dblu
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / CM N200
Audio Device(s) Dragonfly Black
Power Supply EVGA 650 G3 / Corsair CX550M
Mouse JSCO JNL-101k Noiseless
Keyboard Steelseries Apex 3 TKL
Software Win 10, Throttlestop
I'll add one more specific but potentially widely applicable thing as a value difference between Nvidia and AMD: Minecraft with shaders. Minecraft is a rather widely played game, with a quarter billion copies sold across all platforms, so this is a reasonable consideration.

Native Minecraft, or MC with regular performance mods, plays very well on both brands' GPUs. But modded with shaders, it sucks toads on AMD cards. Man, I hate to say that, but I think all the devs use Nvidia cards, as shaders work very well on Nvidia GPUs, with 100% GPU utilization and good FPS. But utilization on AMD GPUs is often 25-50%, to the point where my 2060 Super dukes it out with, and even beats, the 6800 XT.

Now, the 2060S is a sleeper awesome GPU (especially with DLSS), but the 6800 XT should soundly beat it in everything; not here, though. Note this is NOT Minecraft RTX, just regular modded-in shaders like most 3D games use. I hadn't played MC on an Nvidia GPU in a couple of years before popping the 2060S in, and getting 3x the FPS I expected was...

Pleasant.
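As a toy illustration of how low GPU utilization can flip the result described above (the raw throughput numbers below are invented placeholders, not benchmarks, and `effective_fps` is a made-up helper):

```python
# Toy model: if the shader engine sits partly idle, effective FPS scales
# with utilization. Raw numbers are invented for illustration only.
def effective_fps(raw_fps_at_full_load: float, utilization: float) -> float:
    """Rough frame rate when the GPU is only 'utilization' busy."""
    return raw_fps_at_full_load * utilization

# A much faster card stuck at ~30% utilization...
fps_6800xt = effective_fps(300.0, 0.3)
# ...loses to a slower card that the game keeps fully fed.
fps_2060s = effective_fps(130.0, 1.0)
print(fps_6800xt, fps_2060s)  # the fully utilized 2060S comes out ahead
```

Obviously real frame rates don't scale perfectly linearly with reported utilization, but it shows why a nominally much weaker card can win when the game can't keep the bigger GPU busy.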
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,420 (1.90/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans removed
Cooling Optimus AMD Raw Copper/Plexi, HWLABS Copper 240/40+240/30, D5, 4x Noctua A12x25, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MHz 26-36-36-48, 56.6ns AIDA, 2050 FLCK, 160 ns TRFC
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front panel with pump/res combo
Audio Device(s) Audeze Maxwell Ultraviolet, Razer Nommo Pro
Power Supply SF750 Plat, transparent full custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU Redux Burgundy w/brass weight, Prismcaps White & Jellykey, lubed/modded
Software Windows 10 IoT Enterprise LTSC 19053.3803
Benchmark Scores Legendary
I'll add one more specific but potentially widely applicable thing as a value difference between Nvidia and AMD: Minecraft with shaders. Minecraft is a rather widely played game, with a quarter billion copies sold across all platforms, so this is a reasonable consideration.

Native Minecraft, or MC with regular performance mods, plays very well on both brands' GPUs. But modded with shaders, it sucks toads on AMD cards. Man, I hate to say that, but I think all the devs use Nvidia cards, as shaders work very well on Nvidia GPUs, with 100% GPU utilization and good FPS. But utilization on AMD GPUs is often 25-50%, to the point where my 2060 Super dukes it out with, and even beats, the 6800 XT.

Now, the 2060S is a sleeper awesome GPU (especially with DLSS), but the 6800 XT should soundly beat it in everything; not here, though. Note this is NOT Minecraft RTX, just regular modded-in shaders like most 3D games use. I hadn't played MC on an Nvidia GPU in a couple of years before popping the 2060S in, and getting 3x the FPS I expected was...

Pleasant.
I haven't played MC in a while, is the RT edition still a sidegrade since it has no mod support?

I remember being blown away when Minecraft RT released, feels like version 2.0 of the game.

 
I haven't played MC in a while, is the RT edition still a sidegrade since it has no mod support?

I remember being blown away when Minecraft RT released, feels like version 2.0 of the game.


Minecraft RTX is based on the Bedrock codebase for Minecraft (cross-platform with Win10/11, consoles, and mobile OSes) and as such is non-moddable. Maybe someday it will be, but as I understand it, code efficiency and multi-device interoperability are the point of Bedrock, not code flexibility. So Minecraft RTX is stuck being unmoddable.

Which is fine, as I prefer MC Java for its ridiculous flexibility, which comes at the expense of old, twisted, crappy code; there are mods to mitigate much of that. Lol, but not fix it. There are straight-up RT mods for Java which I haven't tried yet because, frankly, a good set of shaders IMO looks better than the RT implementations I've seen so far. That will change, but shaders have had quite the head start.
 
Joined
Feb 24, 2023
Messages
2,254 (5.11/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / RX 480 8 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / Viewsonic VX3276-MHD-2
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / FSP Epsilon 700 W / Corsair CX650M [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
• DLSS: meaning in games where it's available, I'm getting give or take the same image quality on Balanced as FSR on Quality. Often better, but not always. Contributes roughly 15 percent free performance in a limited number of games. DLAA makes gaming at "native" much more compelling on green GPUs. Ten percent premium over AMD.
• Better RT support: meaning in games like Alan Wake 2 or Cyberpunk 2077, I would get a much more well-rounded experience. At 400+ USD, and at 7900 XTX levels of performance, you expect to enable at least some RT. Deserves about five to ten percent additional premium.
• Better professional software support. Easily justifies another five to ten percent premium.

So, I would've gone NV as a no-brainer if the 4080S were no more than 125% of the price. 126 to 135% is "probably still worth it" territory. At 136 percent or more, NV GPUs are overpriced. FSR 3.1 is 1.5 years too late.
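The arithmetic above can be sketched as a quick toy calculation. The premium figures are the poster's own estimates (midpoints taken for the 5-10% ranges), and `nv_worth_buying` is a made-up helper name:

```python
# The poster's estimated feature premiums (not measured data).
DLSS_PREMIUM = 0.10           # DLSS/DLAA: ~10%
RT_PREMIUM = 0.075            # better RT support: 5-10%, midpoint
PRO_SOFTWARE_PREMIUM = 0.075  # professional software: 5-10%, midpoint

def nv_worth_buying(nv_price: float, amd_price: float) -> str:
    """Classify an Nvidia price against an equally fast AMD card,
    using the poster's 125% / 135% thresholds."""
    ratio = nv_price / amd_price
    if ratio <= 1.25:
        return "no-brainer"
    if ratio <= 1.35:
        return "probably still worth it"
    return "overpriced"

# The individual premiums sum to roughly the 25% break-even point.
total_premium = DLSS_PREMIUM + RT_PREMIUM + PRO_SOFTWARE_PREMIUM  # ~0.25
print(nv_worth_buying(1000, 800))  # ratio 1.25  -> no-brainer
print(nv_worth_buying(1100, 800))  # ratio 1.375 -> overpriced
```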
 
Joined
Sep 27, 2008
Messages
1,038 (0.18/day)
At present? I'd be willing to pay up to 10% more for Nvidia's perks if I was looking for a new card.
 
Joined
Jun 27, 2011
Messages
6,692 (1.42/day)
Processor 7800x3d
Motherboard Gigabyte B650 Auros Elite AX
Cooling Custom Water
Memory GSKILL 2x16gb 6000mhz Cas 30 with custom timings
Video Card(s) MSI RX 6750 XT MECH 2X 12G OC
Storage Adata SX8200 1tb with Windows, Samsung 990 Pro 2tb with games
Display(s) HP Omen 27q QHD 165hz
Case ThermalTake P3
Power Supply SuperFlower Leadex Titanium
Software Windows 11 64 Bit
Benchmark Scores CB23: 1811 / 19424 CB24: 1136 / 7687
I'd like to point out that TPU tests with Ray Tracing - which means high settings, but not Path Tracing or "Full Ray Tracing" as NVIDIA calls it.

The point I'm trying to make here is that as you increase the load on the ray tracing hardware of these cards, the NVIDIA cards become relatively faster than their AMD equivalents. This is important to understand, because some popular games include very lightweight or basic implementations of "ray tracing", such as global illumination only. This skews the data slightly, because the "ray tracing" performance penalty is much smaller than if the entire game's lighting were ray or path traced, instead of a hybrid design of rasterised lighting and ray/path traced lighting.
I am very aware of that. I thought I had made that clear already in the thread, but now you know. We have actually discussed exactly that in other threads. I chose to say "the 4080S has ~20% more RT performance than the 7900 XTX according to TechPowerUp reviews" rather than reference path tracing, because so few games use it. Are there even more than 10 games that use path tracing so far? The handful of people who play those games should understand their performance needs, and that should be reflected in their purchasing decisions. Someone who plays a lot of path-traced games would be willing to pay a significantly higher price for Nvidia hardware because it is simply worth it for them.
Obviously 4K path-traced gaming isn't currently viable at native without some form of upscaling to improve FPS, or any combination of performance/quality-improving tech such as DLSS/DLAA, Frame Generation, and Ray Reconstruction... when these cards are actually stressed with intensive ray/path tracing implementations.

I wanted to write this because I don't think people (especially people who don't own an RTX card, or even those who haven't tried a higher end Ada generation card) really understand the difference in performance between the two vendors, and just how far ahead NVIDIA is.
Path tracing performance, upscaling, and frame generation are all within Nvidia's feature set and are things people should consider in their purchasing decisions.

The PS5 Pro is rumoured to have significantly faster ray tracing performance, so with its release and the eventual Xbox Series refresh, developers will probably start using heavier RT/PT implementations. But I doubt we'll see widespread path tracing until the next generation of consoles is released, e.g. the PS6.

As developers start to actually make full use of the new lighting techniques of the latest game engines moving forward, I expect this trend will really start to show the differences in performance more completely, and game performance testing will show numbers skewing closer and closer to what's been shown here. Path traced lighting, e.g. no rasterized lighting at all, is the obvious end game.

This is what I'm getting at: most games today don't come close to fully using the ray tracing hardware on current-generation cards, so even with "ray tracing" turned on, the FPS is still dictated by classic rasterization performance. This will change as games use heavier and heavier RT, or full RT/PT implementations.
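The quoted scaling argument, that light RT settings keep FPS raster-bound while heavier RT widens the gap, can be sketched with a toy frame-time model. The `fps` helper and every millisecond figure here are invented for illustration, not measurements:

```python
# Toy hybrid frame-time model: total frame time = raster work + RT work.
# As the RT share of the frame grows, a card with faster RT hardware
# pulls further ahead, even when raster speed is identical.
def fps(raster_ms: float, rt_ms: float, rt_speedup: float) -> float:
    """FPS for a frame with the given raster and RT workloads.
    rt_speedup > 1 means this card's RT hardware is that much faster."""
    return 1000.0 / (raster_ms + rt_ms / rt_speedup)

RASTER_MS = 10.0                  # same raster speed on both cards
for rt_ms in (2.0, 10.0, 30.0):   # light RT -> heavy path tracing
    slow = fps(RASTER_MS, rt_ms, rt_speedup=1.0)
    fast = fps(RASTER_MS, rt_ms, rt_speedup=2.0)  # 2x faster RT units
    print(f"RT load {rt_ms:4.1f} ms: advantage {fast / slow:.2f}x")
```

With a 2 ms RT workload, the 2x-faster RT hardware only buys about 9% more FPS; at 30 ms (path-tracing territory) the same hardware advantage grows to roughly 1.6x, which is the skew being described.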
I cannot speak for everyone. There are obviously those who are currently enjoying ray tracing games. I personally do not care about most of Nvidia's feature set.

I do not play path-traced games. I have only ever played one ray tracing game, and my meager 6750 XT gets adequate performance. As pretty as ray tracing and path tracing are, I do not value them right now. Too few games use them. Even fewer are games I am likely to play.
DLSS of any version is great but not necessary to me. Too few games use it. Even fewer are games I am likely to play.
NVENC is amazing. I enjoyed it with my GTX 1060. But I don't often record gameplay, so that feature goes unused with me.
CUDA, and compute features and performance in general, are wasted on me. I haven't done anything with compute since 2014, if I ever did.
The AI HDR feature and Reflex are the two features that interest me most, but I can easily live without them. I would rather spend less money.
I am probably forgetting some features; that is how little they matter to me right now.

Maybe one day, when an xx60-class GPU has above-5090 performance and games I actually play heavily use ray tracing, I will care. That is just me, though. That is why I only value Nvidia's feature set 5 to 10% higher than AMD's. Other people have different priorities and will value it differently.
I remember being blown away when Minecraft RT released, feels like version 2.0 of the game.
I think Minecraft RT is one of the best showcases of ray tracing. Minecraft does not have fancy graphics; everything is blocky and low-res by default. When ray tracing is the only thing adding visual interest, it really shows how big a difference ray tracing can make.
 
Joined
Jun 25, 2020
Messages
97 (0.07/day)
System Name The New, Improved, Vicious, Stable, Silent Gaming Space Heater
Processor Ryzen 7 5800X3D
Motherboard MSI B450 Tomahawk Max
Cooling be quiet! DRP4 (w/ added SilentWings3), 3x SilentWings3 120mm @Front, Noctua S12A @Back
Memory Teamgroup DDR4 3600 16GBx2 @18-22-22-22-42 -> 18-20-20-20-40
Video Card(s) PowerColor RX7900XTX HellHound
Storage ADATA SX8200 1GB, Crucial P3+ 4TB (w/adaptor, @Gen2x1 ), Seagate 3TB+1TB HDD, Kingston SA400 512GB
Display(s) Gigabyte M27U @4K150Hz, AOC 24G2 @1080p100Hz(Max144Hz) vertical, ASUS VP228H@1080p60Hz vertical
Case Phanteks P600S
Audio Device(s) Creative Katana V2X gaming soundbar
Power Supply Seasonic Vertex GX-1200 (ATX3.0 compliant)
Mouse Razer Deathadder V3 wired
Keyboard Non-branded wired full custom mechanical keyboard from my brother's leftover
I utterly don't care about any NVIDIA features, so I voted 5%.
My brother has a stronger preference for high FPS, is more into ray tracing, and is completely fine with DLSS, so he would be fine with 20-25%.

A bit extreme difference, I know.
 
Joined
Sep 17, 2014
Messages
20,993 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
I think my opinion on this aligns with what some others have said about where the GPUs sit in the stack.

In the lower end, the added value of a feature set is near zero; you just want maximum GPU hardware for your money. I don't really count upscaling technologies personally, because they're going to be universal in due time; the DLSS lead will evaporate sooner rather than later.

In the higher end, while I don't care much for RT in gaming yet, of course it will get better over time. It already does, but while Nvidia leads on it, it's still too much of a marketing plaything, a game I'm not playing. To me personally, the feature isn't worth much, if anything, but I do appreciate the idea of the Nvidia product being able to do cutting-edge stuff better. Fun to play around with. Would I pay a premium for that? I think so. 15% would be the top end of that in 2024 / the current gen.

But looking forward, I don't know if it's entirely plausible that Nvidia is going to keep its lead. RDNA4 might not compete in the high end; it might just as well (and instead) place more focus on refinement of its feature set. Upscaling already seems to be moving that way lately.
 
Joined
Apr 14, 2022
Messages
667 (0.88/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Asus XG35VQ
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Software Windows 11 64bit
A bit off topic.


It would be a huge problem for gamers if nVidia tried to become 3dfx, pushing developers to use its SDK for RT acceleration.
 
A bit off topic.


It would be a huge problem for gamers if nVidia tried to become 3dfx, pushing developers to use its SDK for RT acceleration.
Nah, it's not a problem for gamers at all. We survived 3dfx too. Nvidia is taking a bigger risk here.

This is a game Nvidia can't win. If they hypothetically drove all gaming GPUs, including consoles, they would have a de facto monopoly on x86 gaming, and it would trigger a response. Or ARM would by then have made its way into the market, though I don't see that going places soon. And even then: how could a competitor enter the market if Nvidia owned the tech to play the games? There are a LOT of partners in the value chain that wouldn't like this one bit.
 
Joined
Sep 10, 2018
Messages
5,524 (2.67/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R9 5950X Stock
Motherboard X570 Aorus Master/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory 32 GB 4x8GB 4000CL15 Trident Z Royal/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) LG G2 65/LG C1 48/ LG 27GP850/ MSI 27 inch VA panel 1440p165hz
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Corsair K95 RGB Platinum/ Logitech G Pro
For me AMD has to offer at least a combination of things that equal at least 20-30% more for my money.

That can be a combination of more VRAM and general performance, but I won't touch an AMD card for one of my primary systems till they massively improve RT performance in games like CP/Witcher/Alan Wake and massively improve FSR. I currently only consider them in the below-$400 range, where RT and upscaling don't really matter, being trash at 1080p and too taxing on anything lower than a 4070.

The problem this generation is that both the 7600 and 7700 XT are meh AF, as are the competing Nvidia cards. It really doesn't start to get mildly interesting till we hit $500 with the 7800 XT/7900 GRE and 4070/4070 Super, and I'd only use that class of performance at 1080p, so it would really come down to my goals with the system.

As for a theoretical 8800 XT for $400: that seems way too optimistic, as we know AMD will just price it 10-20% below whatever replaces the 4080. There are too many variables to say for certain, but if AMD improves RT/efficiency/upscaling to at least be in the ballpark of what Nvidia offers, then maybe 15% cheaper would have me recommending it over the green option.
 
Joined
Oct 6, 2021
Messages
1,466 (1.55/day)
The only features I look at are actual performance and price.

Where is the "not worth a single cent more" option? Biased, lol.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,784 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
It's objective analysis of data generated by TPU's own testing.

But I'm not surprised the guy with a full AMD rig might think otherwise.
Probably wanted to vote 0%, or a minus %, because of 'vibes'.

The feature set has certainly been worth something to me: the last 3.5 years of vastly superior upscaling, solid RT experiences (partly enabled by DLSS), Broadcast, and now RTX HDR, off the top of my head. It remains to be seen which, if any, of those features will be legitimately matched in the coming years. And those features weren't even that mature when I bought; IMO the card has aged like... a fine wine, partly due to the sheer muscle it packs, but largely due to the feature refinement and polish.

All this enjoyed at both the higher end with a 3080 and at the lower-performance, low-power end with an A2000. Seems to be worth at the very least 15% to me for the richer features, if not 20%. I'd happily insta-buy an AMD card if it were 75%+ faster than my 3080 at raster and 25%+ cheaper than the competing Nvidia product; there's a point where that's a no-brainer. There'd still be a part of me that would miss those features, though.
 
there's a point where that's a no brainer.
You would think, but I have talked to a few people who would be willing to pay whatever for Nvidia's feature set. Ray tracing, upscaling, all of it is stuff they use every day, and AMD is not a viable alternative under any circumstances right now. I asked one guy: what if the 7900 XTX cost $1 for everyone and was plentiful, so anyone could buy as many as they wanted? He says he would still buy a 4090.
 

wolf

You would think, but I have talked to a few people who would be willing to pay whatever for Nvidia's feature set. Ray tracing, upscaling, all of it is stuff they use every day, and AMD is not a viable alternative under any circumstances right now. I asked one guy: what if the 7900 XTX cost $1 for everyone and was plentiful, so anyone could buy as many as they wanted? He says he would still buy a 4090.
Wow, what a friend. It seems like he has other motivations beyond the cards themselves, maybe political ones, as we see so often. Or he just loves the features that much, but I mean, a 7900 XTX for $1 just seems impossible to refuse.

I can see a possibility there if you are so unreasonably, disgustingly rich that the difference between $1 and $1,600 is utterly insignificant to you, but somehow I doubt that's your friend.
 
He has money and really wants the features.
 
Joined
Apr 14, 2022
Messages
667 (0.88/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Asus XG35VQ
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Software Windows 11 64bit
Nah its not a problem for gamers at all. We survived 3Dfx too. Nvidia is taking a bigger risk here.

This is a game Nvidia can't win. If they would hypothetically drive all gaming GPUs including consoles, they would have defacto monopoly on x86 gaming and it would trigger a response. Or, ARM would have by then made its way in the market, but I don't see that going places soon. And even then: how could a competitor enter the market if Nvidia owns the tech to play the games? There are a LOT of partners in the value chain that wouldn't like this one bit.

Without Glide support, we could play Unreal and FIFA 98 etc. back then, but they looked like shXt. When we ran the same games on a Voodoo 3, it was like seeing the light for the first time.

The same could happen now. If you have nVidia RT SDK support, you can play Alan Wake 3, let's say, with all the bells and whistles on. If not, take a 2005 level of graphics.
 
Without Glide support, we could play Unreal and FIFA 98 etc. back then, but they looked like shXt. When we ran the same games on a Voodoo 3, it was like seeing the light for the first time.

The same could happen now. If you have nVidia RT SDK support, you can play Alan Wake 3, let's say, with all the bells and whistles on. If not, take a 2005 level of graphics.
Yeah, but that low-hanging fruit is gone. Even now, RT and raster are often interchangeable... all games look fine.
 