Next-Gen GPUs: What Matters Most to You?

  • Raster Performance

    Votes: 6,487 27.0%
  • RT Performance

    Votes: 2,490 10.4%
  • Energy Efficiency

    Votes: 3,971 16.5%
  • Upscaling & Frame Gen

    Votes: 662 2.8%
  • VRAM

    Votes: 1,742 7.3%
  • Pricing

    Votes: 8,667 36.1%

  • Total voters
    24,019
  • Poll closed.

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
It's not that it's worthless, but it's not a necessity for playing games and enjoying high-quality graphics; it's a nice-to-have. Obviously, if AMD improves it 2x or whatever over their current lineup, then it will be nice to implement. You're confusing worthless with not necessary, though. It's a stick that NV uses to beat AMD with, when most AMD owners don't care about RT in games, at this point in time anyway. Heck, my 6800 won't run any game with RT turned on, as it CTDs instantly :laugh: (started a thread about this months ago and never found out what the issue was :confused:). If RT were implemented in some games without being able to toggle it off, it would be more of an issue, though I can't see that happening, as devs would be limiting not only AMD customers but lower-end-spec gamers as well.

What's to be confused about? When someone posts that barely anyone has heard of RT or wants it, it's pretty clear what they're saying, and they then went on to accuse the poll of being spammed with RT votes. That's why I posted what I did several comments back, before what you are replying to.

The bottom line is that you don't have to use RT, so there is no stick for Nvidia to beat you over the head with to begin with, and it's OK for a lot of gamers to want more RT progress as well.
 
Joined
May 7, 2023
Messages
680 (1.14/day)
Processor Ryzen 5700x
Motherboard Gigabyte Aorus Elite AX V2
Cooling Thermalright Peerless Assassin SE White
Memory TeamGroup T-Force Delta RGB 32GB 3600 MHz
Video Card(s) PowerColor Red Dragon Rx 6800
Storage Fanxiang S660 1TB, Fanxiang S500 Pro 1TB, BraveEagle 240GB SSD, 2TB Seagate HDD
Case Corsair 4000D White
Power Supply Corsair RM750x SHIFT
What's to be confused about? When someone posts that barely anyone has heard of RT or wants it, it's pretty clear what they're saying, and they then went on to accuse the poll of being spammed with RT votes. That's why I posted what I did several comments back, before what you are replying to.

The bottom line is that you don't have to use RT, so there is no stick for Nvidia to beat you over the head with to begin with, and it's OK for a lot of gamers to want more RT progress as well.
The vote was spammed. Raster and price were a good margin ahead of RT, and within the space of roughly 12 hours there were thousands of votes for RT. It is a stick for Nvidia, just like all their proprietary technologies that AMD is behind on, used to justify the ridiculous prices they are charging for GPUs these days. Honestly, £/$1,500-2,000 for a top-end card and 1,200 for the next rung down is extortionate, and people can mention inflation, COVID, energy prices, etc., but they are making more money per SKU now than they ever have, by a huge margin. It's just a case of a "tough shit, this is the price, don't like it don't buy it" mentality. Don't get me wrong, AMD are just as greedy and price theirs accordingly: we don't have x, y, z features but are close enough in raster, so we'll just go $100-200 cheaper. And yes, you may have had previous comments alluding to this; I didn't quote them, I quoted one and replied to it. We don't have to go back through everything you have ever posted for you to make your point, and my previous reply to you is still valid, IMO.
 

64K

The vote was spammed. Raster and price were a good margin ahead of RT, and within the space of roughly 12 hours there were thousands of votes for RT. It is a stick for Nvidia, just like all their proprietary technologies that AMD is behind on, used to justify the ridiculous prices they are charging for GPUs these days. Honestly, £/$1,500-2,000 for a top-end card and 1,200 for the next rung down is extortionate, and people can mention inflation, COVID, energy prices, etc., but they are making more money per SKU now than they ever have, by a huge margin. It's just a case of a "tough shit, this is the price, don't like it don't buy it" mentality. Don't get me wrong, AMD are just as greedy and price theirs accordingly: we don't have x, y, z features but are close enough in raster, so we'll just go $100-200 cheaper. And yes, you may have had previous comments alluding to this; I didn't quote them, I quoted one and replied to it. We don't have to go back through everything you have ever posted for you to make your point, and my previous reply to you is still valid, IMO.

I don't see how any of what you posted is an attack on AMD users by Nvidia. Nvidia has a product for sale that performs better in RT than the other products on the market. That is not a stick that anyone is being beaten with. No one is being forced to buy Nvidia products. For now, there will be a game here and there that uses some RT whether you want it to or not. No one is being forced to buy any of those games.

Pricing by Nvidia has been way higher than it should have been with Ada. I and others have been pointing that out all along. Not just that, but the 4060 and 4070 were named a tier above what they should have been in order to overcharge customers, and that was almost true of the 4080 until Nvidia changed the specs before release. I and others have been pointing that out all along as well. Where have I ever said pricing was reasonable with Ada? Show me.

There aren't thousands of votes for better RT performance. As it stands, even now there are 1,228 votes. Even if there were thousands of votes for RT, it's not a threat to anyone. Nvidia isn't in any position to force RT on you or anyone else. And when AMD improves RT performance, it won't be a stick that they beat AMD users with either, in case you're thinking of attacking AMD for better RT performance in the future as well.
 
Joined
Aug 7, 2023
Messages
28 (0.06/day)
System Name SigmaMATER
Processor Ryzen 7 5800x3d
Motherboard x570 Aorus Master
Cooling Arctic Freezer iii 420
Memory Corsair Dominator Platinum 32GB 3600 MHz C15
Video Card(s) Rx 7900 xtx
Storage Crucial P5 plus 2tb, Crucial MX500 2tb, Seagate 2tb SSHD, Western Digital 10tb Ultrastar 2x
Display(s) Predator x27 and some lenovo and some other one
Case Corsair 7000D
Power Supply Evga 1600 p2
Mouse Logitech G Pro Wireless
Keyboard Evga z20
Software Windows 11 Enterprise
I live with PG&E, where I'm paying 60c per kWh, so I'll take the energy efficiency.
 
Joined
Jan 14, 2019
Messages
12,570 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
6000 and 7000 series support AI. So that's 1 less gen than Nvidia.
RX 6000 doesn't support AI. RX 7000 supports it by running it through its shader cores. It's not comparable to Nvidia or CDNA.
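For anyone curious what their Radeon actually exposes to AI frameworks, here's a minimal sketch, assuming a ROCm build of PyTorch (ROCm builds report AMD GPUs through the torch.cuda namespace; consumer RDNA 3 support depends on the card and OS):

```python
import torch  # assumption: a ROCm build of PyTorch is installed

# ROCm builds of PyTorch reuse the torch.cuda API (HIP underneath), so an
# AMD GPU with working ROCm support shows up here just like an NVIDIA one.
if torch.cuda.is_available():
    print("GPU visible to PyTorch:", torch.cuda.get_device_name(0))
else:
    print("No supported GPU found; AI workloads would fall back to the CPU.")
```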

Other than this, I agree with your post.

Perhaps "a joke" is a little bit too rude but when DLSS2 from forever ago provides better image quality, more image stability and a smidge more performance than the latest FSR revision it's hard to find an appropriate term. Not to mention that NV's frame generation was better on the day 0 than FSR FG is right now, about a couple years later.
Are we really arguing about which image worsening feature provides less bad image quality? :slap:

Nah, it should be reworked. It's way too picky about stuff; I often get crashes because of how AMD software works with Vsync. AMD drivers are more or less okay if you don't do anything "extraordinary" on your machine. Once you start experimenting, you start cussing left, right, and centre because of crashes you can't even fix.
I don't know what's with your card and Vsync, but I don't have that issue. In fact, I don't have any issue with my AMD drivers, and I haven't had any major one in the last 2-3 years.

RX 7000 idle power consumption with high-refresh monitors was fixed quickly, and so was my recent issue of the driver control panel opening every time the monitor wakes up. Minor things quickly sorted.

Just a note: if you have a FreeSync monitor, I wouldn't run Vsync if I were you. It's completely unnecessary, and it doesn't work with FreeSync half the time.
 
Joined
Feb 24, 2023
Messages
3,126 (4.69/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Are we really arguing about which image worsening feature provides less bad image quality? :slap:
Yes, because normal people (not those like us who can notice 3 pixels' worth of distorted data) can use DLSS even in "Performance" mode and be completely unaware that the rendering resolution is a quarter of the monitor's. However, with FSR, you have to live with a whole lot of artifacting. If you own a 4K display, the only good reasons to avoid upscaling are the upscalers being broken/non-existent in a particular game, or benchmarking. At lower resolutions, however, I'd rather go for DLAA instead.
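For scale, here's a quick sketch of the internal render resolutions behind those modes, using the commonly cited per-axis DLSS scale factors (assumed here; games can deviate):

```python
# Commonly cited per-axis DLSS scale factors (assumption: exact values can vary per game).
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def render_resolution(out_w, out_h, mode):
    """Internal render resolution for a given output resolution and DLSS mode."""
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

w, h = render_resolution(3840, 2160, "Performance")
print(w, h, f"-> {w * h / (3840 * 2160):.0%} of the output pixels")  # 1920 1080 -> 25%
```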
I don't know what's with your card and Vsync
Cyberpunk 2077 (GOG version) + Cyber Engine Tweaks + Vsync enabled = I'm doomed to see a crash, 100% of the time. This is because some AMD GPUs are built in such a way that the conflict between ImGui and AMD drivers becomes irresolvable, and it ultimately falls flat on its face. Issues also quite often arise from RAM being less than 100% stable, because AMD drivers store data in RAM rather than on an SSD/HDD (nVidia's way). Vsync disabled = massive screen tearing, OR limiting myself to 55 FPS (yes, 56 to 60 still tear like hell), OR running 130+ FPS (impossible, because the RX 6700 XT + i5-12400 combo is too weak to even think about it). My monitor doesn't support FreeSync; all it has is Adaptive Sync, which is significantly worse.
 
Joined
Jun 22, 2012
Messages
302 (0.07/day)
Processor Intel i7-12700K
Motherboard MSI PRO Z690-A WIFI
Cooling Noctua NH-D15S
Memory Corsair Vengeance 4x16 GB (64GB) DDR4-3600 C18
Video Card(s) MSI GeForce RTX 3090 GAMING X TRIO 24G
Storage Samsung 980 Pro 1TB, SK hynix Platinum P41 2TB
Case Fractal Define C
Power Supply Corsair RM850x
Mouse Logitech G203
Software openSUSE Tumbleweed
Oh, and CUDA, 'cause every gamer does AI and ML with their shiny $1,200 Nvidia GPUs, which no one ever did before 2022

LLMs and image diffusion models are great for entertainment, perhaps their best use. Seen what people do with Flux?
 
Joined
May 29, 2017
Messages
383 (0.14/day)
Location
Latvia
Processor AMD Ryzen™ 7 5700X
Motherboard ASRock B450M Pro4-F R2.0
Cooling Arctic Freezer A35
Memory Lexar Thor 32GB 3733 MHz CL16
Video Card(s) PURE AMD Radeon™ RX 7800 XT 16GB
Storage Lexar NM790 2TB + Lexar NS100 2TB
Display(s) HP X34 UltraWide IPS 165Hz
Case Zalman i3 Neo + Arctic P12
Audio Device(s) Airpulse A100 + Edifier T5
Power Supply Sharkoon Rebel P20 750W
Mouse Cooler Master MM730
Keyboard Krux Atax PRO Gateron Yellow
Software Windows 11 Pro
Not just that, but the 4060 and 4070 were named a tier above what they should have been in order to overcharge customers, and that was almost true of the 4080 until Nvidia changed the specs before release.
The RTX 4080 is still a classic **70-series (256-bit) GPU by old and current standards, just with an overblown TDP and the wrong name!

The RTX 4080 is 37-50% (1440p/4K) faster than the RTX 3080 but costs 72% more. In the end, we got a new GPU whose cost per frame is actually worse than the RTX 3080's. And the RTX 3080's die is way, way bigger and more expensive to make!
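A quick back-of-the-envelope sketch of that cost-per-frame claim, taking the launch MSRPs and assuming the quoted 37-50% uplift:

```python
# Launch MSRPs (USD) and the quoted 37-50% (1440p/4K) uplift of the 4080 over the 3080.
msrp_3080, msrp_4080 = 700, 1200
uplifts = {"1440p": 1.37, "4K": 1.50}

for res, uplift in uplifts.items():
    # Normalise the 3080 to 100 "frames" so cost per frame is directly comparable.
    cpf_3080 = msrp_3080 / 100
    cpf_4080 = msrp_4080 / (100 * uplift)
    print(f"{res}: 4080 costs {cpf_4080 / cpf_3080 - 1:+.0%} per frame vs the 3080")
# 1440p: +25%, 4K: +14% -- pricier per frame either way.
```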

Another interesting fact.

When the GTX 1060 was released, it was slightly faster than the GTX 980 and had more VRAM. Now the 3080 is dramatically faster (~82%) than the RTX 4060 and has more VRAM!
nVIDIA is falling like a rock; the performance degradation is just insane.
 
Last edited:
Joined
Feb 24, 2023
RTX 4060 is dramatically slower (~82%) than RTX 3080
If something is slower by 82 percent, it means it's 18% as fast as the thing we're comparing it to. Whilst I agree with you, it's nowhere near 82; it's about 35 to 45 percent slower, depending on resolution. It's the 3080 which is 60 to 82 percent faster.
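To make the conversion explicit, here's a quick sketch with made-up FPS numbers chosen to match the ~1.82x gap:

```python
# Hypothetical FPS numbers picked to match the ~1.82x ratio discussed above.
fps_3080, fps_4060 = 100.0, 55.0

faster = fps_3080 / fps_4060 - 1   # how much faster the 3080 is than the 4060
slower = 1 - fps_4060 / fps_3080   # how much slower the 4060 is than the 3080

print(f"3080: {faster:.0%} faster")   # 82% faster
print(f"4060: {slower:.0%} slower")   # 45% slower, not 82%
```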
 
Joined
May 29, 2017
If something is slower by 82 percent, it means it's 18% as fast as the thing we're comparing it to. Whilst I agree with you, it's nowhere near 82; it's about 35 to 45 percent slower, depending on resolution. It's the 3080 which is 60 to 82 percent faster.
Right my mistake! :)
 
Joined
Feb 24, 2023
When the GTX 1060 was released, it was slightly faster than the GTX 980 and had more VRAM. Now the RTX 4060 is dramatically slower (~82%) than the RTX 3080 and has less VRAM.
Also some, let's say, "mitigation."

GTX 980 was rated $550 MSRP. 1060, $300 MSRP (let's pretend Founders Edition don't exist). Which means GTX 1060 is about 85 percent more cost efficient.
RTX 3080 was rated $700 MSRP. 4060, $300 MSRP. Which means we got 55% speed for 43% money. Or, about 25 to 30 percent more cost efficient. Slowing down, sure, yet new still beats old. Too bad the margins are so tiny.
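The same arithmetic as a tiny sketch; the performance ratios are assumptions (the 1060 roughly matching the 980, the 4060 at ~55% of a 3080 as discussed above):

```python
def perf_per_dollar_gain(perf_ratio, new_price, old_price):
    """Relative gain in performance per dollar of the newer card over the older one."""
    return perf_ratio / (new_price / old_price) - 1

# MSRPs from the post; perf ratios are assumptions, not measurements.
print(f"GTX 1060 vs GTX 980:  {perf_per_dollar_gain(1.00, 300, 550):+.0%}")  # ~ +83%
print(f"RTX 4060 vs RTX 3080: {perf_per_dollar_gain(0.55, 300, 700):+.0%}")  # ~ +28%
```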
 
Joined
May 29, 2017
Also some, let's say, "mitigation."

GTX 980 was rated $550 MSRP. 1060, $300 MSRP (let's pretend Founders Edition don't exist). Which means GTX 1060 is about 85 percent more cost efficient.
RTX 3080 was rated $700 MSRP. 4060, $300 MSRP. Which means we got 55% speed for 43% money. Or, about 25 to 30 percent more cost efficient. Slowing down, sure, yet new still beats old. Too bad the margins are so tiny.
The GTX 1060 6 GB was €249 and the GTX 1060 3 GB was €199. The FE was €299, yes, but that wasn't sold where I live.
 

64K

The RTX 4080 is still a classic **70-series (256-bit) GPU by old and current standards, just with an overblown TDP and the wrong name!

The RTX 4080 is 37-50% (1440p/4K) faster than the RTX 3080 but costs 72% more. In the end, we got a new GPU whose cost per frame is actually worse than the RTX 3080's. And the RTX 3080's die is way, way bigger and more expensive to make!

Another interesting fact.

When the GTX 1060 was released, it was slightly faster than the GTX 980 and had more VRAM. Now the RTX 4060 is dramatically slower (~82%) than the RTX 3080 and has less VRAM.
nVIDIA is flying down like a rock; the performance degradation is just insane.

You are preaching to the choir here about Nvidia MSRP overpricing the entire Ada stack, with the exception of the 4090, which is a flagship GPU, and those are always priced really high anyway. I have been saying this from the beginning, especially about the original 4080 at $1,200 when the 3080 was $700. That was stupid pricing by Nvidia. Even at $1,000, the 4080 Super is still too high. The naming of the 4060 and 4070 was deceptive on Nvidia's part. I have been saying that from the beginning as well.

The problem comes in, in this case, when AMD fans start spouting nonsense as if it were fact. I know that most of the members here are tech-literate and know that it's BS, but there are many who land on a thread like this from a Google search and don't know it's BS. Every once in a while it's a good thing to call out fan BS for their sake, and that goes for Nvidia fans, AMD fans, Intel fans, MS apologists, etc. I call every one of them out at times for their shit, just as I call out every one of those businesses for their shitty practices. The problem is that whatever group of fans is present when I do call them out assumes that I'm a fan of the other side and gets defensive. Every company does good things and shitty things. None of them are saints. Not a single one is in business for any other reason than to make a profit (besides non-profit charities). They wouldn't exist otherwise.
 
Last edited:
Joined
Jul 13, 2016
Messages
3,329 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Take a look at the latest review of a game here on VRAM use, which is just one more example of the 'need moar VRAM' hysteria being nonsense. An entry-level gamer at 1080p with 8 GB isn't going to be using ultra settings to begin with, and a midrange gamer is fine with 12 GB even on higher settings at 1440p.



When people argue opinions against facts, most people go with the facts, and that would mostly explain Nvidia's dominance of the consumer market.


This is cherry-picking, not facts. You know that recent releases consume more VRAM on average than the recently released God of War, but you are choosing to present it like this anyway. Shame.

I'd also like to reiterate my original point: VRAM won't be an industry-wide issue, as games are made to the hardware available. Some games exceed current VRAM limits on 8 GB cards, but it's nonsense to expect that of every game, because devs have no choice but to make do with what they have.

Yes, a part of it is mindshare, but there's nothing that AMD or Intel can do about that except compete by actually competing, and not by slapping unnecessary gobs of VRAM on their specs. That is why their commitment to much-improved RT performance in the next gen is smart.

:kookoo: Yes, because it makes sense to use expensive die area for RT but not to add comparatively cheap VRAM? You do realize they could do both. Heck, they could even do multiple SKUs: one for people who care about the extra VRAM and one for those who don't.

Nvidia has done that in the past, to no one's surprise. I'm not sure what's with this "it's my way or the highway" logic. I have 32 GB of main system memory and 24 GB of VRAM, which do nothing beneficial for me in 99% of applications over, say, 16 GB main and 12 GB VRAM, but I have them because I really value the performance boost they provide in applications that utilize more than average. It's also nice to ensure my PC can handle whatever comes its way.

The same principle applies in this discussion: people aren't wrong for wanting more VRAM. There's no objective "you're wrong for wanting more VRAM" argument to be made; there's an element of subjectivity, where people are very well justified by recent games' VRAM usage or by simply wanting the capacity to be future-proof. Stop acting like your opinion is the only possibly correct one.
 
Last edited:
Joined
Jan 14, 2019
Yes, because normal people (not those like us who can notice 3 pixels' worth of distorted data) can use DLSS even in "Performance" mode and be completely unaware that the rendering resolution is a quarter of the monitor's. However, with FSR, you have to live with a whole lot of artifacting. If you own a 4K display, the only good reasons to avoid upscaling are the upscalers being broken/non-existent in a particular game, or benchmarking. At lower resolutions, however, I'd rather go for DLAA instead.
"Normal people" who don't care about the image quality degradation with DLSS Performance probably couldn't spot the difference between DLSS and FSR, either. Heck, even I can't sometimes these days. Not to mention, those "normal people" don't own a 4K monitor. 1080p is still the most popular resolution, and with reduced settings, even a 6600 XT runs it just fine without upscaling.

Cyberpunk 2077 (GOG version) + Cyber Engine Tweaks + Vsync enabled = I'm doomed to see a crash, 100% of the time. This is because some AMD GPUs are built in such a way that the conflict between ImGui and AMD drivers becomes irresolvable, and it ultimately falls flat on its face. Issues also quite often arise from RAM being less than 100% stable, because AMD drivers store data in RAM rather than on an SSD/HDD (nVidia's way). Vsync disabled = massive screen tearing, OR limiting myself to 55 FPS (yes, 56 to 60 still tear like hell), OR running 130+ FPS (impossible, because the RX 6700 XT + i5-12400 combo is too weak to even think about it). My monitor doesn't support FreeSync; all it has is Adaptive Sync, which is significantly worse.
So you're using (supposedly third-party) engine tweaks and unstable RAM, and expect the graphics driver to handle it flawlessly? :kookoo:

Adaptive sync is not Vsync. It only discards frames above your monitor's refresh rate to avoid tearing. It doesn't do Jack sh... below monitor refresh.
 
Joined
Jul 31, 2024
Messages
444 (3.08/day)
I hope you are aware that I voted several times on the VRAM question.

For this question, I'm already allowed to vote several times again.

Different devices and/or different browsers allow multiple votes. And even this same browser seems to be giving me the chance to vote yet again.
 

64K

:kookoo: Yes, because it makes sense to use expensive die area for RT but not to add comparatively cheap VRAM? You do realize they could do both. Heck, they could even do multiple SKUs: one for people who care about the extra VRAM and one for those who don't.

If Nvidia designed a simpler GPU to shave a few dollars off the cost of having the GPU manufactured, then they would be shooting themselves in the foot. You still don't understand why AMD is designing a more expensive GPU to manufacture for the next generation either. The answer is that they know that RT is the future as well, and they are moving towards it just like Intel. All of the key players are moving in that direction. Shouldn't that tell you something?

Why design a cheaper inferior GPU to manufacture and set the specs for more VRAM than the end product will need anyway? That doesn't make any sense.

As far as VRAM requirements, you can't accept that 8 GB is fine for entry level at 1080p and 12 GB is fine for midrange at 1440p, and when I show facts that it is, you counter with the 'need moar VRAM' opinion no matter what the facts show. Are the previous amounts I listed enough for every last game out there without lowering some settings? No. But businesses don't make a product more expensive to cover outliers.
 
Joined
Feb 24, 2023
couldn't spot the difference between DLSS and FSR
Nah, you need to be certified blind for that to happen (provided both DLSS and FSR work as intended). However, telling DLSS and native apart is much harder these days.
So you're using (supposedly third-party) engine tweaks and unstable RAM, and expect the graphics driver to handle it flawlessly?
Engine tweaks per se don't cause any of that. It's their GUI that conflicts with AMD drivers, through the latter's fault. Some AMD GPUs can handle it, some can't; I lost the lottery. And my RAM is OK; I'm just saying it's not the brightest idea to keep VGA driver data in RAM.
 
Joined
Jul 13, 2016
If Nvidia designed a simpler GPU to shave a few dollars off the cost of having the GPU manufactured, then they would be shooting themselves in the foot. You still don't understand why AMD is designing a more expensive GPU to manufacture for the next generation either. The answer is that they know that RT is the future as well, and they are moving towards it just like Intel. All of the key players are moving in that direction. Shouldn't that tell you something?

What are you even rambling about? No one said anything about Nvidia designing a simpler GPU.

set the specs for more VRAM than the end product will need anyway? That doesn't make any sense.

Purely your opinion. Just because you wouldn't use / want the VRAM, doesn't mean others wouldn't.

As far as VRAM requirements, you can't accept that 8 GB is fine for entry level at 1080p and 12 GB is fine for midrange at 1440p, and when I show facts

You mean cherry-picking a single game? :roll:

No. But businesses don't make a product more expensive to cover outliers.

You don't seem to realize that GPUs are designed to service a series of niches that are either still outliers or were outliers at their inception. NVIDIA became big precisely by servicing outlier niches.

AI, upscaling, multi-monitor, tessellation, streaming, etc. These were all outlier use cases to begin with, but Nvidia decided to invest resources into them despite that. The appeal of some of these features grew, while others (like streaming) will likely always account for a small share of users. VR still isn't big, but Nvidia's VR support is good.

GPUs are fairly general-purpose accelerator cards today, and a ton of money is spent to service outlier niches that are an important part of the market for the company or that may grow in the future. Clearly Nvidia should continue to spend money servicing niches, because it's precisely that broad support that attracts people to their products.
 
Last edited:

64K

What are you even rambling about? No one said anything about Nvidia designing a simpler GPU.

You did. You changed your original argument that Nvidia is trying to save 30% by using less VRAM to fatten their wallet. When I explained to you that the AIBs buy the VRAM to manufacture cards, and that Nvidia therefore doesn't save anything by setting the specs to lower VRAM, instead of admitting your error you switched to claiming Nvidia could design an inferior, cheaper GPU to allow AIBs to put gobs more VRAM on entry-level cards that don't need it, driving the cost up when price is a critical concern for entry-level buyers.

Also, thank you for making the argument that RT is the future. It being an outlier right now, with only around 500 games using it and more being added regularly, doesn't mean that it isn't the future, as you clearly pointed out with your other outliers argument.

I hope you are aware that I voted several times on the VRAM question.

Why would you deliberately bork the poll? What did you think you would gain by doing that anyway?
 
Joined
Jul 13, 2016
You did. You changed your original argument that Nvidia is trying to save 30% by using less VRAM to fatten their wallet.

An obvious lie, I said $30, not 30%.

When I explained to you that the AIBs buy the VRAM to manufacture cards, and that Nvidia therefore doesn't save anything by setting the specs to lower VRAM, instead of admitting your error you switched to claiming Nvidia could design an inferior, cheaper GPU to allow AIBs to put gobs more VRAM on entry-level cards that don't need it, driving the cost up when price is a critical concern for entry-level buyers.

You mean when you misread what I said, just like you did above? First, let's make this clear: I never said Nvidia buys the VRAM. This is what I said:

You are grouping all devs into a single box and using that as an excuse to say they haven't earned higher VRAM cards. It doesn't make any sense when you have a very literal example of a company, Nvidia, that is not giving us that VRAM to save $30 USD to fatten their 78% margins.

And as I explained in the following post:

No duh, AIB partners are the ones that add the physical memory to the cards, but it's Nvidia that decides how much memory can be paired with the GPU in the first place. What I said is 100% correct: Nvidia is the one deciding how much VRAM you get, not board partners.

Your whole argument hinged on your own ignorance of how the market works.

Instead of admitting your error you switched to claiming Nvidia could design an inferior, cheaper GPU to allow AIBs to put gobs more VRAM on entry-level cards that don't need it, driving the cost up when price is a critical concern for entry-level buyers.

I 100% never said that. That's your own fantasy as I pointed out already.

Who are you trying to gaslight by continuously doubling down on your own misrepresentations of what was actually said? Yourself?

Also, thank you for making the argument that RT is the future. It being an outlier right now, with only around 500 games using it and more being added regularly, doesn't mean that it isn't the future, as you clearly pointed out with your other outliers argument.

You definitely misunderstood the argument: there's nothing in there that says an outlier must become mainstream, only that outliers should be covered. That's some grade-A logical pretzel twisting you are doing. I should state, I have no stake in whether RT is the future or not. It wasn't the topic of the discussion, and it doesn't matter to me whether it becomes the norm or not.

You also don't seem to realize that if you agree with my statement (or, in this case, your own misinterpretation of my statement), that also means you agree that VRAM sizes should be bigger. After all, the entire point of that argument was to point out the need to cover outliers as it relates to VRAM allowances on video cards.
 

64K

@evernessince

You admit that you said, "Nvidia, that is not giving us that VRAM to save $30 USD to fatten their 78% margins."

That is why I corrected your error. Nvidia doesn't save $30 on the VRAM, and it doesn't fatten their margins; the AIBs save the $30 and fatten their margins. Anyway, I am fine with you believing that the big bully Nvidia is wrong in everything they do, but I just didn't want anyone not tech-informed to land on this thread from some Google search and, after reading your comment, believe it was a fact. An unchallenged myth becomes as good as a fact after a while.

As far as our other disagreement about 8 GB of VRAM not being enough for an entry-level card, I could post numerous examples of games today where it is enough, where the buyer's goal is nice-quality graphics at 1080p but not ultra everything, but it wouldn't change your mind a bit, and really it's not my intent to change your mind. As far as the vast majority of buyers are concerned, they already know that 8 GB of VRAM is fine for entry level, and that would mostly explain why Nvidia dominates the consumer market. I hope you did see the word "mostly" before you go off on a tangent about Nvidia mindshare, because I've already said that's part of the problem for AMD and Intel, but it's not most of the problem. Most of AMD's fight right now is that they need to considerably improve RT performance, which they have said they are doing in the next generation, improve FSR, and stay on top of their drivers.
 
Joined
Jan 14, 2019
Nah, you need to be certified blind for that to happen (provided both DLSS and FSR work as intended). However, telling DLSS and native apart is much harder these days.
The last time I saw DLSS on my own screen in a live game was version 2.something, so without knowing better, I'll believe you. I'm just saying that someone on a budget playing at 1080p won't need upscaling for decent frame rates (it's shit at 1080p anyway).

Engine tweaks per se don't cause any of that. It's their GUI that conflicts with AMD drivers, through the latter's fault. Some AMD GPUs can handle it, some can't; I lost the lottery. And my RAM is OK; I'm just saying it's not the brightest idea to keep VGA driver data in RAM.
I still think it's a combination of things rather than just the VGA driver acting up.
 
Joined
Feb 24, 2023
The last time I saw DLSS on my own screen in a live game was version 2.something, so without knowing better, I'll believe you.
I'm also speaking from slightly dated experience; that was the latest version... as of Dec '23. However, on my 27" 4K display, I failed to see why I should use native instead of DLSS P. Image quality was just a smidge worse, but I had like 90 percent more FPS.
 