
Spider-Man 2 Performance Benchmark

Joined
Apr 14, 2022
Messages
789 (0.76/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600MHz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
The 12GB and the 16GB cards (4070, 4080, etc.) will not save you, because the cards will run out of juice first (see Indiana Jones PT) and then out of VRAM.
I don't see the point of having a truckload of VRAM when the actual architecture doesn't let you run demanding RT or PT.
And I repeat: more and more games will require RT acceleration with no option to disable it.

I believe the 5080 will be quite a bit faster than the 4080 once most of the neural techniques are utilised.
 
Joined
Jun 2, 2017
Messages
9,730 (3.46/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64, Steam, GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
The 12GB and the 16GB cards (4070, 4080, etc.) will not save you, because the cards will run out of juice first (see Indiana Jones PT) and then out of VRAM.
I don't see the point of having a truckload of VRAM when the actual architecture doesn't let you run demanding RT or PT.
And I repeat: more and more games will require RT acceleration with no option to disable it.

I believe the 5080 will be quite a bit faster than the 4080 once most of the neural techniques are utilised.
So you think developers are jumping at the RT narrative to try what they told us these cards could do. Did you read any Reddit responses to neural rendering on the day Jensen announced it? AAA studios have always supported the newest tech. For example, Jedi Survivor supports multi-GPU; too bad there is no driver for that anymore. Just like how The Witcher has HairWorks, CD Projekt Red has long been supported by Nvidia. When it comes to RT and upscaling, the industry always goes for what makes the most financial sense. So just like how G-Sync is Betamax, FreeSync is VHS: open is always preferred. Just look at how the PS5 boosted NVMe speeds, and now we can get 4.0 drives that do up to 6,500 MB/s sequential. It has made 5.0 drives redundant.
 
Joined
May 13, 2008
Messages
822 (0.13/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Neither can the much more expensive 7900XTX with twice the VRAM. So what? Why aren't you complaining about the 7900XTX not keeping 60 fps in that exact same scenario?
Dude, there have been many times a 7900XTX has been cheaper than, if not similarly priced to, a 4070 Ti. No, it's not the norm, but it's not atypical either. That is when people should buy/have bought.

Also, if you buy a 7900XTX you should overclock it. I understand if you don't on nVIDIA, bc they position them so it doesn't make any sense.
On AMD it does. Case in point: this scenario and 60fps (and/or upscaled ~1080p RT at higher rez). That is why it is what it is, and why it has the RAM it does.
It will make more sense when the 9070 series comes out. I would imagine the goal is 1440p native, 1080p RT for the 9070; probably the same for the XT, with the XTX upscaling from 1080p RT (like the 7900XTX, but with less raster).

I appreciate you, but I have to confront people like you bc the inaccurate mindset of so many people is just plain fucking wrong, and it allows horrible pricing not just from nVIDIA but, theoretically, AMD as well.

People can console/brand war all they want. I'm here to show you the limitations and why nVIDIA is fleecing you. You are very welcome to continue to be fleeced.
 
Joined
Jun 14, 2020
Messages
4,297 (2.52/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
Dude, there have been many times a 7900XTX has been cheaper than, if not similarly priced to, a 4070 Ti. No, it's not the norm, but it's not atypical either. That is when people should buy/have bought.

Also, if you buy a 7900XTX you should overclock it. I understand if you don't on nVIDIA, bc they position them so it doesn't make any sense.
On AMD it does. Case in point: this scenario and 60fps (and/or upscaled ~1080p RT at higher rez). That is why it is what it is, and why it has the RAM it does.
It will make more sense when the 9070 series comes out. I would imagine the goal is 1440p native, 1080p RT for the 9070; probably the same for the XT, with the XTX upscaling from 1080p RT (like the 7900XTX, but with less raster).

I appreciate you, but I have to confront people like you bc the inaccurate mindset of so many people is just plain fucking wrong, and it allows horrible pricing not just from nVIDIA but, theoretically, AMD as well.

People can console/brand war all they want. I'm here to show you the limitations and why nVIDIA is fleecing you. You are very welcome to continue to be fleeced.
Yeah, you picked a great example of Nvidia fleecing people: the much cheaper Nvidia card is matching the AMD flagship, the super-expensive XTX. :roll:
 
Joined
May 13, 2008
Messages
822 (0.13/day)
The 12GB and the 16GB cards (4070, 4080, etc.) will not save you, because the cards will run out of juice first (see Indiana Jones PT) and then out of VRAM.
I don't see the point of having a truckload of VRAM when the actual architecture doesn't let you run demanding RT or PT.
And I repeat: more and more games will require RT acceleration with no option to disable it.

I believe the 5080 will be quite a bit faster than the 4080 once most of the neural techniques are utilised.

The 4080 is a pretty good 1440p card (or 1080p→1440p, sometimes 1080p→4K upscaled RT). Well matched (but overpriced). It does run out of juice at 4K, the same way the 5080 runs out of RAM.
The 24GB of RAM shows exactly at 4K non-RT. I'd love to see an overclocked 7900XTX versus an OC 5080; no doubt the 7900XTX would win and be the better 4K experience.
The 5080's 16GB is a sham. It... it just is. I don't know what to tell you. The only reason it looks good at 1440p non-RT is bc there isn't any competition.
That doesn't mean it's a well-matched or good card. It truly isn't.
It's not a 4K card. The 4090 is a 4K card; the 7900XTX is a 4K card. The 4090 is a 1440p→4K RT upscaling card; the 7900XTX is a 1080p→4K RT upscaling card. Both can run 1440p native RT.

Yeap, you picked a great example of nvidia fleecing when the much cheaper nvidia card is matching the amd flagship and super expensive XTX. :roll:

You're not the audience. You've made up your mind to believe what you want, and that's your prerogative. I aim at people with an open mind who are forward-thinking.
Let's shelve this discussion until the 9070XTX (not XT) launches, and then compare 1080p→1440p/4K RT upscaling perf vs the 4070 Ti/5070. I hope then you will understand what AMD is doing.
 
Joined
Jun 14, 2020
Messages
4,297 (2.52/day)
The AMD paranoia. The XTX can't even keep up with the 4080 in new games anymore, but now it's faster than the 5080 :banghead: :banghead:

You're not the audience. You've made up your mind to believe what you want, and that's your prerogative. I aim at people with an open mind who are forward-thinking.
Let's shelve this discussion until the 9070XTX (not XT) launches, and then compare 1080p→1440p/4K RT upscaling perf vs the 4070 Ti/5070. I hope then you will understand what AMD is doing.
Believe what I want? I'm using the data YOU provided. You linked a graph that clearly shows the cheaper Nvidia card matching the much more expensive AMD card with twice the VRAM. Clearly AMD fleeced us last gen. According to your graph the 5080 is 38% faster than the XTX, but I'm the one that believes what he wants, lol. Delusional.

EG1. The margins grow even larger in the 4K graph you posted. Clearly the 5080 runs out of VRAM :roll:
 
Joined
May 13, 2008
Messages
822 (0.13/day)
Guy, you don't buy a
The AMD paranoia. The XTX can't even keep up with the 4080 in new games anymore, but now it's faster than the 5080 :banghead: :banghead:


Believe what I want? I'm using the data YOU provided. You linked a graph that clearly shows the cheaper Nvidia card matching the much more expensive AMD card with twice the VRAM. Clearly AMD fleeced us last gen. According to your graph the 5080 is 38% faster than the XTX, but I'm the one that believes what he wants, lol. Delusional.
I give up. You just don't get it. Hopefully it makes sense to other people. Have fun with your games however you enjoy them, guy. :)

The 7900XTX for 4K non-RT, without a doubt. The 5080 for 4K non-RT, maybe? I'd have to see the dips/RAM usage on an OC. The point is that RAM is the limitation there. The future holds up better for the 7900XTX in raster at 4K.
Both will run native 1440p RT. Neither higher.
The 4070 Ti can't upscale from 1080p to 1440p; the 7900XTX can. The 7800XT is a 1080p-native card, same as the 4070 Ti.
My point is: if a card runs something better but still can't run it well, is that a win? I say no. If it can run raster better at a higher playable resolution (and often much cheaper), is that a win? I say yes.

Again, I use 60fps as a baseline. You may not; that's fine if you don't, but the industry does. You clearly either don't understand or are being defensive. I'm making a point about target resolutions/playability.
They are different aims, no doubt. That will be somewhat similar with the 5070 (which will be clocked through the damn ceiling to compete against the 9070 in 1080p RT, some games upscaled higher).
That does not change the fact that 12GB just isn't enough RAM for 1440p RT (and soon probably many cases of 1080p upscaled to other resolutions), and it doesn't excuse its inability to run 1440p/4K because of low RAM either!

Like I said, it's a conversation that will make more sense as things evolve. This is just the beginning.
You'll understand better when AMD targets RT resolutions and not just raster. It's not so much that the arch changed, but people's perception, and hence the target performance aim.
The 9070 onwards will target RT (upscaled from 1080p, or 1080p native), whereas former products target pure raster resolution (1080p/1440p for the 7600/7800), like the 7900XT is 1440p→4K and the 7900XTX is 4K in raster.
Because of this, the 7900XTX does not target 1440p RT (or raster upscaled to 1440p) at stock, nor does the 7800XT target 1080p RT or RT upscaled from 1080p raster.
The 9070 series will target 1080p RT, native or upscaled (to 1440p/4K).

Make sense? The absolute performance of a lot of the 7000 series is fine, but they are certainly shifting the stock performance aim to make people like you happy. Like I said, it won't completely shake out until the 3nm cards.

And then 12GB will DEFINITELY suck. Enjoy your 1080p without the ability to upscale to ANYTHING with decent performance (certainly not in RT). :)

I know. I know... You can turn down settings. Kill the shadows. I know. But that's not the point.

People want to play with maximum settings (I would argue not RT until next gen, at 1080p or upscaled from 1080p to 1440p/4K), and that is how cards are aimed.

The 9070 will start that reality; the 4070 Ti/5070 will not continue it. The next gen will be squarely aimed at that across the board, and at scaling from higher rez (1440p→4K) outside of something like a 4090.
The 192-bit cards will replace the 9070/5080 and do it better... and those will be the aim bc of similar perf to a PS6.
Some games will be limited by 16GB, some not. That's my point. The 9070 is fine, but 192-bit 3nm is better for longevity.

45-90TF is a nice scale. Under that, 12GB is fine but soon to be outdated (as it'll be below console capability). This includes ALLLL the "RT" cards you've seen up until this standard.

45TF starts needing 16GB, which is good up to around 60TF. The 7800XT is pretty much limited to 45TF for this reason (to make the 9070 look good for up to 60TF), and so are the 4070 Ti 12GB and 5070 (<45TF).

90TF is around the absolute performance of 24GB. This is a 4090. It is a good design bc I don't think most people are going to play above 1440p upscaled and/or use >24GB (needing 32GB) most of the time.
256-bit 3nm cards will be targeted at this spec using 24GB GDDR7.

The PS6 will probably be ~60TF. So, IOW, the PS6 will use a minimum of 16GB and conceivably more (the absolute perf of a 7900XT is 60TF, and it has 20GB), while likely targeting ~1080p→4K RT worst-case.
Conceivably higher resolution with/without RT.
192-bit cards will target this using 18GB GDDR7.

I don't think it's that difficult to understand? You can literally look at every card and see what I'm talking about.
Also, again, this is why the 5080 16GB is a travesty: they clearly don't want it to be a 4K native card. It's clocked at 2640MHz at stock so it doesn't exceed 60TF and have its RAM limitations be clearly seen.
This is literally clear as day. Maybe it's not clear to some. I don't know. I'm a nerd. It's the truth though (that it's clear... also that I'm a nerd).
This is why the 9070XTX, which will likely get CLOSE to 60TF absolute performance (meaning 8192sp at high-as-hell clock speeds), is a smart/cheap design: high-clocked 256-bit 16GB GDDR6.
The 5080 is so badly designed as a stop-gap to the card you want (a 4090) that it's ridiculous how blatant it is.
The fact they don't clock it over 60TF with 16GB, or clock it higher with 24GB, is telling (that's a Super, which will also be trash bc still not 4090 raster while 3nm cards will be).
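If it helps, the tiers above can be condensed into one rule-of-thumb function. This is just a sketch of my argument; the 45/60/90TF thresholds are my estimates, not official specs:

```python
# Rule-of-thumb mapping from absolute FP32 compute (TF) to the VRAM tier
# argued for above. Thresholds (45/60/90 TF) are estimates, not specs.
def suggested_vram_gb(tflops: float) -> int:
    if tflops < 45:
        return 12   # fine today, soon below console capability
    if tflops < 60:
        return 16   # e.g. 9070-class cards
    if tflops < 90:
        return 18   # 192-bit GDDR7 cards, PS6-class (~60 TF)
    return 24       # 4090-class / 256-bit 3nm GDDR7 cards

# A sub-45TF 5070-class, a ~55TF 9070-class, and a ~95TF next-gen halo part:
print(suggested_vram_gb(40), suggested_vram_gb(55), suggested_vram_gb(95))
```

Plug in any card's TF number and you get the tier I'm claiming it needs.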
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,459 (1.30/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
It has higher minimums than the more expensive 7900XT and is close enough to the super-expensive 24GB XTX. But it runs out of VRAM... okay.
There's lots of selective parts of this being discussed right now; I'ma add to that below... :laugh:
7 years since the introduction of ray tracing and we still only have reflections on glass, costing half the total fps... embarrassing.
Thanks Nvidia for this legalized scam; let's wait for the PS6.
It's a console-first developed game ported to PC. Happy to share some game titles with transformative RT if you're unaware of them?

-------------------------------------------------------

Looks like Nvidia gigabytes are worth more than AMD gigabytes
 
Joined
Oct 19, 2022
Messages
335 (0.40/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D (+PBO 5.4GHz)
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz (DP 2.1 UHBR20 80Gbps)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q w/ AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
Looks like Nvidia gigabytes are worth more than AMD gigabytes

To be fair, I have a 4090 (desktop) and a 6850M XT (laptop), and AMD has cleaner, sharper image quality compared to Nvidia. You can check online; there are plenty of forums and videos talking about that. Bang4BuckPC Gamer did several videos about it on his YT channel. I feel like Nvidia compresses things a lot more than AMD does, hence the lower memory usage.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,459 (1.30/day)
To be fair, I have a 4090 (desktop) and a 6850M XT (laptop), and AMD has cleaner, sharper image quality compared to Nvidia. You can check online; there are plenty of forums and videos talking about that. Bang4BuckPC Gamer did several videos about it on his YT channel. I feel like Nvidia compresses things a lot more than AMD does, hence the lower memory usage.
I've looked into that extensively myself and found it to be entirely down to settings, be it driver, control panel, monitor etc.
 
Joined
Oct 19, 2022
Messages
335 (0.40/day)
Location
Los Angeles, CA
I've looked into that extensively myself and found it to be entirely down to settings, be it driver, control panel, monitor etc.
The only visual option I change in NVCP settings is Anisotropic Filtering at 16x, but I can definitely tell that AMD has a sharper/cleaner picture, and sometimes better colors, than Nvidia by default. I have an LG 55" OLED C9 TV calibrated with Calman, and also an AORUS FO32U2P (32" 4K QD-OLED 240Hz) monitor with the sRGB calibrated profile, and I can definitely see the difference on both.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,459 (1.30/day)
The only visual option I change in NVCP settings is Anisotropic Filtering at 16x, but I can definitely tell that AMD has a sharper/cleaner picture, and sometimes better colors, than Nvidia by default. I have an LG 55" OLED C9 TV calibrated with Calman, and also an AORUS FO32U2P (32" 4K QD-OLED 240Hz) monitor with the sRGB calibrated profile, and I can definitely see the difference on both.
I can't tell you that you don't see what you see, but having done my own extensive testing, I see what I see. I find it's mostly down to default settings, and when I've configured things properly I see no sharpness/clarity advantage either way. Honestly, I thought this was debunked, but again, I can't see your setup through your eyes, so power to you, I suppose. As W1zzard says, both archs manage their memory differently, which more readily explains the difference.
 
Joined
Oct 19, 2022
Messages
335 (0.40/day)
Location
Los Angeles, CA
I can't tell you that you don't see what you see, but having done my own extensive testing, I see what I see. I find it's mostly down to default settings, and when I've configured things properly I see no sharpness/clarity advantage either way. Honestly, I thought this was debunked, but again, I can't see your setup through your eyes, so power to you, I suppose. As W1zzard says, both archs manage their memory differently, which more readily explains the difference.
Yeah, we're all different, so we might not see things the same way, which is fine. But there's definitely more compression on Nvidia's side, that's for sure.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,459 (1.30/day)
Yeah, we're all different, so we might not see things the same way, which is fine. But there's definitely more compression on Nvidia's side, that's for sure.
That's been somewhat evident to me through apples-to-apples VRAM allocation and usage testing too, where Nvidia cards often use less, but I cannot see any visible difference in quality because of it.
 
Joined
Oct 19, 2022
Messages
335 (0.40/day)
Location
Los Angeles, CA
That's been somewhat evident to me through apples-to-apples VRAM allocation and usage testing too, where Nvidia cards often use less, but I cannot see any visible difference in quality because of it.
Good for you then! I definitely see one, and it kinda bothers me, because I wish I could get Nvidia's GPU performance with AMD's picture quality lol. We definitely need more competition!! :D
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,459 (1.30/day)
Good for you then!
haha, good for me indeed, I guess; I have excellent vision and really looked into this, so... *shrug*

In any case, my post was meant as a cheeky jab/laugh in meme form, to lighten the mood from some of the typical despondency and negativity.
 
Joined
Jun 14, 2020
Messages
4,297 (2.52/day)
There's lots of selective parts of this being discussed right now; I'ma add to that below... :laugh:

It's a console-first developed game ported to PC. Happy to share some game titles with transformative RT if you're unaware of them?

-------------------------------------------------------

Looks like Nvidia gigabytes are worth more than AMD gigabytes
That has been true for a long time; you can see a lot of games where the 8GB AMD cards completely tank due to VRAM while the Nvidia ones work just fine. Basically, AMD doesn't give you more VRAM; they just need more VRAM to operate.

haha, good for me indeed, I guess; I have excellent vision and really looked into this, so... *shrug*

In any case, my post was meant as a cheeky jab/laugh in meme form, to lighten the mood from some of the typical despondency and negativity.
There is no difference. AMD has some stupid options in Adrenalin that completely destroy the picture, but I guess some people find it better looking. I know, I've tested extensively; after you install the drivers on an AMD GPU, the colors change completely. Settings → Display has three image-destroying options: Vari-Bright, color enhancement, and color temperature control. I'm looking at them right now.
 
Joined
May 13, 2008
Messages
822 (0.13/day)
I don't know if anyone cares, but I figure the limitation of 16GB is 60.618TF. That would be a 4080 at 3116MHz, a 4080S at 2960MHz, a 7900XT/5080 at 2819MHz, or a full N48 (8192sp) at exactly 3700MHz.
I mean, yeah, N48 would require like 27.2Gbps of RAM bandwidth and that ain't happening, but lots of times used as ~7511.1_/64 ROPs would mean needing only ~25000MHz, which is outside-chance possible!
So the limitation of 18GB is ~68.2TF, which is equal to a 3172MHz 5080 (or 7900XT, which is likely locked down [PL/RAM bus] so it won't do this... limited to ~2800MHz/60TF).
Totally weird how the typical overclock on a 5080 is 3154MHz. Except, again, not at all. That'll probably be a Super 24GB (which, again, doesn't need 24GB), or approximately the 18GB cards from both companies.

~68.2TF/9216sp = 3700MHz. I would be willing to bet that's nVIDIA's 3nm 'RU104' placement.
Likewise, 12288sp @ 3700MHz fits for 24GB. That's (probably) nVIDIA's lineup, unless nVIDIA pins 11264 exactly (yield/size/higher clocks?) and 96 ROPs.

I'd really like to see AMD shoot for 11520sp (6×1920sp) and ~3.87-4GHz/40000 stock.
They might go 12288sp as well for the extra engines (8?) for RT/whatever, but I think 6 is fine. Over 112 ROPs isn't needed; 96 is okay. AMD doesn't like odd ratios (e.g. 112 ROPs/7 engines/88 CUs, etc.); nVIDIA might.

The BEST 24GB setup would be 11264sp @ 4037MHz and would only require 96 ROPs. Will anyone do that? I dunno, bc of effing RT/whatever, and therefore using more engines than we really (optimally) need.
Only bc nVIDIA made the market this way on purpose, to push things outside FP32 compute (at too high a ratio), which is ridiculous, when otherwise the optimal setup would be obvious even with decent other stuff.
If either can do that (4037MHz/11264sp, even if overshot), I will probably buy it. Because I love when shit is 'optimal' and nothing is wasted or left without using its full potential (hence why I dislike nVIDIA).
FWIW, Apple's N3E chips (A-series) run 4050MHz. N5 was 2.93/~3.2GHz (which you see on current GPU products). N5P/N4P is 3460-3700MHz (5070/N48?). Apple is a pretty good indication of what to expect.
I expect the PS6 to be roughly 2/3 of this (done with high-density libraries for low power/clockspeed around threshold voltage). Could be a little lower in total TF, but ~11264sp @ 2600-2700MHz bc the MC is smart af.

With nVIDIA's current cache it requires 36Gbps RAM for 12288sp @ 3780MHz, so less for 3700MHz, but I would prefer higher clockspeed. Will nVIDIA let it (or the RAM) clock that high (4037MHz)? Dunno.
With AMD it would require ~40.816Gbps RAM using their current 256-bit L3 setup for my ideal config (11264sp/4037MHz). Hardly impossible with an OC on 40Gbps GDDR7.

That is my "numbers for nit nerds" of the GPU world. I know nobody asked... so you don't have to say it. :D

I didn't figure out the 192-bit parts' ratios, but if AMD uses <9216sp (say 7680/8192 or something) they would have to clock it (or make it capable of clocking) REAL high, with good RAM OCs... kinda like a 3nm N48.
So yeah, 9216sp makes sense. The weird part is that's optimally ~80 ROPs, so you know nVIDIA will do it, but AMD would probably do 64/96. That's why it's tough to nail down what they'll do specifically.

I apologize if some find this stuff boring (especially in a Spider-Man thread); I'm just putting it out there so you know what to expect and/or want, if you care. It's cool if you don't.
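For anyone who wants to check my arithmetic, it all comes from one formula: peak FP32 TFLOPS ≈ 2 (FMA) × shader count × clock. A minimal sketch, assuming the shader counts I'm using above (9728 for the 4080, 10240 for the 4080S, 10752 for the 7900XT/5080, and a rumored 8192sp N48):

```python
# Peak FP32 throughput: 2 FLOPs per shader per cycle (one FMA) * shaders * clock.
def fp32_tflops(shaders: int, clock_mhz: float) -> float:
    return 2 * shaders * clock_mhz / 1e6

# Each config below lands on the same ~60.62 TF "16GB limit" claimed above.
configs = {
    "4080 @ 3116 MHz": (9728, 3116),
    "4080S @ 2960 MHz": (10240, 2960),
    "7900XT/5080 @ 2819 MHz": (10752, 2819),
    "N48 8192sp @ 3700 MHz": (8192, 3700),
}
for name, (sp, mhz) in configs.items():
    print(f"{name}: {fp32_tflops(sp, mhz):.2f} TF")

# The ~68.2 TF "18GB limit": a 5080 at 3172 MHz, or 9216sp at 3700 MHz.
print(f"{fp32_tflops(10752, 3172):.2f} TF, {fp32_tflops(9216, 3700):.2f} TF")
```

All four configs print ≈60.62 TF, and the last line ≈68.2 TF, which is where the 3172MHz and 9216sp figures come from.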
 
Joined
Jul 9, 2021
Messages
102 (0.08/day)
GPU usage is 16,157 MB.
The RTX 4060 Ti has 16,384 MB.
Will 227 MB be enough for the Windows UI, background tasks, etc., plus an Edge browser in the background displaying plain text?
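The arithmetic behind the joke, for anyone reaching for a calculator (a trivial sketch; the usage figure is the one from the post, assumed to be in MiB):

```python
# Headroom left on a 16 GB RTX 4060 Ti if a game reports 16,157 MiB in use.
total_mib = 16384          # 16 GiB of VRAM, in MiB
used_mib = 16157           # usage figure quoted in the post
headroom_mib = total_mib - used_mib
print(headroom_mib)        # 227
```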
 
Joined
Oct 19, 2022
Messages
335 (0.40/day)
Location
Los Angeles, CA
haha, good for me indeed, I guess; I have excellent vision and really looked into this, so... *shrug*

In any case, my post was meant as a cheeky jab/laugh in meme form, to lighten the mood from some of the typical despondency and negativity.
Cool, except YOU are the one who doesn't see a difference! But I have tried on several 4K TVs and monitors (including ones professionally calibrated with Calman) and I do see a difference! Also, I have better than 20/20 vision, so yeah, thanks for the laugh :laugh:
I was not being negative; I was just saying that if you don't see a difference then that's great for you, but I do see one, and I'd prefer if I didn't. End of topic.

There is no difference. AMD has some stupid options in Adrenalin that completely destroy the picture, but I guess some people find it better looking. I know because I've tested extensively: after you install the drivers on an AMD GPU, the colors change completely. Under Settings ---> Display there are three image-destroying options: Vari-Bright, color enhancement, and color temperature control. I'm looking at them right now.
I disable all the vibrance and other visual settings etc. because I calibrate my screens with Calman, and there is a difference. You might just not see them and that's fine. It's not because you don't see it that others can't. Not everyone sees the same things anyway.
 
Joined
Sep 20, 2021
Messages
548 (0.44/day)
Processor Ryzen 7 9700x
Motherboard Asrock B650E PG Riptide WiFi
Cooling Underfloor CPU cooling
Memory 2x32GB 6200MT/s
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Kingston Fury Renegade 1TB, Seagate Exos 12TB
Display(s) MSI Optix MAG301RF 2560x1080@200Hz
Case Phanteks Enthoo Pro
Power Supply NZXT C850 850W Gold
Mouse Bloody W95 Max Naraka
There are lots of selective parts of this being discussed, right? I'ma add to that below... :laugh:

It's a Console first developed game ported to PC, happy to share some game titles with transformative RT if you're unaware of them?

-------------------------------------------------------

Looks like Nvidia gigabytes worth more than AMD gigabytes

View attachment 383133
This seems interesting, because the 7600 8GB has 476.9 GB/s of bandwidth while the 4060 8GB has 272 GB/s, and this is easy to verify/prove.
Yes, Nvidia has better memory compression, and that helps sometimes.
So the problem is somewhere else in the "meme" situation.

As an example, I can tell you that the 7900 XT/XTX (2900/3500 GB/s effective) load games on my PC faster than the RTX 4080 (736 GB/s), especially when the needed files are in the RAM drive.
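A rough sketch of the compression point: lossless delta color compression mainly stretches bandwidth rather than capacity, and even a generous assumed ratio doesn't close the gap between the raw figures quoted above. The 1.3x average ratio here is a made-up illustrative number, and the 476.9 GB/s figure is the one quoted in the thread:

```python
def effective_bandwidth(raw_gb_s: float, compression_ratio: float) -> float:
    """Bandwidth the GPU behaves as if it had, under lossless compression."""
    return raw_gb_s * compression_ratio

rtx_4060_raw = 272.0    # GB/s, raw GDDR6 bandwidth
rx_7600_quoted = 476.9  # GB/s, figure quoted in the thread

# With a hypothetical 1.3x average compression ratio:
print(round(effective_bandwidth(rtx_4060_raw, 1.3), 1))  # → 353.6
```

So compression narrows the gap but doesn't erase it, which is consistent with the observation that the bottleneck in the "meme" chart must lie somewhere other than raw bandwidth.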
 
Joined
May 6, 2019
Messages
97 (0.05/day)
Location
UK
Processor i5-10600
Motherboard MSI Z490-A Pro
Cooling Arctic Freezer 34 eSports DUO
Memory Kingston Fury Renegade 32GB DDR4-3600 CL16
Video Card(s) ASUS GeForce RTX 4060 Ti 16GB
Storage Samsung 980 Pro 1TB, 860 EVO 2TB
Display(s) iiyama G2730HSU-B1 G-Master 27"
Case Corsair 200R
Audio Device(s) Asus Xonar Essence STX
Power Supply Seasonic Focus GX 650W
Software Windows 10 Pro
For the first time there's such a big gap between the 4060 Ti 16GB and the 8GB version at 1080p (or I didn't notice it earlier).

The Last of Us Part 1 is similar. There are others and will be more as time goes on, despite many ridiculing the card on release for not being fast enough for the extra memory to make any difference. (Indiana Jones is another one IIRC.)

One card which does deserve ridicule and didn't deserve the hype is the B580. Nobody should be buying an Intel card, they are a poor alternative to nvidia or AMD.
 
Last edited:

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,459 (1.30/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Joined
Oct 19, 2022
Messages
335 (0.40/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D (+PBO 5.4GHz)
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz (DP 2.1 UHBR20 80Gbps)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q w/ AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
Looks like we're both laughing then :)
Yup! I guess it's time for you to make an appointment with CRISPR to get some DNA alteration for your eyes lol
 