
Spider-Man 2 Performance Benchmark

Joined
Apr 14, 2022
Messages
788 (0.77/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
The 12GB and the 16GB cards (4070, 4080, etc.) will not save you, because the cards will run out of juice first (see Indiana Jones PT) and then out of VRAM.
I don't see the point of having a truckload of VRAM when the actual arch does not let you run demanding RT or PT.
And I repeat: more and more games will require RT acceleration with no option to disable it.

I believe the 5080 will be quite a bit faster than the 4080 once most of the neural techniques are utilised.
 
Joined
Jun 2, 2017
Messages
9,707 (3.46/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64-bit, Steam, GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
The 12GB and the 16GB cards (4070, 4080, etc.) will not save you, because the cards will run out of juice first (see Indiana Jones PT) and then out of VRAM.
I don't see the point of having a truckload of VRAM when the actual arch does not let you run demanding RT or PT.
And I repeat: more and more games will require RT acceleration with no option to disable it.

I believe the 5080 will be quite a bit faster than the 4080 once most of the neural techniques are utilised.
So you think developers are jumping on the RT narrative to try out what they told us these cards could do? Did you read any of the Reddit responses to neural rendering on the day Jensen announced it? AAA studios have always supported the newest tech. For example, Jedi Survivor supports multi-GPU. Too bad there is no driver for that anymore. Just like how The Witcher got HairWorks, CD Projekt Red has long been supported by Nvidia. When it comes to RT and upscaling, the industry always goes for what makes the most financial sense. G-Sync is Betamax and FreeSync is VHS: open is always preferred. Just look at how the PS5 boosted NVMe speeds, and now we can get 4.0 drives that do up to 6,500 MB/s sequential. It has made 5.0 drives redundant.
 
Joined
May 13, 2008
Messages
796 (0.13/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Neither can the much more expensive 7900 XTX with twice the VRAM. So what? Why aren't you complaining about the 7900 XTX not keeping 60 fps in that exact same scenario?
Dude, there have been many times a 7900 XTX has been cheaper than, if not similarly priced to, a 4070 Ti. No, it's not the norm, but it's not atypical either. That is when people should buy/have bought.

Also, if you buy a 7900 XTX you should overclock it. I understand if you don't on NVIDIA; because of where they place them it doesn't make any sense.
On AMD it does. Case in point: this scenario and 60 fps (and/or ~1080p RT upscaled to a higher res). That is why it is what it is and has the RAM it does.
It will make more sense when the 9070 series comes out. I would imagine the goal is 1440p native and 1080p RT for the 9070. Probably the same for the XT, with the XTX for upscaling from 1080p RT (like the 7900 XTX but with less raster).

I appreciate you, but I have to confront people like you because the inaccurate mindset of so many people is just plain fucking wrong, and it enables horrible pricing not just from NVIDIA, but theoretically from AMD as well.

People can console/brand war all they want. I'm here to show you the limitations and why NVIDIA is fleecing you. You are very welcome to continue being fleeced.
 
Joined
Jun 14, 2020
Messages
4,271 (2.52/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
Dude, there have been many times a 7900 XTX has been cheaper than, if not similarly priced to, a 4070 Ti. No, it's not the norm, but it's not atypical either. That is when people should buy/have bought.

Also, if you buy a 7900 XTX you should overclock it. I understand if you don't on NVIDIA; because of where they place them it doesn't make any sense.
On AMD it does. Case in point: this scenario and 60 fps (and/or ~1080p RT upscaled to a higher res). That is why it is what it is and has the RAM it does.
It will make more sense when the 9070 series comes out. I would imagine the goal is 1440p native and 1080p RT for the 9070. Probably the same for the XT, with the XTX for upscaling from 1080p RT (like the 7900 XTX but with less raster).

I appreciate you, but I have to confront people like you because the inaccurate mindset of so many people is just plain fucking wrong, and it enables horrible pricing not just from NVIDIA, but theoretically from AMD as well.

People can console/brand war all they want. I'm here to show you the limitations and why NVIDIA is fleecing you. You are very welcome to continue being fleeced.
Yep, you picked a great example of Nvidia fleecing when the much cheaper Nvidia card is matching the AMD flagship, the super expensive XTX. :roll:
 
Joined
May 13, 2008
Messages
796 (0.13/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
The 12GB and the 16GB cards (4070, 4080, etc.) will not save you, because the cards will run out of juice first (see Indiana Jones PT) and then out of VRAM.
I don't see the point of having a truckload of VRAM when the actual arch does not let you run demanding RT or PT.
And I repeat: more and more games will require RT acceleration with no option to disable it.

I believe the 5080 will be quite a bit faster than the 4080 once most of the neural techniques are utilised.

The 4080 is a pretty good 1440p card (or 1080p->1440p, sometimes 1080p->4K, upscaled RT). Well matched (but overpriced). It does run out of juice at 4K, the same way the 5080 runs out of RAM.
The benefit of 24GB of RAM shows up exactly at 4K non-RT. I'd love to see an overclocked 7900 XTX versus an OC 5080... no doubt the 7900 XTX would win and be the better 4K experience.
The 5080 16GB is a sham. It... it just is. I don't know what to tell you. The only reason it looks good at 1440p non-RT is because there isn't any competition.
That doesn't mean it's a well-matched or good card. It truly isn't.
It's not a 4K card. The 4090 is a 4K card, the 7900 XTX is a 4K card. The 4090 is a 1440p->4K RT upscaling card; the 7900 XTX is a 1080p->4K RT upscaling card. Both can run 1440p native RT.

Yep, you picked a great example of Nvidia fleecing when the much cheaper Nvidia card is matching the AMD flagship, the super expensive XTX. :roll:

You're not the audience. You've made up your mind to believe what you want, and that's your prerogative. I'm aiming at people with an open mind and forward thinking.
Let's revisit this discussion when the 9070 XTX (not XT) launches, and then compare 1080p->1440p/4K RT upscaling perf vs the 4070 Ti/5070. I hope you will then understand what AMD is doing.
 
Last edited:
Joined
Jun 14, 2020
Messages
4,271 (2.52/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
The AMD paranoia. The XTX can't even keep up with the 4080 in new games anymore, but now it's faster than the 5080 :banghead: :banghead:

You're not the audience. You've made up your mind to believe what you want, and that's your prerogative. I'm aiming at people with an open mind and forward thinking.
Let's revisit this discussion when the 9070 XTX (not XT) launches, and then compare 1080p->1440p/4K RT upscaling perf vs the 4070 Ti/5070. I hope you will then understand what AMD is doing.
Believe what I want? I'm using the data YOU provided. You linked the graph that clearly shows the cheaper Nvidia card matching the much more expensive AMD card with twice the VRAM. Clearly AMD fleeced us last gen. According to your graph the 5080 is 38% faster than the XTX, but I'm the one who believes what he wants, lol. Delusional.

EG1. The margins grow even larger in the 4K graph you posted. Clearly the 5080 runs out of VRAM :roll:
 
Last edited:
Joined
May 13, 2008
Messages
796 (0.13/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Guy, you don't buy a
The AMD paranoia. The XTX can't even keep up with the 4080 in new games anymore, but now it's faster than the 5080 :banghead: :banghead:


Believe what I want? I'm using the data YOU provided. You linked the graph that clearly shows the cheaper Nvidia card matching the much more expensive AMD card with twice the VRAM. Clearly AMD fleeced us last gen. According to your graph the 5080 is 38% faster than the XTX, but I'm the one who believes what he wants, lol. Delusional.
I give up. You just don't get it. Hopefully it makes sense to other people. Have fun with your games however you enjoy them, guy. :)

The 7900 XTX for 4K non-RT, without a doubt. The 5080 for 4K non-RT, maybe? I'd have to see the dips/RAM usage on an OC. The point is that RAM is the limitation there. The future holds up better for the 7900 XTX in raster at 4K.
Both will run native 1440p RT. Neither higher.
The 4070 Ti can't do RT upscaled from 1080p to 1440p; the 7900 XTX can. The 7800 XT does 1080p native, same as the 4070 Ti.
My point is: if it runs something better but still can't run it well, is that a win? I say no. If it can run raster better at a higher playable resolution (and often much cheaper), is that a win? I say yes.

Again, I use 60 fps as a baseline. You may not. That's fine if you don't. The industry does. You clearly either just don't understand, or are being defensive. I'm making a point about target resolutions/playability.
They are different aims, no doubt. It will be somewhat similar with the 5070 (which will be clocked through the damn ceiling to compete against the 9070 in 1080p RT, with some games upscaled higher).
That does not change the fact that 12GB just isn't enough RAM for 1440p RT (and soon probably many cases of 1080p upscaled to other resolutions), and it doesn't excuse its inability to run 1440p/4K because of low RAM either!

Like I said, it's a conversation that will make more sense as things evolve. This is just the beginning.
You'll understand better when AMD targets RT resolutions and not just raster. It's not so much that the arch changed, but people's perception, and hence the target performance aim.
The 9070 onwards will target RT (upscaled from 1080p, or 1080p native), whereas former products target pure raster resolutions (1080p/1440p for the 7600/7800), like the 7900 XT is 1440p->4K and the 7900 XTX is 4K in raster.
Because of this, the 7900 XTX does not target 1440p RT (or raster upscaled to 1440p) at stock, nor does the 7800 XT target 1080p RT or RT upscaled from 1080p.
The 9070 series will target 1080p RT / RT upscaled from 1080p (to 1440p/4K).

Make sense? The absolute performance of a lot of the 7000 series is fine, but they are certainly shifting the stock perf aim to make people like you happy. Like I said, it won't completely shake out until the 3nm cards.

And then 12GB will DEFINITELY suck. Enjoy your 1080p without the ability to upscale to ANYTHING with decent performance (certainly not in RT). :)

I know. I know...You can turn down settings. Kill the shadows. I know. But that's not the point.

People want to play with maximum settings (I would argue not RT until next gen, at 1080p or upscaled from 1080p to 1440p/4K), and that is how cards are targeted.

The 9070 will start that reality; the 4070 Ti/5070 will not continue it. The next gen will be squarely aimed at that across the board, and at scaling from a higher res (1440p->4K) outside of something like a 4090.
The 192-bit cards will replace the 9070/5080 and do it better... and those will be the aim because of similar perf to a PS6.
Some games will be limited by 16GB, some not. That's my point. The 9070 is fine, but 192-bit 3nm will be better for longevity.

45-90 TF is a nice scale. Under that, 12GB is fine but soon to be outdated (as it'll be below console capability). This includes ALL the "RT" cards you've seen up until this standard.

45 TF starts needing 16GB, which is good up to around 60 TF. The 7800 XT is pretty much limited to 45 TF for this reason (to make the 9070 look good for up to 60 TF), and so are the 4070 Ti 12GB and the 5070 (<45 TF).

90 TF is around the absolute performance of 24GB. This is a 4090. It is a good design because I don't think most people are going to play above 1440p upscaled and/or use >24GB (needing 32GB) most of the time.
256-bit 3nm cards will be targeted at this spec using 24GB GDDR7.
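
To put the tiering I'm describing into something concrete, here's a rough sketch (the TF thresholds are just my own rule of thumb from above, not anything official):

```python
# Rough sketch of my VRAM-per-TFLOP rule of thumb.
# The 45/60/90 TF thresholds are my own estimates, not official figures.
def recommended_vram_gb(tflops: float) -> int:
    if tflops < 45:
        return 12   # fine for now, but soon below console capability
    elif tflops < 60:
        return 16   # 9070-class / 5080-class territory
    elif tflops <= 90:
        return 24   # 4090-class / 256-bit 3nm GDDR7 territory
    else:
        return 32   # beyond what most people will use this gen

for tf in (40, 50, 65, 90):
    print(f"{tf} TF -> {recommended_vram_gb(tf)} GB")
```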

The PS6 will probably be ~60 TF. So, IOW, the PS6 will use a minimum of 16GB and conceivably more (the absolute perf of the 7900 XT is 60 TF and it has 20GB), while likely targeting ~1080p->4K RT worst-case.
Conceivably higher-resolution with/without RT.
192-bit cards will target this using 18GB GDDR7.
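
The bus-width math behind those capacities is simple, by the way: one 32-bit channel per memory chip. A quick sketch (assuming 3GB GDDR7 modules for the future parts, which is my assumption, and 2GB modules for today's 16GB cards):

```python
# VRAM capacity from bus width: one 32-bit channel per memory chip.
# 3 GB (24 Gbit) GDDR7 modules are assumed for the future parts.
def vram_capacity_gb(bus_width_bits: int, module_gb: int) -> int:
    chips = bus_width_bits // 32
    return chips * module_gb

print(vram_capacity_gb(192, 3))  # 6 chips x 3 GB = 18 GB
print(vram_capacity_gb(256, 3))  # 8 chips x 3 GB = 24 GB
print(vram_capacity_gb(256, 2))  # 8 chips x 2 GB = 16 GB (a 5080 today)
```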

I don't think it's that difficult to understand. You can literally look at every card and see what I'm talking about.
Also, again, this is why the 5080 16GB is a travesty: they clearly don't want it to be a 4K native card. It's clocked at 2640 MHz at stock so it doesn't exceed 60 TF and have its RAM limitations be clearly seen.
This is literally clear as day. Maybe it's not clear to some. I don't know. I'm a nerd. It's the truth though (that it's clear...also that I'm a nerd).
This is why the 9070 XTX, which will likely reach CLOSE to 60 TF absolute performance (meaning 8192 SPs at a high-as-hell clock speed), is a smart/cheap design: high-clocked, 256-bit, 16GB GDDR6.
The 5080 is so badly designed as a stop-gap to the card you want (a 4090) that it's ridiculous how blatant it is.
The fact that they don't clock it over 60 TF with 16GB, or clock it higher with 24GB, is telling (that's a Super, which will also be trash because it still won't match 4090 raster while the 3nm cards will).
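
For reference, the TF napkin math I'm using is just the standard FP32 formula (2 FLOPs per shader per clock). The 5080's 10752 shaders are the published spec; the 8192 SP / ~3.6 GHz figures for a 9070 XTX are my speculation, not confirmed:

```python
# Standard FP32 throughput estimate: 2 FLOPs per shader per clock.
def fp32_tflops(shaders: int, clock_mhz: float) -> float:
    return 2 * shaders * clock_mhz / 1_000_000

print(round(fp32_tflops(10752, 2640), 1))  # 5080 at the 2640 MHz stock clock above: ~56.8 TF, under 60
print(round(fp32_tflops(8192, 3600), 1))   # speculative 8192 SP part at ~3.6 GHz: ~59 TF
```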

I need to teach courses on this shit or something. I'd give you extra homework. :p
 
Last edited: