
Spider-Man 2 Performance Benchmark

Yup! I guess it's time for you to make an appointment with CRISPR to get some DNA alteration for your eyes lol
I'm good. I'll look at anything you want to provide as evidence, though; until then I'll put this in the regular bucket these claims go in... the "trust me bro" bucket.
 
I'm good. I'll look at anything you want to provide as evidence, though; until then I'll put this in the regular bucket these claims go in... the "trust me bro" bucket.
What evidence do you want? If your eyes can't see it, I can't do anything for you, man... better ask your ophthalmologist.
 
What evidence do you want? If your eyes can't see it, I can't do anything for you, man... better ask your ophthalmologist.
There's absolutely nothing wrong with my vision, and continuing to insinuate there is strikes me as a particularly weak argument. It seems much more likely that there's some explainable configuration difference in your drivers, game settings, Windows monitor settings, etc. between the GeForce and Radeon systems you've tested. I've not seen anyone demonstrate this difference conclusively in well over a decade, at least not in a case where it wasn't attributable to configuration differences. So based on that, on what I've seen myself, and on what the broader community and tech press have found, I've yet to see anyone actually demonstrate that Radeons just generally provide a clearer, sharper image. Want to link me to the bang4buck video(s) you mention? I'll happily take a look. I'd appreciate it if you stopped insulting my vision; it's not a path to proving anything, and it isn't the argument-winning approach you might think it is.
There is no difference
I thought this was known and accepted at this point, but every so often someone comes along, insists there is a difference, and can't prove it. It usually either falls apart fast when they try to show it, turns out to be explained by configuration, or is just "trust me bro". I'm a reasonable person and will take on new information and let it change my understanding if it's reasonable, compelling, and conclusive; I've yet to see that here, though.
 
There's absolutely nothing wrong with my vision, and continuing to insinuate there is strikes me as a particularly weak argument. It seems much more likely that there's some explainable configuration difference in your drivers, game settings, Windows monitor settings, etc. between the GeForce and Radeon systems you've tested. I've not seen anyone demonstrate this difference conclusively in well over a decade, at least not in a case where it wasn't attributable to configuration differences. So based on that, on what I've seen myself, and on what the broader community and tech press have found, I've yet to see anyone actually demonstrate that Radeons just generally provide a clearer, sharper image. Want to link me to the bang4buck video(s) you mention? I'll happily take a look. I'd appreciate it if you stopped insulting my vision; it's not a path to proving anything, and it isn't the argument-winning approach you might think it is.

I thought this was known and accepted at this point, but every so often someone comes along, insists there is a difference, and can't prove it. It usually either falls apart fast when they try to show it, turns out to be explained by configuration, or is just "trust me bro". I'm a reasonable person and will take on new information and let it change my understanding if it's reasonable, compelling, and conclusive; I've yet to see that here, though.
How in the world does a grownup think that an Nvidia GPU, say a 4090, has a problem displaying colors? Or what exactly is the argument? I don't get it. Anyway, let's agree that we don't see it, just like we don't see that the earth is flat. Some people can, though :toast:

EG1. Just think about the professionals who buy 5-10-20k reference monitors to get pixel-perfect colors. What GPU do you reckon they are running? Guess they don't see it either.
 
How in the world does a grownup think that an Nvidia GPU, say a 4090, has a problem displaying colors?
It's not a problem; it's just that Nvidia uses a different default gamma, and you can set it to look like Radeon through the settings. It's about Nvidia's own vision of how the image should look. It's not a big deal, but yes, there is a difference.
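If the claim is just about a default gamma difference, here's a rough sketch (plain Python; the 2.2 vs 2.4 pair is purely illustrative, not a measured driver default on either side) of how a small gamma change shifts the same input levels:

```python
# Illustration only: how two different gamma curves map the same 8-bit
# input levels to display output. The 2.2 vs 2.4 values are placeholders,
# not measurements of any vendor's actual default.

def encode(level_8bit: int, gamma: float) -> float:
    """Map an 8-bit input level to relative light output (0.0-1.0)."""
    return (level_8bit / 255) ** gamma

for level in (64, 128, 192):
    a = encode(level, 2.2)
    b = encode(level, 2.4)
    print(f"input {level:3d}: gamma 2.2 -> {a:.3f}, gamma 2.4 -> {b:.3f}")
```

The midtones move by a few percent, which is exactly the kind of shift a driver slider or a monitor calibration corrects; it isn't a difference in what the silicon can output.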
 
It's not a problem; it's just that Nvidia uses a different default gamma, and you can set it to look like Radeon through the settings. It's about Nvidia's own vision of how the image should look. It's not a big deal, but yes, there is a difference.
Even if that's the case, then as already stated it's a settings issue. But why would you want to make it look like Radeon? Do we have some comparisons against a reference image where the AMD settings look closer to the reference, or what?
 
Even if that's the case, then as already stated it's a settings issue. But why would you want to make it look like Radeon? Do we have some comparisons against a reference image where the AMD settings look closer to the reference, or what?
It's not an issue; it's just that some people prefer slightly warmer colours and others don't. So, as was said above, if you want truer colors, just calibrate your monitor.
One other thing to check is whether you are using more than 8-bit color. Some monitors support 10-bit (or 12-bit) color, which can also play a role in a better picture.
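As a quick sanity check on the bit-depth point, the arithmetic (plain Python, just counting levels per channel) looks like this:

```python
# Distinct levels per color channel at common output bit depths, and the
# brightness step between adjacent levels (smaller steps = less banding).
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels per channel, "
          f"~{100 / (levels - 1):.3f}% of full range per step")
```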

But that's not for this thread; if you want, raise or search for a thread about color gamuts and discuss it there.
 
16GB VRAM finally meets its demise.

I do not get it

The 3070 and 3070 Ti are special cases with those 8 GiB of VRAM.

Otherwise I see differences at WQHD with more VRAM on the same card, regardless of whether it's AMD or Nvidia.

--

Kinda amusing. "In the future there will be ... more demand ... we will see differences ..." and so on.
I have the graphics card now, with this operating system, these operating system patches and security fixes, and these games. Period.
 
16GB VRAM finally meets its demise.
The game engine reserves more VRAM than it actually uses, so those VRAM usage numbers aren't really meaningful. For example, check the 4K+RT chart: according to it, VRAM usage is around 14.5 GB, so a 12 GB card should stutter, etc. Meanwhile the 4070 Ti 12GB performs the same as the 3090 Ti 24GB, even in minimum fps, so in the real world the game needs less than 12 GB of VRAM at 4K max + RT. To be honest, it looks like even the 3080 performs pretty okay, doing about what you'd expect from the power of its chip. So maybe real VRAM usage is around 10GB+, I guess?

So even with path tracing, frame gen, etc., VRAM usage is probably not really close to 16 GB... more like 13-14 GB.
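If you want to see on your own machine that these readings are allocations rather than what a frame actually needs, here's a minimal sketch using NVML through the pynvml module (assumes an NVIDIA card with the nvidia-ml-py package installed; device index 0 just means the first GPU):

```python
# Minimal sketch: read what the driver reports as VRAM "usage" via NVML.
# These numbers are allocations (what applications have reserved), not how
# much memory the game actually touches each frame - which is why a reading
# of ~14.5 GB on a 24 GB card doesn't prove a 12 GB card will stutter.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMemoryInfo, nvmlDeviceGetGraphicsRunningProcesses,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # first GPU
    mem = nvmlDeviceGetMemoryInfo(handle)
    print(f"total: {mem.total / 2**30:.1f} GiB, allocated: {mem.used / 2**30:.1f} GiB")
    # Per-process breakdown; may be empty or unsupported on some OS/driver combos.
    for proc in nvmlDeviceGetGraphicsRunningProcesses(handle):
        if proc.usedGpuMemory is not None:
            print(f"pid {proc.pid}: {proc.usedGpuMemory / 2**30:.1f} GiB allocated")
finally:
    nvmlShutdown()
```

Overlays and monitoring tools generally read these same driver counters, so "usage" there also means "reserved".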
 
The game engine reserves more VRAM than it actually uses, so those VRAM usage numbers aren't really meaningful. For example, check the 4K+RT chart: according to it, VRAM usage is around 14.5 GB, so a 12 GB card should stutter, etc. Meanwhile the 4070 Ti 12GB performs the same as the 3090 Ti 24GB, even in minimum fps, so in the real world the game needs less than 12 GB of VRAM at 4K max + RT. To be honest, it looks like even the 3080 performs pretty okay, doing about what you'd expect from the power of its chip. So maybe real VRAM usage is around 10GB+, I guess?

So even with path tracing, frame gen, etc., VRAM usage is probably not really close to 16 GB... more like 13-14 GB.
It could also mean slow-loading textures, and you'll get more pop-in during gameplay, which ruins immersion. I experienced that in Ratchet and Clank on my laptop when its 12GB of VRAM ran out.
 
AMD has some serious catching up to do when it comes to RT. The fact that a 7900 XT with 20GB of VRAM is behind a 4070 Super when you turn on RT is insane.
 
Thank you for finally adding upscaling results! As someone who believes DLSS-Q to be pretty much free fps at 1440p or above and uses it everywhere, I really needed that!
 