
What was lacking GPU-wise at this year's CES

I don't think there are any 9070s at CES, period. It would also be strange to demonstrate the weaker card.
I don't know if it was physically there at CES or not, but it's here in the presentation:

[Image: presentation slide]


I agree that demoing the weaker version isn't logical, but I have to assume that IGN would have specified if it was the XT that they benchmarked.
 
I think there are way too many typos, and the article is now a few hours old; it would be a bit odd for it to still be up and uncorrected.
 
I think there are way too many typos, and the article is now a few hours old; it would be a bit odd for it to still be up and uncorrected.

On the game screen it just says "RDNA4 graphics card". Benchmarks using the tool seem to be all over the place online: as high as 120 FPS for the 7900 XTX and around 100 FPS for the 7900 XT.
 
On the game screen it just says "RDNA4 graphics card". Benchmarks using the tool seem to be all over the place online: as high as 120 FPS for the 7900 XTX and around 100 FPS for the 7900 XT.
It would be nice if I played the game so I could compare it to my 6750 XT... or if they'd tested a different game.

Has any one of you got the game by any chance?
 
Nvidia has just announced its own version of Lossless Scaling, of course only integrated into some games, and of course only if we buy a 5000-series card. And of course we'll need third-party reviewers to tell us if there is any difference between Nvidia's new blur and Lossless Scaling, so we can decide whether we need a new card or can just buy that tool for the same fake frames on our old 4000-series cards :)
Of course, if we're just interested in more fake frames :)
 
On the game screen it just says "RDNA4 graphics card". Benchmarks using the tool seem to be all over the place online: as high as 120 FPS for the 7900 XTX and around 100 FPS for the 7900 XT.

Pricing is mostly what matters here. I still say AMD's problem last gen was pricing.
Anyway, it's strange that only IGN got to report on this.
 
It makes them look good in benchmarks.
Nvidia is pushing the outdated "average FPS" nonsense that reviewers and gamers have moved on from decades ago.

Fake frames might look smooth, but they increase latency and make the game feel sluggish. I've used framegen on both Nvidia and AMD in plenty of titles, and while it looks superficially nice to run at 150 FPS instead of 90 FPS, the actual feel of the game suffers, and once you see the fake-frame artifacts you simply cannot unsee them. It's like DLSS smear and ghosting - it's unacceptable once you know how much error and damage is being done to the image quality in pursuit of MOAR FPS.

All it really achieves is reducing the noise of raytracing even more, and that's really just a side effect of RTRT being too expensive for the hardware at the moment. It's all trickery designed to hide the problem rather than solve it - just like a magic trick, once you've spotted how it's done the illusion is gone and you're just left with obvious deception. I know plenty of people don't care about the temporal blurring caused by RT denoisers and temporal AA, but it's a big enough issue that reviewers focused on image quality, and plenty of other vloggers and posters, have filled the internet with complaints and in-depth articles bemoaning these techniques' shortcomings.
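If you want to put rough numbers on the latency point, here's a quick back-of-the-envelope sketch. The 90 FPS base rate, the 2x factor, and the half-frame queuing delay are assumptions for illustration, not measurements from any specific implementation:

```python
# Back-of-the-envelope: why frame generation raises the FPS counter
# without improving responsiveness. All numbers are illustrative assumptions.

def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

base_fps = 90         # frames the GPU actually renders per second (assumed)
generated_factor = 2  # 2x frame generation
displayed_fps = base_fps * generated_factor  # what the overlay reports: 180

real_frame_time = frame_time_ms(base_fps)        # ~11.1 ms between real frames
shown_frame_time = frame_time_ms(displayed_fps)  # ~5.6 ms between displayed frames

# Interpolation-based frame gen holds back the newest real frame until the
# in-between frame has been shown, so input-to-photon latency is still paced
# by the real frame time, plus roughly half a real frame of extra queuing.
approx_input_latency = real_frame_time * 1.5

print(f"Displayed frame time:  {shown_frame_time:.1f} ms (counter says {displayed_fps} fps)")
print(f"Approx. input latency: {approx_input_latency:.1f} ms (still paced by {base_fps} fps)")
```

The counter doubles, but the game still responds at the pace of the real frames, which is exactly that smooth-but-sluggish feel.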
 
It would be nice if I played the game so I could compare it to my 6750 XT... or if they'd tested a different game.

Has any one of you got the game by any chance?

Open the Reddit link; there are all sorts of users posting pics with their cards, including a 6950 XT.
 
Open the Reddit link; there are all sorts of users posting pics with their cards, including a 6950 XT.
Ah really, I see now. :)

Now all we need is performance data in other games, price and availability. So basically everything. :laugh:
 
Ah really, I see now. :)

Now all we need is performance data in other games, price and availability. So basically everything. :laugh:

The benchmark is online-only as well... Last time I ran it, when the game came out, I was seeing a 10 FPS difference between Ethernet and Wi-Fi... If you look at their server latency it's 0 ms; the best I can get mine down to is 10 ms.
 
The benchmark is online-only as well... Last time I ran it, when the game came out, I was seeing a 10 FPS difference between Ethernet and Wi-Fi...
What is it doing being online during a benchmark run? :eek: Any game that's online only is a big no-no for me.
 
What is it doing being online during a benchmark run? :eek: Any game that's online only is a big no-no for me.

It's an MP benchmark, lol. Also, their VRAM is only at 8 GB; last I checked, on the Extreme setting it eats up like 16 GB... So there are a lot of red flags, unless the settings were manipulated, of course.
 
It's an MP benchmark, lol. Also, their VRAM is only at 8 GB; last I checked, on the Extreme setting it eats up like 16 GB... So there are a lot of red flags, unless the settings were manipulated, of course.

On the Reddit pics I see 12 to 14, so it varies. It's still a big difference, but it could also be a very clean system. I always try to close everything when I game on my 8 GB card and it makes a BIG difference. Overflow goes to RAM, but if the numbers are right it can easily be a 4 GB difference or more if I leave things open.

We really need a tool that tells us which programs are using VRAM and how much; people would be surprised.
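For what it's worth, here's a rough sketch of what such a tool could look like on Nvidia cards, assuming the nvidia-ml-py (pynvml) and psutil packages are installed. Per-process numbers aren't always exposed by the driver (they can come back as n/a on some Windows/WDDM setups), so treat it as illustrative only:

```python
# Illustrative sketch of a per-process VRAM monitor for Nvidia GPUs.
# Assumes the nvidia-ml-py (pynvml) and psutil packages are installed.
import pynvml
import psutil

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Graphics clients (games, browsers, compositor) plus compute clients (CUDA apps)
procs = (pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle)
         + pynvml.nvmlDeviceGetComputeRunningProcesses(handle))

for p in sorted(procs, key=lambda x: x.usedGpuMemory or 0, reverse=True):
    try:
        name = psutil.Process(p.pid).name()
    except psutil.NoSuchProcess:
        name = "<exited>"
    # usedGpuMemory is reported in bytes and may be None if the driver hides it
    used = f"{p.usedGpuMemory / 1024**2:.0f} MiB" if p.usedGpuMemory else "n/a"
    print(f"{p.pid:>8}  {name:<30}  {used}")

pynvml.nvmlShutdown()
```

On Windows, Task Manager's Details tab can also show a "Dedicated GPU memory" column, which gives roughly the same per-process view.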
 
On the Reddit pics I see 12 to 14, so it varies. It's still a big difference, but it could also be a very clean system. I always try to close everything when I game on my 8 GB card and it makes a BIG difference. Overflow goes to RAM, but if the numbers are right it can easily be a 4 GB difference or more if I leave things open.

We really need a tool that tells us which programs are using VRAM and how much; people would be surprised.

I tested it at Balanced, Ultra, and Extreme; it's 7/12/16 GB of VRAM respectively. What your system is using is listed separately.


Extreme, no VRAM limit: [Screenshot (10)]

Extreme, 50% limit: [Screenshot (11)]

Ultra, 50% limit: [Screenshot (12)]

Balanced was 6.9 GB, so whatever settings they're using are between Balanced and Ultra... But it's best to wait for real benchmarks.

The bar is red in the second screenshot because I capped it to 12 GB and the game exceeded that.
 
I tested it at Balanced, Ultra, and Extreme; it's 7/12/16 GB of VRAM respectively. What your system is using is listed separately.

Extreme, no VRAM limit / Extreme, 50% limit / Ultra, 50% limit
[Screenshots attached above]

Balanced was 6.9 GB, so whatever settings they're using are between Balanced and Ultra...

The bar is red in the second screenshot because I capped it to 12 GB and the game exceeded that.
It looks like you can't see your graphics settings anywhere on the results page. What an utterly useless piece of crap! They call it a benchmark? :shadedshu:

I guess we actually learned nothing from the article. Typical IGN. :(
 
It looks like you can't see your graphics settings anywhere on the results page. What an utterly useless piece of crap! They call it a benchmark? :shadedshu:

I guess we actually learned nothing from the article. Typical IGN. :(

I'm not saying it's wrong; I just know the game uses more than 12 GB of VRAM on Extreme settings but less than 8 GB on Balanced, so whatever settings are being used could be anything in between.
 
I'm not saying it's wrong; I just know the game uses more than 12 GB of VRAM on Extreme settings but less than 8 GB on Balanced, so whatever settings are being used could be anything in between.
I'm just saying that without knowing what graphics settings they used, IGN's 99 FPS means absolutely nothing.
 
I'm just saying that without knowing what graphics settings they used, IGN's 99 FPS means absolutely nothing.

Hey, if it gets that close to a 4090 in CoD, awesome, but IGN isn't the best place to get information from... The only reason I loaded up the game to run the benchmark was that I remembered seeing much higher VRAM usage...
 
Hey, if it gets that close to a 4090 in CoD, awesome, but IGN isn't the best place to get information from... The only reason I loaded up the game to run the benchmark was that I remembered seeing much higher VRAM usage...
I correct myself...
[Screenshot]


I guess I'm really tired after work. Sorry, IGN. :D :ohwell: (although I agree, they're generally not the best info source)
 
That is what they're saying; the problem is that Extreme uses more VRAM than 8 GB. I think the minimum is around 13-14 GB.
Maybe they took the screenshot with a different run? Or maybe the alpha/beta driver's VRAM detection doesn't work properly. I'm just shooting in the dark.
 
Maybe they took the screenshot with a different run? Or maybe the alpha/beta driver's VRAM detection doesn't work properly. I'm just shooting in the dark.

For sure, it could be anything; I'm just saying take it with a huge grain of salt. It could be spot on, but something seems off. It would still show as red though, because that's the game asking the GPU for the 13 GB.
 
Why is Nvidia spending more time and resources trying to generate guess-frames than just rendering more actual frames? Maybe this makes sense to other people, lol; I just don't get this focus.
They are not spending more time, but it's the only way to saturate modern 4K 240 Hz monitors. You can't do that with hardware alone; the 4090 is down to the 20 FPS range without those technologies, and no matter how huge a boost the 50 series was, it would still be lacking. Also, CPUs are a major bottleneck in AAA titles, and FG fixes that as well.
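To put rough numbers on that, here's a quick sketch. The 20 FPS baseline is the figure from the post above; the uplift factors are hypothetical examples, not leaked or measured numbers:

```python
# Rough math on closing the gap to a 4K 240 Hz monitor with raw rendering alone.
# The ~20 fps path-traced baseline is the figure quoted above; the generational
# uplifts below are hypothetical examples.

baseline_fps = 20    # native 4K path tracing on a 4090-class card (claim above)
target_fps = 240     # 4K 240 Hz monitor

for uplift in (1.3, 1.5, 2.0):   # assumed raw generational gains
    raw_fps = baseline_fps * uplift
    shortfall = target_fps / raw_fps
    print(f"{uplift:.1f}x raw uplift -> {raw_fps:.0f} fps, "
          f"still {shortfall:.1f}x short of {target_fps} Hz")

# Even an optimistic 2x raw uplift only reaches ~40 fps, so the remaining ~6x
# has to come from upscaling and generated frames rather than brute force.
```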
 
Welcome to the better (right) side of the TPU. :)
 
They are not spending more time, but it's the only way to saturate modern 4K 240 Hz monitors. You can't do that with hardware alone; the 4090 is down to the 20 FPS range without those technologies, and no matter how huge a boost the 50 series was, it would still be lacking. Also, CPUs are a major bottleneck in AAA titles, and FG fixes that as well.

It still sucks... But that is literally only an issue in path tracing, which is the only thing I even feel is worth using frame gen with, although in some games it's pretty terrible. Silent Hill 2 has an awful implementation of frame gen.
 