
What was lacking GPU-wise at this year's CES

I think the average buyer, probably 80% of the market if not more, is going to see these crazy frame generation numbers and think it's actual performance.
Well, that definitely happens on consoles, but there they're locked in to how the developer wants the game to run.

There has been talk on some of the MMO boards I'm on about upscaling tools, although I would say most of the people I play with don't use them. Obviously the frame generation in those tools adds latency, which is a no-no for MMO performance. I honestly don't know what the average buyer thinks of it, or whether they even use it. Perhaps Steam could run a survey.
 
Holy crap. I didn't actually run the CUDA percent increase numbers. Other than the 5090, there might not be any rasterization difference down the stack. This is a huge margin win for Nvidia as they are leaving the hardware mostly untouched and really only changing the software.
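For what it's worth, here is a quick sketch of those percent increases, using the CUDA core counts reported in public spec listings (the counts below are assumptions from those listings, not verified here):

```python
# Reported CUDA core counts (assumed from public spec listings): (50-series, 40-series)
cores = {
    "5090 vs 4090": (21760, 16384),
    "5080 vs 4080": (10752, 9728),
    "5070 Ti vs 4070 Ti": (8960, 7680),
    "5070 vs 4070": (6144, 5888),
}

for pair, (new, old) in cores.items():
    pct = (new - old) / old * 100  # percent increase in CUDA cores
    print(f"{pair}: +{pct:.1f}% CUDA cores")
```

If those counts hold, only the 5090 gets a sizeable bump (roughly +33%); the rest of the stack moves by single digits to mid-teens, which lines up with the "mostly software" point above.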

My biggest disappointment was AMD. Their CPU/APU launches at CES were excellent but they should have just left out RDNA4 instead of whatever the hell that was.

Intel's new co-CEOs have already said they are going to slow down, so I wasn't expecting anything after their December Battlemage launch, but it still would have been nice to at least have a Battlemage recap and a plug for the upcoming B570.
There should be some improvement from the new architecture, the move to GDDR7 memory, and the move to a slightly enhanced 5 nm node (N4P, I think, from Ada's N4), but notice their TDP figures are also going up, suggesting they need to pump in more power. So it shouldn't be "the same", but I doubt we'll see a huge difference in traditional rendering performance outside of the 4090 to 5090.
 
I'm not an expert, but from what I understand, we now have three generated frames between each pair of real frames. Even if it looks impressive, it's getting ridiculous. We'll soon have 1000 fps, but most of it will be fake.
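To put rough numbers on that: in DLSS 4's 4x mode, each rendered frame is followed by three generated frames, so the displayed frame rate roughly quadruples while input latency still tracks the rendered rate (the 30 fps base below is just an illustrative assumption):

```python
rendered_fps = 30           # assumed base render rate, for illustration only
generated_per_rendered = 3  # DLSS 4 "4x" multi frame generation
displayed_fps = rendered_fps * (1 + generated_per_rendered)
render_interval_ms = 1000 / rendered_fps  # input latency still scales with this

print(displayed_fps)                  # displayed frame rate: 120
print(round(render_interval_ms, 1))   # time between *real* frames: 33.3 ms
```

So the FPS counter says 120, but the game only samples your input about every 33 ms, which is the "most of it is fake" complaint in a nutshell.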
 
If you played the AI drinking game your liver is shot and you have 30 minutes to live.
Unless you've been under a rock for the last couple of years, you probably would not have agreed to play the AI drinking game at CES.

Every product is called AI, is described as AI, was developed with AI, and will harness AI to do AI things, even if it's just a new pair of socks.
 
All three GPU manufacturers left out major information during CES this year:

Intel - released no GPU-related news or products whatsoever
AMD - strangely announced RDNA4 with no pricing, specs, or benchmarks (there was some strange graphic with boxes showing relative performance)
Nvidia - released no pure rasterization benchmarks (meaning without AI, RT, and DLSS) and generally avoided a straight comparison with previous models and the competition. AI TOPS was the top-line spec.

It's almost like none of these companies really want to be here in the discrete consumer GPU space.

PS. This is my first forum post so I hope it follows the rules and is in the right channel.
I agree. This CES was poor from all three manufacturers GPU-wise.

Why is Nvidia spending more time and resources trying to generate guess-frames than just rendering more actual frames? Maybe this makes sense to other people, lol; I just don't get this focus.
It makes them look good in benchmarks.
 
Epitome of a "stopgap" generation, it seems, for the 50 series, besides the 5080 and 5090. While I think the aesthetic of the FE cards is an improvement (I hate how the FE cards look for the 30 and 40 series), on the stuff that actually matters, it just looks very... meh. No real reason for me to be excited for the 50 series. I didn't really care for the 40 series either until the Supers dropped; maybe NVIDIA is going to do that again. Who knows. The 5060 Ti and 5060 might turn my opinion around, though, assuming of course NVIDIA learns from past examples, but they probably won't. To be fair, raw performance is what I care about most, and we don't have exact raw performance numbers yet. Hoping the AI stuff isn't all the hype the 50 series is relying on.

AMD did their thing, and they were probably waiting for the prices of the 5070 or 5070 Ti to drop so they can undercut them. It might go decently for them. I don't expect AMD's cards to beat the 5070, and probably not the 5070 Ti either. I hope they go for a Polaris kind of strategy. Don't give a flying #### about their AI stuff, though. No offense, AI enjoyers.

Intel did not show so I have nothing to say about them.

Honestly, I just feel meh. There's almost nothing to be excited about. The RX 90xx series sounds neat, but unless they do some crazy stuff with their price points or FSR4, I'm not really too stoked about them.

I would say actual apples-to-apples benchmarks were lacking, but even had they shown them, I would have taken them with a huge grain of salt. Even last generation, the 40 series was mostly decent uplifts.
The 40 series initially? Yeah, kind of. The 40 series after the Supers dropped did redeem it, imo. I think the 4060 should have been our first warning of the focus on AI increasing more and more.
 
I feel for the partners that showed off the new AMD cards at the show. AMD fumbled and made the bare-minimum announcement, just enough so that the partners weren't left in the cold having to put the cards away; they're just displaying them with no specs or price at all. Awkies.

No Intel cards at all.

Focus on AI from Nvidia (what did we expect?), but at least most product prices weren't higher than last gen, and the improvements to DLSS features across all generations are genuinely exciting.
 
Why is Nvidia spending more time and resources trying to generate guess-frames than just rendering more actual frames? Maybe this makes sense to other people, lol; I just don't get this focus.
What's not to get? Higher FPS via faster hardware is:
1. Expensive
2. A game their competitors can play too
2a. When all competitors are playing the same game, you have to compete on price, which eats into margins.

Meanwhile, higher FPS via software tricks doesn't increase board cost, and your special sauce is exclusively yours. Competitors can try to replicate it, but Nvidia's dominant market share means theirs will always be adopted by games, while AMD's and Intel's may or may not be.
 
What's not to get? Higher FPS via faster hardware is:
1. Expensive
2. A game their competitors can play too
2a. When all competitors are playing the same game, you have to compete on price, which eats into margins.

Meanwhile, higher FPS via software tricks doesn't increase board cost, and your special sauce is exclusively yours. Competitors can try to replicate it, but Nvidia's dominant market share means theirs will always be adopted by games, while AMD's and Intel's may or may not be.
Except that AMD owns the console market, so their software is very relevant. Other than that, you're right.
 
The 5090 looks great; MSI has decided that Intel and Nvidia are their future; and AMD is either fumbling or sandbagging in the dGPU space. Too bad Acer lost the plot and forgot they were the value option, as their handhelds are too expensive. I did like the Strix laptop with just an APU inside, and it looks like the B650E-E Strix is a great replacement for the B550-XE. We have way more choice for AM5 now too, but this was the first CES in a while where I watched no keynotes.
The MSI AMD cards were usually mediocre anyway; MSI just lazily reused the Nvidia coolers. MSI still makes decent AMD motherboards, though all of the marketing focus seems to be on the Intel side, so I wouldn't be surprised if MSI goes completely Intel & Nvidia.
Hopefully AMD just fumbled a bit with pricing and decided to sandbag on releasing specs. AMD always manages to fumble on something, but if it was just a last-minute pricing change, they really need to work on marketing.
I haven't watched any keynotes either; it all seems to be press coverage for AI. I just watched some vids from tech YouTubers.
$1100 US is for poor people?
According to the leather jacket man, the 5090 is for people with $10,000 PCs.
 
Nvidia left out information to avoid leaks before the Nvidia keynote. Even with NDAs in place, things would get out.

AMD did a silent launch. It's funny: we saw the 90-series cards, and no one even mentioned them to us. They were just sitting on the table.
 
AMD did a silent launch. It's funny: we saw the 90-series cards, and no one even mentioned them to us. They were just sitting on the table.
Silent launch? What does that mean?
 
Silent launch? What does that mean?

To technically launch a product but then basically say nothing about it. I do wonder if their partners are like WTF.
 
To technically launch a product but then basically say nothing about it. I do wonder if their partners are like WTF.
Everybody is like WTF. If a product isn't available to buy, then it hasn't launched yet, right?
 
Everybody is like WTF. If a product isn't available to buy, then it hasn't launched yet, right?

No. Take the 4090, for example: the release date was technically Sep 20th, but availability was Oct 12th. Usually the date a finished product is first shown, especially with all the AIB models etc., is counted as the release date, I believe.

But basically, what Ir is saying is that they showed off basically everything but said nothing, leaving out all the important parts like price and performance.
 

What was lacking GPU-wise at this year's CES


AMD
 

What was lacking GPU-wise at this year's CES


AMD

They actually have a ton of GPUs at CES; they just don't want to talk about them. They're being shy, don't bully them...
 
They actually have a ton of GPUs at CES; they just don't want to talk about them. They're being shy, don't bully them...
It doesn't inspire much confidence in potential buyers, and I know I am one. It felt weird, like something is fishy: either performance isn't up to par, or they wanted to play around with what price range people will buy at.
 
Not sure if it was already mentioned here:

RX 9070 benchmarked with CoD

The 7900 XTX gets 108 fps in comparison
So the 9070 (non-XT) is close behind the 7900 XTX in CoD: Black Ops 6. That's very interesting.

I don't play the game myself, but I've heard that it heavily favours AMD hardware, so I'm not really sure how this translates across a broader spectrum of games.
 
So the 9070 (non-XT) is close behind the 7900 XTX in CoD: Black Ops 6. That's very interesting.

I don't play the game myself, but I've heard that it heavily favours AMD hardware, so I'm not really sure how this translates across a broader spectrum of games.

Guessing it's a typo and it's the 9070 XT. If so, that would be decent, although CoD runs well on AMD hardware, much better than the average game.
 
Guessing it's a typo and it's the 9070 XT. If so, that would be decent, although CoD runs well on AMD hardware, much better than the average game.
The article mentions "9070" without the XT multiple times; it never mentions "XT". It also says the card has an unknown number of cores, while we all know that the 9070 XT has 4096.
 
The article mentions "9070" without the XT multiple times; it never mentions "XT". It also says the card has an unknown number of cores, while we all know that the 9070 XT has 4096.

I don't think there are any 9070s at CES, period. It would also be strange to demonstrate the weaker card.
 