Wednesday, December 4th 2024

AMD Announces Press Conference at CES 2025

AMD today announced that it will host a press conference as part of the official CES 2025 Media Days. The press conference will take place on Monday, Jan. 6 at 11 a.m. PST at the Mandalay Bay. Jack Huynh, AMD senior vice president and general manager of the Computing and Graphics Group, along with other AMD executives, will be joined by partners and customers to discuss how AMD is expanding its leadership across PCs and gaming, and to highlight the breadth of the company's high-performance computing and AI product portfolio. The keynote will be available to livestream on the AMD YouTube channel, with a replay available after the conclusion of the livestream event.

26 Comments on AMD Announces Press Conference at CES 2025

#1
Bwaze
"to discuss how AMD is expanding its leadership across PCs and gaming"

And will they be commenting on why they are almost pulling out of the GPU sector, abandoning the high end, and trailing further behind Nvidia with each generation?
#2
Quicks
Bwaze"to discuss how AMD is expanding its leadership across PCs and gaming"

And will they be commenting on why they are almost pulling out of GPU sector, abandoning high end and trailing behind Nvidia more and more with each generation?
Probably to discuss their new GPU series and its release date. They only need three good GPUs at excellent prices and they will coin it.
#3
wolf
Better Than Native
I'm hoping to hear not only about specific products, but also about the features and strategy surrounding gaming, for example the push for RT performance and FSR4. Hopefully it's really interesting and informative!
#4
Bwaze
I know AMD has its reasons for the lack of focus on gaming GPUs, giving priority first to server CPUs, then to PC CPUs (which are built on the same architecture as the server parts, so they're developed concurrently), and only then to the rest. That they plan to completely abandon higher-end GPU cards was a press release in September 2024, not some conjecture.

AMD Confirms Retreat from the Enthusiast GPU Segment, to Focus on Gaining Market-Share

The results of previous decisions are known: only a small discrete GPU market share, and even that thin sliver is threatened by Intel. I can't imagine that abandoning even the pursuit of the higher-end enthusiast market will have a different outcome.
#5
Count von Schwalbe
Nocturnus Moderatus
Bwaze said: The results of previous decisions are known: only a small discrete GPU market share, and even that thin sliver is threatened by Intel. I can't imagine that abandoning even the pursuit of the higher-end enthusiast market will have a different outcome.
Less R&D and overhead from a smaller product stack might allow for a more aggressive pricing strategy. My hope at least.
#6
Bwaze
Count von Schwalbe said: Less R&D and overhead from a smaller product stack might allow for a more aggressive pricing strategy. My hope at least.
My fear is that this isn't even sought; that they are in exactly the position they planned to be in, since AMD's fabrication capacity is limited and an increase in GPU output would eat into production of items with higher priority and profit.

Also, Nvidia could technically match and beat AMD in any pricing war, since they have all the revenue they want from the AI server market, and gaming is just an afterthought now: a sector with artificially inflated revenue growth, there to mask that all income is once again coming from a single hype, as happened during the latest crypto madness (and where a court judged that they are free to fake their financial reports as they see fit).
#7
Marcus L
Bwaze said: I know AMD has its reasons for the lack of focus on gaming GPUs, giving priority first to server CPUs, then to PC CPUs (which are built on the same architecture as the server parts, so they're developed concurrently), and only then to the rest. That they plan to completely abandon higher-end GPU cards was a press release in September 2024, not some conjecture.

AMD Confirms Retreat from the Enthusiast GPU Segment, to Focus on Gaining Market-Share

The results of previous decisions are known: only a small discrete GPU market share, and even that thin sliver is threatened by Intel. I can't imagine that abandoning even the pursuit of the higher-end enthusiast market will have a different outcome.
AMD also has Ryzen CPUs, which have been outselling Intel by double digits at most popular online retailers for the last two years or so. They hold the gaming and efficiency crowns, don't burn themselves out, and have clawed back market share from the low 10s to almost a 40/60 split. Obviously they can't do the same in the dGPU market :rolleyes: ... and that's leaving aside APUs, the console market, handhelds, servers and supercomputers. So yes, they do have leadership in PC and gaming. It's not all about who has the best power-hungry 500 W+, £2.5K GPU that accounts for 1% of PC gamers. It's this attitude, and the people who will blindly pay whatever the competition charges, that keep this charade going. £800 for a mid-range GPU (the 4070/Ti/S) is nuts. Rather than waiting for AMD to release something that competes at half the price to see if NV will lower their prices, how about supporting the company and not giving your money to the other one, which doesn't give a shit and knows that most sheep will buy whatever guff they release at over-inflated prices with a low VRAM/bus etc.? Only then will they rethink. Until then, expect to keep paying $400 at release for a 50-class GPU rebadged as 60-class, and to get squeezed for every last £ in every single tier. It's frankly sickening and another reason I refuse to pay the NV tax. They can shove their RT and DLSS where the sun doesn't shine; if a card can't do it without resorting to fake FPS improvements, that's where we need to get back to, IMO.
#8
Bwaze
Marcus L said: ...and have clawed back market share from the low 10s to almost a 40/60 split...
So no leadership even where they're supposedly outselling their competitor by double digits?
Marcus L said: ...APUs, the console market, handhelds, servers and supercomputers aside... so yes, they do have leadership in PC and gaming,
Above, you yourself correctly showed they're not in a leadership position in PC. And consoles and handhelds are made by other companies, who pocket most of the profit; AMD gets a relatively small share for all that "leadership". And servers have absolutely no connection with gaming.
Marcus L said: ...and another reason I refuse to pay the NV tax. They can shove their RT and DLSS where the sun doesn't shine; if a card can't do it without resorting to fake FPS improvements, that's where we need to get back to, IMO.
You do know we now have lots of games that require you to use "fake FPS" even without ray tracing or other compute-intensive, modern "Nvidia only, because others can't run it" effects, on any card, even the top-money Nvidia monsters. The reason is usually that they are relatively bad conversions from consoles, which use AMD tech and depend on "faking FPS" even there.
#9
Count von Schwalbe
Nocturnus Moderatus
Bwaze said: My fear is that this isn't even sought; that they are in exactly the position they planned to be in, since AMD's fabrication capacity is limited and an increase in GPU output would eat into production of items with higher priority and profit.
OTOH, the lower-end GPUs aren't huge; they don't take much wafer space. They could also be in a position to force Nvidia to either lose money on that entire product stack or lose market share. It looks like Intel is trying to do exactly that.
#10
Marcus L
Bwaze said: So no leadership even where they're supposedly outselling their competitor by double digits?

Above, you yourself correctly showed they're not in a leadership position in PC. And consoles and handhelds are made by other companies, who pocket most of the profit; AMD gets a relatively small share for all that "leadership". And servers have absolutely no connection with gaming.

You do know we now have lots of games that require you to use "fake FPS" even without ray tracing or other compute-intensive, modern "Nvidia only, because others can't run it" effects, on any card, even the top-money Nvidia monsters. The reason is usually that they are relatively bad conversions from consoles, which use AMD tech and depend on "faking FPS" even there.
Leadership when they have been outselling for the last few years; they can't magically go from the 10s to over 50% market share. So yes, leadership over Intel in the CPU space, which is only getting larger. Consoles and handhelds are made by other makers? So are desktops and servers, but they still use AMD CPUs/APUs. So what's your point?

No, you don't have to use "fake frames". We are entitled: people think they have to run games at the "ultra" preset at 120-240 FPS at 1440p/4K resolutions rather than turning it down to high settings, which has almost zero visual/fidelity difference for roughly 30% more FPS. It's why morons are willing to fork out $1200+ on a GPU to run unoptimised UE5 engine games, to make themselves feel better and say "hey, look at me, look at what my gamingRGBFTW super-expensive gaming PC can do compared to yours", rather than taking five minutes out of their lives to optimise the damn settings to match their hardware.
#11
Bwaze
Marcus L said: Leadership when they have been outselling for the last few years; they can't magically go from the 10s to over 50% market share. So yes, leadership over Intel in the CPU space, which is only getting larger,
It's market share in units sold "right now", not share of all the PCs out there, including the outdated office computers from the '90s and early 2000s. So, no leadership, unless you redefine the meaning of the word.

"Ryzen CPU's outselling Intel by double digits in most popular online retailers and have been doing for the last 2 years or so" is just a very specific market - DIY PC builders that follow technology and are educated enough. But most PCs sold are still pre built, even if you look at gaming PCs - most people don't assemble their own computers. And there you still have strong Intel presence, due to various factors, not at least anti competitive illegal practices Intel has been doing for decades, was found guilty and simply not paid the fine.

And then there are notebooks, where AMD had great trouble convincing makers to offer computers with its CPUs; there is no DIY market there.
Marcus L said: Consoles and handhelds are made by other makers? So are desktops and servers, but they still use AMD CPUs/APUs. So what's your point?
For instance, both the CPU and GPU for the PlayStation 5 are technically AMD, but that doesn't mean AMD sold 65 million custom 8-core Zen 2 CPUs and 65 million custom RDNA 2 GPUs for the PlayStation 5. They developed them in collaboration with Sony and to Sony's specification, and Sony bought the licence and had them made. This is of course a great difference; revenue from handhelds and consoles is reported in AMD's financial reports, and it is relatively small.
#12
LittleBro
Bwaze"to discuss how AMD is expanding its leadership across PCs and gaming"

And will they be commenting on why they are almost pulling out of GPU sector, abandoning high end and trailing behind Nvidia more and more with each generation?
Not introducing enthusiast-grade GPUs every generation is absolutely not "almost pulling out of the GPU sector".
RDNA is used in most gaming consoles these days, mainly Xboxes and PlayStations. Actually, in the handheld sector AMD is now dominant, too.

As for trailing Nvidia, while that may be true, it all depends on how you look at it. E.g. the RX 7900 XTX has about 15% less performance than the RTX 4090 but costs 40% less.
Nvidia has the edge in RT and DLSS (although I personally condemn the image-distortion and fake-frames stuff), AMD has the edge in pure rasterization. Who is trailing whom depends on what a user prefers.
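
To put those two numbers together, a quick back-of-the-envelope sketch (a minimal illustration using the rough figures above, normalized to the 4090; real prices and performance vary by game and region):

```python
# Back-of-the-envelope perf-per-dollar, using the rough figures quoted above.
perf_4090, price_4090 = 1.00, 1.00   # RTX 4090 as the normalized baseline
perf_xtx, price_xtx = 0.85, 0.60     # 7900 XTX: ~15% less perf, ~40% lower price

value_4090 = perf_4090 / price_4090  # performance per unit of money
value_xtx = perf_xtx / price_xtx     # 0.85 / 0.60 ~= 1.42

print(f"7900 XTX perf-per-dollar vs 4090: {value_xtx / value_4090:.0%}")
# -> 142%, i.e. ~42% more performance per unit of money at these numbers
```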

Just for clarification, Nvidia may hold more than 50% of TSMC's capacity allocation on the current and next process nodes; next come Apple, Intel, Qualcomm, and so on. Even if AMD wanted to make more chips, there is no capacity (TSMC's capacity is fully booked until 2027); Nvidia is focused on so-called AI, and the money is already paid in advance. TSMC surely loves Nvidia's money; AMD surely can't pay that much, and not in advance. So producing more smaller chips rather than fewer bigger chips lets AMD get more out of TSMC's capacity, and selling them at a reasonable margin might be a good idea and might yield better overall earnings. Sometimes less is more.
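
The "less is more" point can be made concrete with a toy dies-per-wafer and yield sketch. The two formulas are the standard textbook approximations (dies-per-wafer with an edge-loss term, and a simple Poisson yield model); the die areas and defect density below are assumed placeholders, not real TSMC numbers:

```python
import math

# Toy model: smaller dies both fit more per wafer and yield better.
def good_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300, d0_per_cm2=0.09):
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    # Standard dies-per-wafer approximation with an edge-loss term
    dies = (wafer_area / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))
    # Simple Poisson yield model: Y = exp(-die_area * defect_density)
    yield_rate = math.exp(-(die_area_mm2 / 100) * d0_per_cm2)
    return dies * yield_rate

big, small = good_dies_per_wafer(600), good_dies_per_wafer(250)
print(f"~{big:.0f} good 600 mm2 dies vs ~{small:.0f} good 250 mm2 dies per wafer")
# A die 2.4x smaller yields ~3.6x as many good dies from the same wafer here.
```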
#13
TSiAhmat
LittleBro said: Not introducing enthusiast-grade GPUs every generation is absolutely not "almost pulling out of the GPU sector".
RDNA is used in most gaming consoles these days, mainly Xboxes and PlayStations. Actually, in the handheld sector AMD is now dominant, too.

As for trailing Nvidia, while that may be true, it all depends on how you look at it. E.g. the RX 7900 XTX has about 15% less performance than the RTX 4090 but costs 40% less.
Nvidia has the edge in RT and DLSS (although I personally condemn the image-distortion and fake-frames stuff), AMD has the edge in pure rasterization. Who is trailing whom depends on what a user prefers.

Just for clarification, Nvidia may hold more than 50% of TSMC's capacity allocation on the current and next process nodes; next come Apple, Intel, Qualcomm, and so on. Even if AMD wanted to make more chips, there is no capacity (TSMC's capacity is fully booked until 2027); Nvidia is focused on so-called AI, and the money is already paid in advance. TSMC surely loves Nvidia's money; AMD surely can't pay that much, and not in advance. So producing more smaller chips rather than fewer bigger chips lets AMD get more out of TSMC's capacity, and selling them at a reasonable margin might be a good idea and might yield better overall earnings. Sometimes less is more.
Something I wanted to add:

In my region the cheapest 4090 (in the last year) was 1719,99 €;
the cheapest 7900 XTX was 879,00 €.

That's basically double the price and, in my book, an easy choice for the 7900 XTX.

BUT:

There's also the 4080S, which in current games has very similar performance, for 999,00 €.

So... I have to ask myself whether saving 120 € is worth it.

Personally, I would still go for the 7900 XTX; the reason is curiosity about AMD cards (I've never had one).

But most customers won't think that way...


I think currently the pain points for AMD GPUs are:

The MSRP at launch is most of the time too high -> bad reviews
The perception of GPU driver support/stability (that's a hard one to improve)
RT as a whole is rather weak on the AMD side (personally I don't really care)
FSR is most of the time quite inconsistent; in some games it looks quite similar to DLSS, but in a lot of them it looks worse.
Quite a lot of people don't even consider AMD as an option
#14
Space Lynx
Astronaut
Would love for Lisa Su to get up there and be like 'lulz got ya, here is the 8900 xtx and it destroys the 4090, have fun gamers'


let me have my dreams lads, let me have them. :roll:
#15
LittleBro
TSiAhmat said: There's also the 4080S, which in current games has very similar performance, for 999,00 €.
I'd go for the 7900 XTX because of the VRAM and the lower price.
TSiAhmat said: The MSRP at launch is most of the time too high -> bad reviews
This also applies to Nvidia. For instance, the RTX 4080 had an MSRP of $1,200. With the 4080 Super (+1% performance over the RTX 4080), Nvidia realized it was way too much and lowered the MSRP to $1,000.
TSiAhmat said: The perception of GPU driver support/stability (that's a hard one to improve)
I rarely have problems with drivers on AMD GPUs. Maybe I'm lucky; I don't update drivers every time a new version is released.
I only update when they address a serious bug that concerns me, or for a new game. I have only updated twice in my life because of bugs.

Any Linux user will tell you to choose AMD over Nvidia because of support and drivers. What I personally hated about Nvidia drivers since the GTX 600 series was the forced telemetry. I hate espionage; this is what Microsoft does regularly. Telemetry should be disabled by default and turned on only with the explicit approval of the user.
TSiAhmat said: RT as a whole is rather weak on the AMD side (personally I don't really care)
Me neither. Sure, it's nice and it improves realism, but the performance cost is immense. Please don't tell me to turn upscaling on. I won't enable a feature that improves the image and then turn on DLSS, which cripples the image to make the framerate smooth. It's counterproductive. I'd rather not use RT at all.
TSiAhmat said: FSR is most of the time quite inconsistent; in some games it looks quite similar to DLSS, but in a lot of them it looks worse.
I play only at native resolution, so the image isn't distorted or changed artificially. In other words, I play the game as the devs meant it to be played. God, I sound like Nvidia.
TSiAhmat said: Quite a lot of people don't even consider AMD as an option
That has historical reasons. I don't want to discuss it here, but you can read other threads and form your own opinion.
#16
TSiAhmat
LittleBro said: This also applies to Nvidia. For instance, the RTX 4080 had an MSRP of $1,200. With the 4080 Super (+1% performance over the RTX 4080), Nvidia realized it was way too much and lowered the MSRP to $1,000.
Yes, but they don't really have to care about reviews; almost everyone knows them. A bad review hurts AMD a lot more.
LittleBro said: I rarely have problems with drivers on AMD GPUs. Maybe I'm lucky; I don't update drivers every time a new version is released.
I only update when they address a serious bug that concerns me, or for a new game. I have only updated twice in my life because of bugs.

Any Linux user will tell you to choose AMD over Nvidia because of support and drivers. What I personally hated about Nvidia drivers since the GTX 600 series was the forced telemetry. I hate espionage; this is what Microsoft does regularly. Telemetry should be disabled by default and turned on only with the explicit approval of the user.
I am not saying they are bad, I am just saying the perception of AMD drivers is a lot worse in comparison to Nvidia.

I have friends with a 7900 XTX and 7800 XTs who have no complaints (about drivers or anything similar) to this day. But that doesn't invalidate the experience of others.

I've also heard Nvidia drivers for Linux are quite good (even if they aren't open source).
Still, I share your opinion on telemetry.
LittleBro said: I play only at native resolution, so the image isn't distorted or changed artificially. In other words, I play the game as the devs meant it to be played. God, I sound like Nvidia.
Similar for me: I personally try to play at native resolution (if upscaling looks bad), but sometimes I can't tell the difference.
#17
Neo_Morpheus
Bwaze"to discuss how AMD is expanding its leadership across PCs and gaming"

And will they be commenting on why they are almost pulling out of GPU sector, abandoning high end and trailing behind Nvidia more and more with each generation?
Nice try, but no. AMD won't make Ngreedia lower their prices, since you will end up giving your money to Ngreedia anyway.
Space Lynx said: Would love for Lisa Su to get up there and be like 'lulz got ya, here is the 8900 xtx and it destroys the 4090, have fun gamers'

let me have my dreams lads, let me have them. :roll:
And yet the sheep will still buy Ngreedia.

I wouldn't blame them if they abandoned the gamer market and concentrated on servers and consoles, since, as you can see, the second their GPUs are mentioned the negative comments start flowing, from the usual "lack of this or lack of that" to the nonsense about bad drivers.
#18
DaemonForce
I expect the price to be within reason.
Market says $390-440 is the correct range for the current-gen x800 XT series and -$10 on the previous gen.
So naturally I would expect $400-450 to be the hot sale point for the 8800 XT; no idea if we'll see it right away.
If there's minimal uplift, like between the 6800 XT and 7800 XT, then I hope the features are much better.

Great coolers like the Hellhound, better VRMs, improvements to h264/hevc/av1 (dual) encodes, Win10 driver compatibility, maybe more dual HDMI.
We are all expecting fair raster improvements, but if 6000->7000 is any example, that compass is still spinning wildly.
There really isn't a whole lot I look for in a new video card, so I'm not holding out for any miracles.
BUT... if the 8800 XT can deliver a solid 1440p120 experience in DX12 games, that would be HUGE.

A quick reminder of where things stand:

[Passmark G3D Mark comparison chart]

So... yeah. Not that hopeful. Either performance isn't great and we get a repackaged 7900 XT, or the card is marginally better with the price coming unglued from reality.
Which way Western rabbit? :wtf:
#19
AnotherReader
DaemonForce said: I expect the price to be within reason.
Market says $390-440 is the correct range for the current-gen x800 XT series and -$10 on the previous gen.
So naturally I would expect $400-450 to be the hot sale point for the 8800 XT; no idea if we'll see it right away.
If there's minimal uplift, like between the 6800 XT and 7800 XT, then I hope the features are much better.

Great coolers like the Hellhound, better VRMs, improvements to h264/hevc/av1 (dual) encodes, Win10 driver compatibility, maybe more dual HDMI.
We are all expecting fair raster improvements, but if 6000->7000 is any example, that compass is still spinning wildly.
There really isn't a whole lot I look for in a new video card, so I'm not holding out for any miracles.
BUT... if the 8800 XT can deliver a solid 1440p120 experience in DX12 games, that would be HUGE.

A quick reminder of where things stand:

[Passmark G3D Mark comparison chart]

So... yeah. Not that hopeful. Either performance isn't great and we get a repackaged 7900 XT, or the card is marginally better with the price coming unglued from reality.
Which way Western rabbit? :wtf:
Those figures don't look reliable. In what world is the 7900 XT only 5% faster than the 6800 XT in DirectX 12 games? As for the price point, given that the 7900 XT hasn't gone much below $550 even on Black Friday, I expect prices to be in the $549 to $599 range.
#20
DaemonForce
AnotherReader said: Those figures don't look reliable. In what world is the 7900 XT only 5% faster than the 6800 XT in DirectX 12 games?
The measurement is pulled from submissions to the Passmark G3D Mark bench suite. I prefer to directly compare the results.
So far it has been a really good tell for most cards on the list. Still, there are some weird outliers, just like there are weird users.
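
For what it's worth, a tiny illustration (with made-up scores) of how a few throttled or misconfigured submissions can drag a crowd-sourced average down, while the median stays closer to a typical result:

```python
import statistics

# Hypothetical bench submissions: mostly healthy runs plus a few bad ones.
scores = [215, 220, 222, 225, 228, 120, 130, 145]

print(f"mean:   {statistics.mean(scores):.1f}")    # 188.1 - skewed by bad runs
print(f"median: {statistics.median(scores):.1f}")  # 217.5 - closer to typical
```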
AnotherReader said: As for the price point, given that the 7900 XT hasn't gone much below $550 even on Black Friday, I expect prices to be in the $549 to $599 range.
The only card I've seen go below that is the XFX Speedster MERC310, at around $530. The rest bottomed out around $625, so that might be it.
#21
AnotherReader
DaemonForce said: The measurement is pulled from submissions to the Passmark G3D Mark bench suite. I prefer to directly compare the results.
So far it has been a really good tell for most cards on the list. Still, there are some weird outliers, just like there are weird users.

The only card I've seen go below that is the XFX Speedster MERC310, at around $530. The rest bottomed out around $625, so that might be it.
$530 is really good, but that makes me more certain that a price of $499 is very unlikely. As for Passmark, you would be better off using TPU's recent reviews. According to the most recent GPU review, the 7900 XT is 35% faster than the 6800 XT at 1440p; at 4K, the margin widens to 39%.
#22
wolf
Better Than Native
LittleBro said: I play the game as the devs meant it to be played
This one gets me, as if the native resolution of any given panel, on any given setup, is the way the devs want you to play the game, and any other way absolutely isn't. To each their own, but what a hill to die on. Never mind the myriad other tricks, hacks and shenanigans used to present what's up on the screen. I respect personal preference, but I can't say I agree with the rationale.
#23
Kyan
LittleBro said: I play only at native resolution, so the image isn't distorted or changed artificially. In other words, I play the game as the devs meant it to be played. God, I sound like Nvidia.
I wouldn't want to play a game "as the devs meant it to be played" after recently watching this video :(

The thumbnail doesn't look great, but the content is good and well explained by a game dev.

[embedded video]
#24
LittleBro
DaemonForce said: BUT... if the 8800 XT can deliver a solid 1440p120 experience in DX12 games, that would be HUGE.
That would be a miracle, not just huge. :D Even an RTX 4090 can't do 1440p @ 120 FPS at native in some newer games.
wolf said: This one gets me, as if the native resolution of any given panel, on any given setup, is the way the devs want you to play the game, and any other way absolutely isn't. To each their own, but what a hill to die on. Never mind the myriad other tricks, hacks and shenanigans used to present what's up on the screen. I respect personal preference, but I can't say I agree with the rationale.
If a studio plans to release a game that is meant to be played with DLSS/FSR/XeSS, that may be taken into account when the graphics assets are being made. There's no point in putting extra effort into ultra-fine textures or other graphics materials, because god only knows how that particular upscaling algorithm will render that particular pixel in that particular situation. It may look totally different from the original. Look at the Wukong screenshots. That game is extremely taxing on the GPU and basically unplayable without upscaling, while many other games look better (at least texture-wise) and offer much smoother gameplay at native.
#25
wolf
Better Than Native
LittleBro said: If a studio plans to release a game that is meant to be played with DLSS/FSR/XeSS, that may be taken into account when the graphics assets are being made. There's no point in putting extra effort into ultra-fine textures or other graphics materials, because god only knows how that particular upscaling algorithm will render that particular pixel in that particular situation. It may look totally different from the original. Look at the Wukong screenshots. That game is extremely taxing on the GPU and basically unplayable without upscaling, while many other games look better (at least texture-wise) and offer much smoother gameplay at native.
All I can say here is that I strongly disagree. I've never found upscaling (most notably DLSS) to alter the image in any significant way: no difference to assets or textures, no alteration to the art, style or feel of the game at all. Native is such an arbitrary term too, as it applies uniquely to every monitor. For example, take running a game on a Steam Deck at 1280x800, then on a 4K monitor using DLSS Quality; the 800p "native" presentation sure ain't going to look better, or more authentic, not from the heap of testing and experience I have at least. Like you say, however, this might be taken into account during development when a game intends to ship with upscaling options, which most AAA/heavy games do anyway.

I've always subscribed to the notion that what the developer wants is for everyone to get the best possible experience playing their game. There's no one-size-fits-all config or way to run it 'as the devs meant for it to be played'; they give you as many options as possible so that, from Steam Decks to halo-part builds, we can all enjoy it the best we can.