Monday, October 14th 2024

AMD to Reduce RDNA 4 "Navi 44" Chip Package Size

Chip packages of the "Navi 4x" generation of GPUs could be generationally smaller than their predecessors, according to leaked package dimensions of the "Navi 44" chip put out by Olrak29_. With its next-generation Radeon RX gaming GPUs based on the RDNA 4 graphics architecture, AMD has decided to focus on gaining market share in the performance and mainstream segments, ceding the enthusiast segment to NVIDIA. As part of this effort, the company is making RDNA 4 efficient at every level—architecture, process, and package.

At the architecture level, RDNA 4 is expected to improve performance, particularly the performance cost of ray tracing, through a more specialized ray tracing hardware stack. At the process level, AMD is expected to switch to a more efficient foundry node, with some reports suggesting a TSMC 4 nm node such as N4P or N4X. For a mid-range GPU like the "Navi 44," which succeeds the "Navi 23" and "Navi 33," this means a rather big leap from the 7 nm and 6 nm DUV nodes. The leak suggests a smaller package, measuring 29 mm x 29 mm. In comparison, the "Navi 23" package measures 35 mm x 35 mm. The smaller package could make these GPUs a better fit for gaming notebooks, where mainboard PCB real estate is at a premium.
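For a sense of scale, here is a quick back-of-the-envelope calculation on the leaked dimensions (assuming square packages, as both sets of measurements suggest):

```python
# Compare the rumored "Navi 44" package footprint against "Navi 23".
navi44_side_mm = 29  # leaked "Navi 44" package side length
navi23_side_mm = 35  # known "Navi 23" package side length

navi44_area = navi44_side_mm ** 2  # 841 mm^2
navi23_area = navi23_side_mm ** 2  # 1225 mm^2

reduction = 1 - navi44_area / navi23_area
print(f"Navi 44 package: {navi44_area} mm^2")
print(f"Navi 23 package: {navi23_area} mm^2")
print(f"Board footprint reduction: {reduction:.1%}")  # ~31.3%
```

A roughly 31% smaller board footprint is what makes the notebook argument plausible: it frees PCB area for memory routing and power delivery around the GPU.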
Source: Olrak29_ (Twitter)

41 Comments on AMD to Reduce RDNA 4 "Navi 44" Chip Package Size

#1
Launcestonian
This will be interesting when retail products launch & are bench tested. I would still consider RDNA 4 upgrade if & only if the power efficiency in relation to performance of a reference 7900 XTX is competitive.
Posted on Reply
#2
eidairaman1
The Exiled Airman
Who is olrak_29? I have no twitter/X account and never will
Posted on Reply
#3
Solaris17
Super Dainty Moderator
eidairaman1: Who is olrak_29? I have no twitter/X account and never will
It doesn't even matter; people idolize these people as prophets. Back in the day you would be lucky if the account got sourced while people in forums argued about whether the information was good. Now people hang off their every word. Crazy.
Posted on Reply
#4
Zazigalka
ray tracing on entry level cards is useless. change my mind.
Posted on Reply
#5
Dirt Chip
The proper strategy for AMD imo, just like 3 gens ago with the 5xxx.
Shame they zig-zag that often; they might have succeeded by now. Instead they try to fight the (stupid) top-end RT battle against NV, who wrote the rules. A winless battle.

Focus on raster only, lots of RAM, and stay 20-30% cheaper than NV. You will gain your market share in one gen's time.
Posted on Reply
#6
DaedalusHelios
Dirt Chip: The proper strategy for AMD imo, just like 3 gens ago with the 5xxx.
Shame they zig-zag that often; they might have succeeded by now. Instead they try to fight the (stupid) top-end RT battle against NV, who wrote the rules. A winless battle.

Focus on raster only, lots of RAM, and stay 20-30% cheaper than NV. You will gain your market share in one gen's time.
Isn't that what the prices already show? Or are we talking MSRP at launch?
Posted on Reply
#7
Dirt Chip
DaedalusHelios: Isn't that what the prices already show? Or are we talking MSRP at launch?
At launch, they must have the 'wow' factor of low cost because they don't have the 'wow' (as if..) of top-end max performance.
Intel can and should do the same btw.
Posted on Reply
#8
Minus Infinity
Zazigalka: ray tracing on entry level cards is useless. change my mind.
Yet Nvidiots and the media will crucify them if they don't offer it.
Posted on Reply
#9
Beginner Macro Device
Zazigalka: ray tracing on entry level cards is useless. change my mind.
Not to change your mind but RT is just a part of architecture. Building two different arches for one generation, one with RT and another sans, is expensive and doesn't make a lot of sense so all cards have the same % of RT cores. Useless or not, this is how it's gonna work.

Still, one might use light RT effects if they are content with low FPS, or do it in older games where even an RX 7600 will suffice, or only enable RT for screenshots etc.
btarunr: TSMC 4 nm
This won't allow anything better than parity with Ada in FPS per W, but... well, there's no "but."
Posted on Reply
#10
Chomiq
Minus Infinity: Yet Nvidiots and the media will crucify them if they don't offer it.
I mean, they got some sort of ray tracing hardware on the PS5 Pro, and that's supposed to use RDNA 4 elements.
Posted on Reply
#11
AusWolf
A similarly equipped GPU made on a smaller manufacturing node ends up being smaller. Go figure! :rolleyes:
Posted on Reply
#12
DaedalusHelios
Dirt Chip: At launch, they must have the 'wow' factor of low cost because they don't have the 'wow' (as if..) of top-end max performance.
Intel can and should do the same btw.
Top end price/performance ratio is terrible with Nvidia right now. $2.2k USD for what? 10% performance over cards $1.4k cheaper. RTX 4090 cards are getting a little close to the new card's release date too. They need to move that inventory. The sales numbers right now show that pricing is too inflated.
Posted on Reply
#13
Apocalypsee
RT's performance impact needs to be less than 50% of raster if they want to be competitive. Here's hoping that smaller die sizes mean lower wattage; just don't pull the 7600-to-7600 XT stunt of clocking a mere 100 MHz higher at a higher TDP.
Posted on Reply
#14
DaedalusHelios
Apocalypsee: RT's performance impact needs to be less than 50% of raster if they want to be competitive. Here's hoping that smaller die sizes mean lower wattage; just don't pull the 7600-to-7600 XT stunt of clocking a mere 100 MHz higher at a higher TDP.
For the market that really needs ray tracing, sure. More people care about FSR and DLSS advancement in regards to quality, since that is where most people will be seeing gains. Most income comes from midrange cards last time I checked, and ray tracing performance is going to be slideshow-level fps for all modern games turned up on a midrange card right now. Flagship card ownership is fairly rare.
Posted on Reply
#15
Vayra86
Dirt Chip: The proper strategy for AMD imo, just like 3 gens ago with the 5xxx.
Shame they zig-zag that often; they might have succeeded by now. Instead they try to fight the (stupid) top-end RT battle against NV, who wrote the rules. A winless battle.

Focus on raster only, lots of RAM, and stay 20-30% cheaper than NV. You will gain your market share in one gen's time.
They're just gonna do another Polaris and Nvidia will have no issue undercutting them if they do succeed at midrange.

And then in two years time we wonder why they can't compete even with an x70.

This not doing high end is a loser's strategy. It never lasts. The only logical result is AMD will zig zag back into high end costing them another 5-6 years.

The whole reason Nvidia dominates now is because they kept progressing while AMD was rebranding Hawaii XT and the 7970 for the second time, and then fumbled onwards into Fury X and Vega, only to arrive at RDNA1 far too late to really get RDNA2 going with RT proper. At that point their focus was still on the basic architecture; RT was 'a bonus feature'. RDNA3: focus on chiplets, RT was an afterthought. They're just too slow, and throughout all of this, the elephant in the room is not that you need to keep making last gen's performance GPUs, you need to progress your stuff ever forward. Every time AMD practically ground to a halt, they lost market share. Polaris's revenue success came not from gaming, but from crypto.

Intel isn't really clawing in market share either with their A770, despite being dirt cheap.
Posted on Reply
#16
_roman_
Zazigalka: ray tracing on entry level cards is useless. change my mind.
That's the whole point. That's why I did not care about ray tracing with my Radeon 7800 XT.

Those two free AMD trash games from the giveaway when I purchased my Ryzen 7000 CPU/GPU had ray tracing. Just bad implementation or lighting. It was not worth the wasted 70 watts on the GPU, according to the Windows 11 Pro graphics card driver. I do not see a difference / an improvement with ray tracing enabled. The game still looks like some nonsense. The best example is that recent game article here. I look at the pictures and wonder if people do not know how old houses looked.
Posted on Reply
#17
Minus Infinity
AMD could run a new advertising campaign "Check out the size of my package"
Posted on Reply
#18
Craptacular
Zazigalka: ray tracing on entry level cards is useless. change my mind.
A 4060 has the performance of a 2080, you can play a lot of ray traced games at 60 fps at 1080p or higher without having DLSS enabled.
Posted on Reply
#19
chstamos
Vayra86: They're just gonna do another Polaris and Nvidia will have no issue undercutting them if they do succeed at midrange.

And then in two years time we wonder why they can't compete even with an x70.

This not doing high end is a loser's strategy. It never lasts. The only logical result is AMD will zig zag back into high end costing them another 5-6 years.

The whole reason Nvidia dominates now is because they kept progressing while AMD was rebranding Hawaii XT and the 7970 for the second time, and then fumbled onwards into Fury X and Vega, only to arrive at RDNA1 far too late to really get RDNA2 going with RT proper. At that point their focus was still on the basic architecture; RT was 'a bonus feature'. RDNA3: focus on chiplets, RT was an afterthought. They're just too slow, and throughout all of this, the elephant in the room is not that you need to keep making last gen's performance GPUs, you need to progress your stuff ever forward. Every time AMD practically ground to a halt, they lost market share. Polaris's revenue success came not from gaming, but from crypto.

Intel isn't really clawing in market share either with their A770, despite being dirt cheap.
Isn't the A770 slightly slower than the 6600XT while costing about the same? How does that qualify as "dirt cheap"? Dirt cheap would be 150 dollars for their performance (at which point they would be selling at a massive loss).
Posted on Reply
#20
AusWolf
Minus Infinity: AMD could run a new advertising campaign "Check out the size of my package"
Yeah, but they wouldn't want to proudly show how tiny it is with that campaign, would they? :D
Posted on Reply
#21
_roman_
The graphics card driver is also part of the purchase. Therefore an Intel card should cost 40% of a same-performance AMD card.
Posted on Reply
#22
AusWolf
_roman_: The graphics card driver is also part of the purchase. Therefore an Intel card should cost 40% of a same-performance AMD card.
If that was true, then Moore Threads cards should be free. :D
Posted on Reply
#23
chstamos
_roman_: The graphics card driver is also part of the purchase. Therefore an Intel card should cost 40% of a same-performance AMD card.
Yeah, I never got the "intel value king" proposition. I would be willing to purchase an intel card, with all their disadvantages, if they were "value kings" for gaming, but they're actually a downright lousy proposition. Pricing at AMD/nVidia levels for worse compatibility, worse drivers, and worse performance is not my idea of value for money just because the card has more vram.
Posted on Reply
#24
Dwarden
one day ... one may see a 1440p-capable 75 W GPU without the need for external power :)
Posted on Reply
#25
OneSorcerer
Please release an 8600 XT performing like a 7700 XT, with 16 GB and <= 215 mm length, for my next ITX build.
Posted on Reply