
AMD's RDNA 4 GPUs Could Stick with 18 Gbps GDDR6 Memory

Joined
Jan 14, 2019
Messages
12,337 (5.77/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Ray-tracing.


Counter-Strike 2.

1. The 7900 XTX is not a competitor to the 4090. They're not even remotely in the same price range.
2. Ray tracing doesn't run properly on anything except the 4090.

It is a problem, because the halo product sells all the other siblings.
I would rather buy a Radeon RX 9900 XT when it appears with RDNA 5 than the low-to-mid-range RX 8800 XT, which succeeds the RX 5700 XT - RX 6600 XT - RX 7600 XT line.
The post I quoted proves otherwise.

The 8800 XT is positioned at roughly 7900 XT level according to rumours. Where you get the idea that it's a 7600 XT successor is beyond me.
 
Joined
Aug 21, 2013
Messages
1,898 (0.46/day)
Ray-tracing.
As long as all our games are raster-RT hybrids, this is less of an issue. Until we start getting pure RT games that run exclusively on RT hardware, I very much doubt RT becomes the deciding factor. Also, everyone knows Nvidia is faster in RT, but I would not call AMD's RTX 30 series-level RT performance that bad.
Counter-Strike 2.
What's a GRE got to do with this?
It is a problem, because the halo product sells all the other siblings.
With Nvidia's mindshare, I very much doubt the RX 9900 XTX or whatever will be able to outsell or outperform the 5090, even if it costs $1,000 compared to Nvidia's $2,000.
People spending $1,000+ generally already go for the best. For years, Nvidia has been selling their cards on mindshare and software more than hardware.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
The 8800 XT is positioned at roughly 7900 XT level according to rumours. Where you get the idea that it's a 7600 XT successor is beyond me.

Die size is rumoured to be around 200 mm², in the ballpark of the RX 7600 XT.

RX 7900 XT level performance will not be reached with 500-ish GB/s of memory bandwidth. Forget it.
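To put rough numbers on that, here is a minimal sketch (in Python, purely as an illustration; the 256-bit/18 Gbps figures are the rumoured RDNA 4 config, and the 7900 XT numbers are its published 320-bit/20 Gbps spec):

```python
# Peak memory bandwidth in GB/s: (bus width in bits / 8) x per-pin data rate in Gbps
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# Rumoured RDNA 4 config: 256-bit bus, 18 Gbps GDDR6
print(bandwidth_gb_s(256, 18))  # 576.0 GB/s -- the "500-ish GB/s" above
# RX 7900 XT for comparison: 320-bit bus, 20 Gbps GDDR6
print(bandwidth_gb_s(320, 20))  # 800.0 GB/s
```

So the rumoured part would have roughly 72% of the 7900 XT's raw bandwidth; whether a larger cache could close that gap is the open question.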

What's a GRE got to do with this?

The latest graphics card TPU review: https://www.techpowerup.com/review/?category=Graphics+Cards&manufacturer=&pp=25&order=date
Used because it shows the current state of affairs.
 
Joined
Jun 2, 2017
Messages
9,127 (3.34/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
With Nvidia's mindshare, I very much doubt the RX 9900 XTX or whatever will be able to outsell or outperform the 5090, even if it costs $1,000 compared to Nvidia's $2,000.
People spending $1,000+ generally already go for the best. For years, Nvidia has been selling their cards on mindshare and software more than hardware.
How does Nvidia mindshare work? When I built my first PC, I used an AMD CPU, remembered those Super Bowl commercials, and got myself a GTS 450. Years later, I had moved to AMD and was at the PC store. The tech told me I should check out Sapphire, and I looked at him like he had two heads. Thankfully, Nvidia did enough to me personally to make me not want to use them, and to realize that Sapphire gets some of the best-binned chips for AMD. Now I enjoy gaming, and the only setting I usually change is turning off Vsync. Maybe I have been lucky, but I am enjoying my current PC more than ever.
 
Joined
Jan 14, 2019
Messages
12,337 (5.77/day)
Location
Midlands, UK
Die size is rumoured to be around 200 mm², in the ballpark of the RX 7600 XT.
What's die size got to do with it? Based on this comparison, the 4060 should sell for $150 and the 4080 for $500, they're such tiny chips.

RX 7900 XT level performance will not be reached with 500-ish GB/s of memory bandwidth. Forget it.
I don't rate the importance of memory bandwidth that highly, but we'll see.
 
Joined
Feb 21, 2006
Messages
2,221 (0.32/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Cc.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) XFX Radeon RX 7900 XTX Magnetic Air (24.10.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 20TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c 5800X3D https://valid.x86.fr/b7d
Die size is rumoured to be around 200 mm², in the ballpark of the RX 7600 XT.

RX 7900 XT level performance will not be reached with 500-ish GB/s of memory bandwidth. Forget it.

So are you expecting something between 7800 XT and 7900 GRE performance, since it will have a 256-bit memory bus?
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
What's die size got to do with it? Based on this comparison, the 4060 should sell for $150 and the 4080 for $500, they're such tiny chips.

Yes, but Nvidia is a monopoly.

So are you expecting something between 7800 XT and 7900 GRE performance, since it will have a 256-bit memory bus?

Not only because of the memory bus itself, but we shall see. Let's wait. Honestly, it's better to expect a failure than a success; in the end, you'd rather be pleasantly surprised than terribly disappointed.
 
Joined
Jun 2, 2017
Messages
9,127 (3.34/day)
Do we have any specs for these cards?
 
Joined
Aug 21, 2013
Messages
1,898 (0.46/day)
RX 7900 XT level performance will not be reached with 500-ish GB/s of memory bandwidth. Forget it.
With the GRE? No, of course not. But with a new-generation product? Yes, it's very possible.
How does Nvidia mindshare work? When I built my first PC, I used an AMD CPU, remembered those Super Bowl commercials, and got myself a GTS 450. Years later, I had moved to AMD and was at the PC store. The tech told me I should check out Sapphire, and I looked at him like he had two heads. Thankfully, Nvidia did enough to me personally to make me not want to use them, and to realize that Sapphire gets some of the best-binned chips for AMD. Now I enjoy gaming, and the only setting I usually change is turning off Vsync. Maybe I have been lucky, but I am enjoying my current PC more than ever.
You're asking the wrong person. I've owned roughly the same number of Nvidia and AMD (ATI back then) GPUs.

My worst experience was with Nvidia, during their bumpgate scandal, when my 8800 GTS 320 kept dying and had to be revived in an oven, albeit temporarily. It was also a second-hand EVGA product, so I had no warranty either. Currently I'm on a 2080 Ti that I managed to buy for a reasonable price before the latest crypto boom sent prices sky-high; I also made more than $1k on it by mining on the side. If I had to buy a new card now, it would likely be AMD, as my modded games require more VRAM, and I despise the new power connector Nvidia mandates even for the 4070 Super, a <250 W card that could easily be powered by a single 8-pin.
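For context on the 8-pin remark, a quick sketch of the power-budget arithmetic, assuming the PCIe spec limits (75 W from the slot, 150 W from one 8-pin connector) and the 4070 Super's 220 W rated power:

```python
# PCI Express power delivery limits (per spec)
SLOT_W = 75        # PCIe x16 slot supplies up to 75 W
EIGHT_PIN_W = 150  # one 8-pin PCIe connector supplies up to 150 W

budget = SLOT_W + EIGHT_PIN_W        # 225 W available in total
rtx_4070_super_tgp = 220             # W, the card's rated total graphics power
print(budget >= rtx_4070_super_tgp)  # True: slot plus one 8-pin covers it
```

It leaves only 5 W of headroom at rated power, but within spec limits the claim holds.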

My fear when buying Nvidia is the next feature they lock me out of when they release their next series. I've already been locked out of ReBAR, which the 30 series introduced, and DLSS Frame Generation, which the 40 series introduced. I'm sure the 50 series will further widen the gap.
 
Joined
Jun 2, 2017
Messages
9,127 (3.34/day)
With the GRE? No, of course not. But with a new-generation product? Yes, it's very possible.

You're asking the wrong person. I've owned roughly the same number of Nvidia and AMD (ATI back then) GPUs.

My worst experience was with Nvidia, during their bumpgate scandal, when my 8800 GTS 320 kept dying and had to be revived in an oven, albeit temporarily. It was also a second-hand EVGA product, so I had no warranty either. Currently I'm on a 2080 Ti that I managed to buy for a reasonable price before the latest crypto boom sent prices sky-high; I also made more than $1k on it by mining on the side. If I had to buy a new card now, it would likely be AMD, as my modded games require more VRAM, and I despise the new power connector Nvidia mandates even for the 4070 Super, a <250 W card that could easily be powered by a single 8-pin.

My fear when buying Nvidia is the next feature they lock me out of when they release their next series. I've already been locked out of ReBAR, which the 30 series introduced, and DLSS Frame Generation, which the 40 series introduced. I'm sure the 50 series will further widen the gap.
It was rhetorical.

That is exactly what happened to me, but the kicker was that they did not even inform me when they disabled SLI on the GTS 450. Imagine how stupid I felt after I had sold them to a friend.
 
Joined
Jun 13, 2012
Messages
1,388 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
The thing is, even today AMD does not have even half the money for R&D for both sectors they are in. I feel people are looking at the glass-half-empty argument without looking at the positives.

1. For the life of the PS5/PS5 Pro and Xbox, games will be created on the Ryzen/Radeon platform. As they age, programming on those will improve, which will mean an advantage for console ports on AMD PCs. It is already happening.

2. AMD is making crazy money on their APUs. The Steam Deck is consistently in the top 10 in global sales on the Steam platform. The release of the MSI Claw (even if it is for future-proofing) is evidence of how far AMD has come in the APU space. This will also mean more programming for Ryzen/Radeon as games start to get ported (likely from mobile) onto these platforms. In fact, I am confident that someday soon you will be able to buy a Ryzen-based handheld on Amazon with those retro ROMs like PS/PS2/Dreamcast and arcade. I have already built one with my 5600G (desktop).
So it depends on Intel, and on how fast (and whether) they integrate their Arc GPU tech into the CPU package. Those two segments might not be money-makers for AMD for much longer. Intel has only been serious about GPUs for what, one year? Maybe two at most, and they already have GPUs whose pricing made waves, along with major performance strides with each driver update. They could be a real threat to that market in a year or two if they put the effort into it.
 
Joined
Jun 2, 2017
Messages
9,127 (3.34/day)
So it depends on Intel, and on how fast (and whether) they integrate their Arc GPU tech into the CPU package. Those two segments might not be money-makers for AMD for much longer. Intel has only been serious about GPUs for what, one year? Maybe two at most, and they already have GPUs whose pricing made waves, along with major performance strides with each driver update. They could be a real threat to that market in a year or two if they put the effort into it.
Of course nothing lasts forever, but Intel is still at least two generations away in the iGPU space. AMD will be releasing new APUs as well, and Intel has to solve the power draw/performance problem too. I am not saying they can't, just not yet. As an additional thought, there is nothing preventing Sony or MS from jumping into this space with portable PS and Xbox/Game Pass handhelds. As it stands right now, the MSI Claw is the only handheld that does not come with AMD.

When we start to get laptops with just these APUs in them, I expect they will sell well too. Acer has announced one for $599 with an 8700G laptop chip.
 
Joined
Mar 13, 2021
Messages
471 (0.35/day)
Processor AMD 7600x
Motherboard Asrock x670e Steel Legend
Cooling Silver Arrow Extreme IBe Rev B with 2x 120 Gentle Typhoons
Memory 4x16Gb Patriot Viper Non RGB @ 6000 30-36-36-36-40
Video Card(s) XFX 6950XT MERC 319
Storage 2x Crucial P5 Plus 1Tb NVME
Display(s) 3x Dell Ultrasharp U2414h
Case Coolermaster Stacker 832
Power Supply Thermaltake Toughpower PF3 850 watt
Mouse Logitech G502 (OG)
Keyboard Logitech G512
AMD will have a lead in the iGPU space for a while, I think, given their console know-how, their success there, and things like Strix Point.

Intel, I would hope, will focus on desktop/datacenter for Celestial, and then we'll see Druid/E-series parts funnel that down into iGPU power/efficiency.
Look at how Alchemist has performed and developed. I am nearly 100% sure a massive accidental bottleneck was built into the hardware, and I would guess it is in the scheduler/load-store functions: moving from 1080p to 1440p in most games on Arc costs only single percentage points of performance, RT on or off. Yet on nearly all other manufacturers' cards you see a respectable drop in performance, or should I say a respectable gain when dropping the resolution.
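To illustrate why that flat scaling looks suspicious, here is a minimal sketch of the pixel-count arithmetic, assuming (purely for illustration) that a fully GPU-bound game's framerate scales inversely with pixels rendered:

```python
# Pixels per frame at each resolution
px_1080p = 1920 * 1080  # 2,073,600
px_1440p = 2560 * 1440  # 3,686,400

ratio = px_1440p / px_1080p    # ~1.78x more pixels at 1440p
expected_drop = 1 - 1 / ratio  # ~44% lower fps if purely pixel-bound
print(f"{ratio:.2f}x pixels, ~{expected_drop:.0%} expected fps drop")
```

A single-digit drop where roughly 44% would be expected suggests the limiter at 1080p is something other than shading throughput.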

Get that fixed for Celestial/Druid, and then they have a real contender in the iGPU space.


I suspect that with RDNA 4 and the now-cancelled top-end offering, they either went too far on the chiplet design and realised it would need a full rework (RDNA 5 / a successor arch?), or perhaps they had intended HBM3/e for the top-end parts, similar to the MI300, but the AI craze has just priced them out of the market again.
 
Joined
Jun 1, 2010
Messages
377 (0.07/day)
System Name Very old, but all I've got ®
Processor So old, you don't wanna know... Really!
Well, if you call losing only to the 4090 outright (in both raster and RT) struggling, then I'd say AMD is not doing so badly at 4080 performance.
Not to mention that the sole 4090 users are either prosumers like content creators, broke 3D design folks unable to buy themselves Pro-line GPUs, AI freaks who want to enter that nonsense bubble at all costs, or the "1337 gamur" crowd with more money than intelligence, showing off their soapy upscaled 4K DLSS screenshots somewhere on forums and social networks. Most people do not even look at that segment, enjoying their "modest" gaming with a 4080/7900 XTX at most. Most people look forward to the 7800 XT/4070 Super, and the sales show that.

So I don't see any trouble in leaving that tiny segment to ultra-rich kids and selfish manchildren, considering that among all premium products, AMD makes more profit from enterprise anyway. There is no point for them to sell many premium GPU products when they can sell workstation cards instead of top-tier "gamer" counterparts, to people who need them and will gladly pay that premium. And gamers can keep up with something akin to what is being used in consoles (RX 6700) anyway.
But most of the gamer segment comes from low/mid-end GPUs anyway. There is no point investing in something that is basically a placeholder. And even if there were no successor like RDNA 5, AMD could live on just such low-cost cards until they feel able to release something top-end. They did it with the RX 580, Radeon VII (Vega II), and RX 5700, until they made the RX 6800 XT/RX 6900, which sold like hot cakes and were basically on par with their Nvidia rivals.
Thus, there is absolutely no point in putting expensive VRAM into such a temporary, low-cost product. It is more reasonable to use the newest GDDR7 along with breakthrough solutions that may, or may not, be RDNA 5.

As for Intel, I can't say they are absolutely hopeless, though it's not guaranteed that they will achieve great things with Battlemage. However, as much as I don't like Intel, I must admit they have already made significant progress in their GPU division. I dare say even bigger progress than AMD made in a decade, but within a couple of years instead. Of course, they have a far bigger R&D budget, but still. From what I've seen and read, Intel's encoders/decoders are miles better than AMD's, even on the lowest-end cards. Their RTRT is also better. And this while Intel is in a huge decline, selling their assets left and right. AMD, on the other hand, is blooming, but still reluctant to invest in consumer areas like Radeon, because they went all-in on enterprise, which doesn't rely as heavily on AMD's drivers and doesn't need the streaming/decoding capabilities anyway. So AMD can invest less while having more. At this point, AMD seems even greedier than Nvidia: lacking in every area, but still having the hubris to ask a premium for absent features/options.
 
Joined
Apr 14, 2022
Messages
745 (0.78/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
AMD GPUs' performance is not bad. Actually, it is great.
The package is bad.
Intel made FSR look like a joke on their first attempt.

I would sacrifice the performance crown for a better package overall.
It took years to change the mindset from Intel Core to AMD Ryzen, but it did happen.
That's why most of us have Ryzens now.
It may happen on the GPU side if Nvidia continues asking $1,000+ for midrange cards.
 
Joined
Feb 21, 2006
Messages
2,221 (0.32/day)
Location
Toronto, Ontario
It may happen on the GPU side if Nvidia continues asking $1,000+ for midrange cards.
I'm not sure about this last part. Nvidia has such a huge mindshare and following that their users don't have a problem paying $2k for a 4090. I don't have access to the sales data, but I wouldn't be surprised if the 4090 is the best seller in the whole 4000 series.

I totally agree that we need to get midrange back into the $500 range and the high end to $1,000 and under, but that cat seems to be out of the bag now and may never go back.

And with both companies shifting their focus more to AI and putting more resources into it, we may continue to see a squeeze, with pricing going up on discrete GPUs.
 
Joined
May 25, 2022
Messages
117 (0.13/day)
The difference is that Nvidia is actually a much more formidable competitor than Intel ever was, and still is.

You remember all the talk about engineer CEOs? Well, Nvidia still has an engineer as CEO, and not only that, he is the one who founded the company! That's as if Intel still had Gordon Moore, Robert Noyce, and Andy Grove, who are still considered legendary.

Nvidia's engineering IS really good. They consistently push out reticle-sized dies. Sure, they make mistakes, but over a long period of time, nowhere near as badly as AMD and Intel both have; Nvidia made a handful of mistakes while AMD and Intel stumbled like they were peg-legged. Remember, too, that Nvidia has one of the highest, if not the highest, employee satisfaction ratings. No wonder they are successful!

That's part of why AMD's GPU division is struggling and its CPU division is not.

Battlemage should in theory be a lot better, even if it places itself in the same relative position to competitors as Alchemist. They can fix the idle/low-load power consumption issue, the ReBAR issue, and the hardware quirks that come from lack of experience, such as abnormal resolution/detail scaling. ReBAR, for one, is a big thing, as it automatically rules out or discourages most older systems, which is counterintuitive considering how affordable Arc cards are. And ReBAR doesn't just affect older systems: recently there was a bug where some systems had half-working ReBAR with the Vulkan API. So random low performance in modern games might come down to missing ReBAR, which has a great impact on Arc (where it is negligible on competitors).

I know from tracking Intel GPUs for a long time that what was thought to be software/driver problems turned out to be hardware problems. No doubt such problems exist on Alchemist. In fact, even where driver bugs exist, they might be easier to fix on Battlemage and its successors.
 
Joined
Apr 14, 2018
Messages
649 (0.27/day)
I'm not sure about this last part. Nvidia has such a huge mindshare and following that their users don't have a problem paying $2k for a 4090. I don't have access to the sales data, but I wouldn't be surprised if the 4090 is the best seller in the whole 4000 series.

I totally agree that we need to get midrange back into the $500 range and the high end to $1,000 and under, but that cat seems to be out of the bag now and may never go back.

And with both companies shifting their focus more to AI and putting more resources into it, we may continue to see a squeeze, with pricing going up on discrete GPUs.

I highly doubt that, unless the caveat is all sales, including prosumer, small business, and budget commercial use. In terms of consumers, such as gamers or mixed use, the 4090 probably gets crushed in sales by the 4070/4070 Super.

Prices will never return to what used to be the relative norm; people just finance everything, from what I've seen, and are probably drowning in debt if everyone and their mother is buying a 7900/4080/4090. I've said it before, but we're continuously moving towards GPUs of any kind being a luxury, and PC gaming being largely unaffordable for the average person.
 
Joined
Jun 2, 2017
Messages
9,127 (3.34/day)
The difference is that Nvidia is actually a much more formidable competitor than Intel ever was, and still is.

You remember all the talk about engineer CEOs? Well, Nvidia still has an engineer as CEO, and not only that, he is the one who founded the company! That's as if Intel still had Gordon Moore, Robert Noyce, and Andy Grove, who are still considered legendary.

Nvidia's engineering IS really good. They consistently push out reticle-sized dies. Sure, they make mistakes, but over a long period of time, nowhere near as badly as AMD and Intel both have; Nvidia made a handful of mistakes while AMD and Intel stumbled like they were peg-legged. Remember, too, that Nvidia has one of the highest, if not the highest, employee satisfaction ratings. No wonder they are successful!

That's part of why AMD's GPU division is struggling and its CPU division is not.

Battlemage should in theory be a lot better, even if it places itself in the same relative position to competitors as Alchemist. They can fix the idle/low-load power consumption issue, the ReBAR issue, and the hardware quirks that come from lack of experience, such as abnormal resolution/detail scaling. ReBAR, for one, is a big thing, as it automatically rules out or discourages most older systems, which is counterintuitive considering how affordable Arc cards are. And ReBAR doesn't just affect older systems: recently there was a bug where some systems had half-working ReBAR with the Vulkan API. So random low performance in modern games might come down to missing ReBAR, which has a great impact on Arc (where it is negligible on competitors).

I know from tracking Intel GPUs for a long time that what was thought to be software/driver problems turned out to be hardware problems. No doubt such problems exist on Alchemist. In fact, even where driver bugs exist, they might be easier to fix on Battlemage and its successors.
AMD was pretty much in the same boat as Intel is today when they bought ATI. The first product that was truly AMD was Polaris, and that was a success. Indeed, AMD did not have enough engineers (or resources) to improve, but they took a gamble on Vega with HBM (what a joy to watercool), and then mining took over the scene and threw everything awry. Today Radeon is very competitive, and Wizzard himself references AMD's software as a reason to buy Radeon.

What Nvidia is good at is making something and making people want it, even though it might be in 1% of games. The narrative then picks it up, and it becomes a feature. I look at how Frame Gen was received and how that morphed into a good thing. The key, though, is that AMD's responses are real but get dismissed as snake oil by the community. I remember how people used to say G-Sync was much better than FreeSync because it was a hardware module. Sound familiar?
 
Joined
Feb 24, 2023
Messages
3,014 (4.73/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / R9 380 2 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / MSi G2712
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / DQ550ST [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
1. Yes, GPUs at 7900 XT performance and below don't need GDDR7. Not a big deal.
2. Yes, AMD isn't trying to compete. If we don't count Germany and a couple of other countries where AMD products sell, we're in a 99:1 NVIDIA-win situation, simply because AMD GPUs at the same price match or barely exceed the raster performance and lose in everything else.
3. Prices will stabilise; the bubble isn't going to grow forever. The most recent examples include the real estate crisis of the late '00s.
4. I disagree with "Intel made FSR look like a joke." FSR looks like a joke by itself; it required no help from the competition. I bought my GPU more than a year ago, and FSR is still in the same shape it was in when I bought this card, give or take two games where things became ever so slightly better after the introduction of FSR 2.2. FSR 3.1 would've been late to the party even on the first day of 2023; yet it's almost mid-2024, and 3.1 is absolutely nowhere to be seen.
5. "The 6900 XT is on par with Ampere" is a deranged statement. It barely outperforms the 3080 at 4K, sometimes even losing to it, while lacking any DLSS and RT performance whatsoever and being far more expensive. More VRAM doesn't mean anything if the framerate is still lower.
6. "The 4060 for $150 and the 4080 for $500." I mean, these are exactly as cut-down as it gets, and halving their MSRPs would've represented reasonable pricing (the arithmetic is sketched after this list); $220 and $620, respectively, would be completely fine.
7. We don't need to beat an NV halo GPU, but we do need a price war. The 7900 XTX is a great GPU by itself; it's just that $1,000 is beyond schizophrenic for it. $570-ish would've struck hard, leading to a much more pleasant market. Never happened, though.
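On point 6, the halving arithmetic roughly checks out against the launch MSRPs (a quick sketch; $299 and $1,199 are the 4060's and 4080's launch prices):

```python
# Launch MSRPs of the two cards in question
rtx_4060_msrp = 299
rtx_4080_msrp = 1199

print(rtx_4060_msrp / 2)  # 149.5 -- the "$150" figure quoted above
print(rtx_4080_msrp / 2)  # 599.5 -- close to the "$500" figure quoted above
```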
 
Joined
Dec 12, 2012
Messages
773 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
I disagree. High-end RDNA 4 (Navi 48) is rumoured to be around 7900 XT level in performance at best. GDDR6 is more than enough there; GDDR7 would only increase manufacturing costs, with probably no performance advantage.

I think it's safe to assume that the GPU will have a 256-bit bus, which would be a pretty significant bottleneck with regular GDDR6. You can see how limited the 7900 GRE is by its stock memory speed.

Initially, GDDR7 was supposed to launch at 32 Gbps and go up to 36, but it's actually going to start at 28 Gbps, probably to reduce costs.
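To quantify that gap, a back-of-the-envelope sketch on the assumed 256-bit bus, using common GDDR6 grades and the GDDR7 launch speeds mentioned above (same peak-bandwidth formula as earlier in the thread):

```python
# Peak bandwidth on a 256-bit bus: (bus width in bits / 8) x per-pin rate in Gbps
BUS_BITS = 256
for name, gbps in [("GDDR6 18 Gbps", 18), ("GDDR6 20 Gbps", 20),
                   ("GDDR7 28 Gbps", 28), ("GDDR7 32 Gbps", 32)]:
    print(f"{name}: {BUS_BITS / 8 * gbps:.0f} GB/s")
# 576, 640, 896 and 1024 GB/s respectively
```

Even launch-speed 28 Gbps GDDR7 would carry roughly 55% more bandwidth than 18 Gbps GDDR6 on the same bus, which is the crux of the disagreement here.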
 
Joined
Aug 13, 2010
Messages
5,472 (1.05/day)
GPUs: choking to death on video memory bandwidth limitations.
Some users: let them eat cake, I'm fine with this situation.
 
Joined
Jun 27, 2011
Messages
6,765 (1.38/day)
Processor 7800x3d
Motherboard Gigabyte B650 Auros Elite AX
Cooling Custom Water
Memory GSKILL 2x16gb 6000mhz Cas 30 with custom timings
Video Card(s) MSI RX 6750 XT MECH 2X 12G OC
Storage Adata SX8200 1tb with Windows, Samsung 990 Pro 2tb with games
Display(s) HP Omen 27q QHD 165hz
Case ThermalTake P3
Power Supply SuperFlower Leadex Titanium
Software Windows 11 64 Bit
Benchmark Scores CB23: 1811 / 19424 CB24: 1136 / 7687
I don't have access to the sales data, but I wouldn't be surprised if the 4090 is the best seller in the whole 4000 series.
It clearly is not, judging from the Steam data.
 
Joined
Jan 14, 2019
Messages
12,337 (5.77/day)
Location
Midlands, UK
You remember all the talk about engineer CEOs? Well, Nvidia still has an engineer as CEO, and not only that, he is the one who founded the company! That's as if Intel still had Gordon Moore, Robert Noyce, and Andy Grove, who are still considered legendary.
Guess what, AMD also has an engineer CEO. ;)

Congrats on the rest of your post; you couldn't have written a better Nvidia advert if you tried.

I think it's safe to assume that the GPU will have a 256-bit bus, which would be a pretty significant bottleneck with regular GDDR6. You can see how limited the 7900 GRE is by its stock memory speed.

Initially, GDDR7 was supposed to launch at 32 Gbps and go up to 36, but it's actually going to start at 28 Gbps, probably to reduce costs.
Altogether, the GRE is not a bad product. I tend to look at a GPU as the sum of its parts, not at the bottleneck that one of its parts may or may not present. It's not like you can upgrade your VRAM, after all.
 