
AMD Radeon RX 8800 XT RDNA 4 Enters Mass-production This Month: Rumor

Joined
Aug 21, 2013
Messages
1,908 (0.46/day)
It is. And it's, in my opinion, currently some $250 overpriced. That'd be how much I value Nvidia's driver support and ecosystem features: $250.
Pretty control panels don't make a quality KMD. Much less a feature complete one. Has AMD already implemented DX11 driver command lists?
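(For anyone curious, this is directly queryable: D3D11 reports whether the driver implements command lists natively or leaves the runtime to emulate them. A minimal sketch, assuming a Windows machine with the D3D11 SDK headers; nothing here is AMD- or Nvidia-specific.)

```cpp
// Minimal sketch: ask the D3D11 driver whether it implements command lists
// natively (driver-side deferred-context support) or relies on runtime
// emulation. Build on Windows with the D3D11 SDK headers.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    // Create a device on the default hardware adapter.
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, nullptr)))
        return 1;

    D3D11_FEATURE_DATA_THREADING caps = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D11_FEATURE_THREADING,
                                              &caps, sizeof(caps)))) {
        printf("Driver command lists:      %s\n",
               caps.DriverCommandLists ? "native" : "emulated by runtime");
        printf("Driver concurrent creates: %s\n",
               caps.DriverConcurrentCreates ? "yes" : "no");
    }
    device->Release();
    return 0;
}
```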
What driver support, exactly? That they support their cards for 10 years instead of AMD's 8?
Feature complete it is for AMD. Nvidia is playing catch-up with the app.
Most games released now are DX12. Why devote resources to DX11?
I expect a better control panel when the leather jacket man charges a premium for it; Nvidia claims they're a software company, after all.
AMD has their share of screw-ups, but the mindshare keeps working when reviewers are always finding reasons to criticize AMD.
Exactly. For a "software company" their consumer facing software sure sucks.
And how are we completely sure it's gonna be just $500?
Nothing is ever certain in life except death and taxes, but it makes sense. Since RDNA4's high end (meaning the $899 and $999 cards) was canned, that leaves only the $500-range 8700/8800 series.
To leave the dGPU market. Have you noticed that they haven't released a new driver since October?
That is two months ago!
One month ago. The release was at the end of October. Is there something not working that they need to release new drivers NOW?
Now, if Radeon 8000 turns out to be a flop and sales do not improve, their market share will go under 5%, maybe even 2-3%.
In which case they will no longer have money for R&D, and they will stop all dGPU projects, if there currently are any.
Ah yes. How many times have the doomsayers been predicting AMD's downfall?
I'm still waiting. When Intel entered the market, people were claiming that AMD would quickly fall to under 5%, and yet here we are years later with Intel at 0%.
Tell me, what was the last AMD GPU flop? I think it might have been either Fiji, due to its 4 GB, or the VII. That was more than six years ago.
People don't play games with a frame counter visible. While it is essential to enthusiasts, don't expect people out there to go looking for ways to secure a 60+ fps frame rate. Even 20-30 fps will look like smooth gameplay to many out there, and if you asked them, they wouldn't know what frame rate they have. Don't forget that consoles often target 30 fps, not 60.
Consoles are fast moving to 60 fps. Even Mark Cerny said that he was surprised at this development, but it makes sense. Standards have risen. Once console players got a taste of 60 fps, they were bound to reject 30 fps even if it came with better visuals. On PC the minimum acceptable frame rate has risen even higher; I now see many people saying 90 is their minimum. 144 Hz monitors are very cheap now, so it makes perfect sense.
Also, someone who has paid $2,000 would want to see that the $2,000 graphics card can be 3-4 times faster than the $1,000 model from the other company. And for that person, 60 fps will be more than enough.
A buyer who spends $2,000 on a card will not be satisfied with 60 fps even if it's 4x faster than the competition. Why would this buyer care about the competition at that point? They already bought the card. Now they want to enjoy maxed-out, high-refresh-rate gaming, not some 60 fps slog in a tech demo that's nice to look at for a while but gets boring really fast.
You tell me that they wouldn't lower prices? They will. Even in gaming, the fact that they lowered prices and released the Super models when AMD's prices got too low shows that they will react.
GDDR7 will not be cheap, and Nvidia will use it across the board on the 50 series, from only one supplier. Super models are a scam: slightly lower prices for a minuscule performance improvement, enticing people to upgrade. In reality it's one step back from a two-steps-forward situation.
But sure, you wait for your lower Blackwell prices. I reckon you'll be waiting a while...
That was literally never the case. I had a bunch of high-end cards over the last 20 years. They all struggled to play the demanding games of their time. Watch Dogs 2 was dropping to 20 frames at 1080p on my brand-new 1080 Ti. What are you talking about, man?
Yes it was. Well before this whole RT thing came about and tanked performance on even the fastest cards to near unacceptable levels...
I really don't know how you managed 20 fps on a 1080 Ti in WD2, as TPU's review shows it achieving 40+ even at 4K, and this was in 2017 when 4K was much rarer than today. Unless you ran at 8K, or at 4K with a really weak CPU, it makes little sense. https://www.techpowerup.com/review/nvidia-geforce-gtx-1080-ti/27.html
 
Joined
Jan 2, 2024
Messages
588 (1.74/day)
Location
Seattle
System Name DevKit
Processor AMD Ryzen 5 3600 ↗4.0GHz
Motherboard Asus TUF Gaming X570-Plus WiFi
Cooling Koolance CPU-300-H06, Koolance GPU-180-L06, SC800 Pump
Memory 4x16GB Ballistix 3200MT/s ↗3800
Video Card(s) PowerColor RX 580 Red Devil 8GB ↗1380MHz ↘1105mV, PowerColor RX 7900 XT Hellhound 20GB
Storage 240GB Corsair MP510, 120GB KingDian S280
Display(s) Nixeus VUE-24 (1080p144)
Case Koolance PC2-601BLW + Koolance EHX1020CUV Radiator Kit
Audio Device(s) Oculus CV-1
Power Supply Antec Earthwatts EA-750 Semi-Modular
Mouse Easterntimes Tech X-08, Zelotes C-12
Keyboard Logitech 106-key, Romoral 15-Key Macro, Royal Kludge RK84
VR HMD Oculus CV-1
Software Windows 10 Pro Workstation, VMware Workstation 16 Pro, MS SQL Server 2016, Fan Control v120, Blender
Benchmark Scores Cinebench R15: 1590cb Cinebench R20: 3530cb (7.83x451cb) CPU-Z 17.01.64: 481.2/3896.8 VRMark: 8009
Most games released now are DX12. Why devote resources to DX11?
This is the only itchy part of these cards that I don't like. When it comes to DX/Vulkan features, the cards I pick somehow end up a step behind everything else less than two months later. I've been stuck at the DX12_0 feature level while a whole bunch of AAA stuff has settled into DX12_1 and DX12_2, which seems to be the final plateau for DirectX, but who knows?

Personally, I don't really care. DX11 is still where peak gaming is at, and the majority of VR is still Unity engine under DX11. Which is great, because on an older card DX12 performance just absolutely tanks. Same with Vulkan. However, when it comes to streaming... all the new stuff made and put straight into the spotlight isn't just DX12 but DX12_2, which precludes me from joining any of them. Some games just straight up do not launch. There's a lot to filter, too.
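(If you want to check where your own card lands, the feature level is a one-call query. A minimal sketch, assuming a Windows SDK recent enough to define D3D_FEATURE_LEVEL_12_2; games that hard-require DX12_2 are effectively gating on MaxSupportedFeatureLevel here.)

```cpp
// Minimal sketch: report the highest Direct3D feature level the default GPU
// supports (12_0 / 12_1 / 12_2), which is what DX12_2-only titles gate on.
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    ID3D12Device* device = nullptr;
    // Create the device at the lowest common level, then query upward.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 __uuidof(ID3D12Device), (void**)&device)))
        return 1;

    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_2,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS query = {};
    query.NumFeatureLevels = _countof(levels);
    query.pFeatureLevelsRequested = levels;
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &query, sizeof(query))))
        printf("Highest supported feature level: 0x%X\n",
               query.MaxSupportedFeatureLevel);
    device->Release();
    return 0;
}
```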



We've had more than enough years for AMD to ship a DX12_2 card, yet we only see one as early as the 6000 series, which is extremely annoying.
I wanted to look at the heavily discounted 5700 XT as it was offered, but was immediately turned off by every stupid-ass minutia like that.
"Feature complete" is one thing under nVidia, and AMD zooms off in a whole other direction for two whole generations while cards like the 2070 sat at the finish line.
GDDR7 will not be cheap and Nvidia will use it across the board on 50 series from only one supplier.
GDDR7 seems to have its own situation with packaging. There will be improvements to it over time that make next year's purchases look like a MASSIVE early-adoption tax. It's gonna hurt.
 
Joined
Dec 25, 2020
Messages
6,835 (4.74/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Of course it works, just like it does with motherboards. Video cards should have sockets with identification (1500, 1501, 1502, 1503, etc.). Then, on some website, the owner of the card would see which GPUs the card accepts and would buy and install a new GPU on the video card, just like what is done with motherboards. The user would only need to update the video card BIOS and replace the GPU. Very easy to do...

In the video card market, AMD makes money by selling GPUs only, it does not make money by selling (the expensive) VRAM and other components of the cards. If people could replace only the GPU, video cards would become much more affordable for consumers because they would not have to pay for all the other components of the cards.

This approach has several issues you haven't thought through. Like I mentioned earlier, there's the memory type issue. Anyone buying an upgradable board would want next-generation GPU cores to be supported on it, but the memory present on the base board wouldn't have changed: same chips, same latency, same speed, same capacity. An HBM package could work around that, but it's going to be extra expensive.

Then there's profits. The need to certify and invest in hardware designs. Piracy and counterfeiting concerns. It's really not simple to release a user-upgradable graphics card.
 
Joined
Oct 24, 2022
Messages
218 (0.28/day)
I am pretty sure your idea could technically work, but it would be HELLA expensive, like incredibly so. It was already tried with phones, for example. And most of the time the problem is: money.

Like writing software support for all the possible combinations; how would drivers even work?

And for what purpose? Saving $50 per card (probably less).

And user error is also a thing: how likely is it that a user breaks the VRAM module (or whatever it is called)?

How would the cooling solution work for the card? You need mounting pressure for the block to work.

and so on and on and on.

I am sure (that's a pretty rare thing) that it would never fly; it would crash and burn.

And another keyboard engineer spoke...
 

AcE

New Member
Joined
Dec 3, 2024
Messages
23 (11.50/day)
And you're right about tea. The right amount of sugar for tea is none. I visited my parents in the States a few years back and got some "unsweet tea" at a McDonald's, then proceeded to nearly gag on it because it was so sweet.
Unsweetened tea is kinda disgusting while games without RT are still very enjoyable, my 2 cents.
One month ago. The release was at the end of October. Is there something not working that they need to release new drivers NOW?
My guess is they're working on bugfixes and the big "end of year" driver like usual; that, and the fact that nothing was really needed, is probably why they haven't released one for a month and a half now.
Tell me, what was the last AMD GPU flop? I think it might have been either Fiji, due to its 4 GB, or the VII. That was more than six years ago.
Radeon VII was at least good for workstation users; it was just a loud card, otherwise fine. Fury X was good but competed against a 980 Ti, which was too good - and then it aged kinda badly because of its mere 4 GB, but it took years for the card to "age badly", so it's a relative non-factor, especially for people who can afford $700 cards in the first place.
On PC the minimum acceptable frame rate has risen even higher; I now see many people saying 90 is their minimum.
Nah, 60 FPS will always be the bottom-line "good" standard because it's easily fluid enough. Enthusiasts will always have higher standards, and they always have: 100 FPS + 100 Hz in CRT times, now it's a minimum of 144 FPS + 144 Hz, and ultra esports players want at least 240 Hz and 400-500 FPS.
and DX12_2, which seems to be the final plateau for DirectX, but who knows?
For DX12 yes, but Microsoft is probably working on DX13 with the GPU vendors right now. Question is if they will really name it "DX13" though.
All the new stuff made and put straight into the spotlight isn't just DX12 but DX12_2, which precludes me from joining any of them.
That's sad, because Alan Wake 2 is a great game and only runs on RX 6000 and higher. Technically the 5700 XT had something comparable to a mesh shader, but it was not integrated properly and thus cannot be used that way for AW2. It still runs faster than a GTX 10 series card, at least.
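(This is the check in practice: D3D12 exposes mesh shader support as a tier in OPTIONS7, and a game can refuse to start if it comes back unsupported. A short sketch; `device` is assumed to be an already-created ID3D12Device*, and you need an SDK recent enough to define D3D12_FEATURE_D3D12_OPTIONS7.)

```cpp
// Sketch: how a title can test for mesh shader support at startup.
// Assumes `device` is an already-created ID3D12Device*.
#include <d3d12.h>

bool SupportsMeshShaders(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &opts7, sizeof(opts7))))
        return false;  // Older runtime/driver: OPTIONS7 isn't recognized at all.
    // RDNA 1 (e.g. the 5700 XT) reports TIER_NOT_SUPPORTED here despite its
    // primitive-shader hardware, which is why AW2's mesh shader path skips it.
    return opts7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
}
```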
I wanted to look at the heavily discounted 5700 XT as it was offered, but was immediately turned off by every stupid-ass minutia like that.
Yep, the card is too outdated now because it lacks DX12_2. Unless you don't care about AAA games, in which case it never mattered.
 
Joined
Dec 5, 2020
Messages
7 (0.00/day)
Transformative RT, as big parts of the industry love to talk about it, is a pipe dream. Right now RT has been used in three scenarios.

1. To step in where rasterisation tricks clearly fail -> building reflections in both Spider-Man games -> works nicely, the performance-to-visual cost is OK, and it actually runs not too shabbily on AMD hardware.

2. To give a game with a "cartoony" visual identity a much more well-rounded, grounded, consistent look by using some form of cheap GI -> Tiny Glade -> a small but noticeable visual improvement, relatively cheap to run on both AMD and Nvidia.

3. "transformative RT" To improve immersion and the overall lighting realism -> Cyberpunk 2077 or Alan Wake 2 -> really expensive to run for all GPU, even harder on AMD.

The problem with number 3, the poster child of RT, is that to get a good result, RT alone is useless. You also need:

1. Great, detailed textures (adds VRAM usage and development cost)

2. Good high-poly models (adds RT performance cost and also development cost)

3. Great animation (big development cost)

And last but not least, the most important one, because without it even path tracing looks bad:

4. Spot-on material properties. Without them, even the best lighting system will give you unrealistic results if objects with different materials don't behave correctly with light (a big increase in development cost, and it requires advanced knowledge that most studios today lack to varying degrees).

With the video game industry already having a huge cost problem, can someone explain to me how that transformative RT future will come to reality?

A much more probable future (not for now, obviously) is: render a frame in raster at "good enough and cheap enough" quality, then ask an AI model to make it look realistic in real time... That is the only way you truly save on development time and cost.
Yeah, I'm on board with this line of thinking.

To me, RT and path tracing will always be unobtainable, just out of arm's reach for most people. I use path tracing for my art (3D fractals), and it's pretty easy to crank up settings and slow down renders. One path-traced 4K image is nowhere close to real time, so there will always be Nvidia, or whoever is making hardware, releasing something like Half-Life 2 RTX that will only run at 1440p60 on the highest end using upscaling and frame gen. Maybe I'm using some hyperbole there, but not much. I remember Quake 2 RTX being too much for an RTX 3080 to run maxed out at 1440p.
 
Joined
Dec 25, 2020
Messages
6,835 (4.74/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Yeah, I'm on board with this line of thinking.

To me, RT and path tracing will always be unobtainable, just out of arm's reach for most people. I use path tracing for my art (3D fractals), and it's pretty easy to crank up settings and slow down renders. One path-traced 4K image is nowhere close to real time, so there will always be Nvidia, or whoever is making hardware, releasing something like Half-Life 2 RTX that will only run at 1440p60 on the highest end using upscaling and frame gen. Maybe I'm using some hyperbole there, but not much. I remember Quake 2 RTX being too much for an RTX 3080 to run maxed out at 1440p.

Every new technology is at first "unobtainable". There's a post a few pages back that I've been thinking about ever since I read it; it inadvertently managed to abridge, in a completely neutral manner, the whole debate regarding RT and newer graphics techniques currently considered too "heavy" to be feasible:

ray tracing is the new pixel shader, anyone who opposes to that is silly

It brought me back to the days I didn't have a Pixel Shader 3.0/DirectX 9.0c compatible graphics card. And then to the days I had a DirectX 10 graphics card. And of course, more recently, to my experiences with Kepler, which is an 11_0 level architecture that was maintained until relatively recently. It aged like milk.

It's understandable for consumers not to always have bleeding-edge hardware. And PCs have been lasting a lot longer than they used to. But we're closing in on, what, 6 years since the first RT-capable cards released (the RTX 20 series)? Market penetration is slowly improving, and was probably hampered by a poor global economy and rising manufacturing prices that slowed adoption considerably - that, and PCs have largely gotten "good enough". The pixel shader analogy works well, even though I think it will be a relatively long time until most games actually require RT-capable hardware to even boot. It'll probably start happening once support for Pascal and RDNA 1 is eliminated and game developers start demanding RTX 20-series or RX 6000-series hardware to run games. But even then, most games should still at least boot and run, if slowly or with glitches.

Quake 2 RTX and *especially* Portal RTX were largely tech demos intentionally made to run poorly on Ampere. I wrote a scathing review of it on Steam at the time. I was so angry at it that I actually pitched the 7900 XTX as reasonable - if only I knew how badly that post was about to age. Almost kino, given my reputation around here. They should be disregarded, in my opinion, as a benchmark of how this technology works. The same will likely apply to the newly announced Half-Life 2 RTX - I guarantee you that it's going to run like absolute garbage on Ada cards.

I just hope that my "at least it's a reasonable product" didn't get anyone VAC banned a few months down the road. :shadedshu:
 
Joined
Jun 28, 2019
Messages
104 (0.05/day)
As I've been thinking for a year now: raster ~ 7900 XTX, ray tracing ~ 4080 on average, and therefore no more bottleneck in RT-gimped games.
Give it to us for $500/600 and you'll sell a ton of them.
 
Joined
Sep 19, 2014
Messages
54 (0.01/day)
I just use my 800+ game library. Nvidia did something to me decades ago and I have not looked back since. You again are using your opinion of RT to make AMD seem bad because their RT support is not as strong. I don't subscribe to the 90% market share figure either, when 4090s are being bought up by China. If you saw what happened to Total War: Three Kingdoms when it launched on Steam, you would understand how that could affect market share.
Ooh, I hope they didn't do something very wrong to the GPU by doing that, sharp edges and everything, you know.

Nvidia features and driver support are just the best of the best; using Nvidia we can safely play all early access games and other non-benchmark games without issues.

As I've been thinking for a year now: raster ~ 7900 XTX, ray tracing ~ 4080 on average, and therefore no more bottleneck in RT-gimped games.
Give it to us for $500/600 and you'll sell a ton of them.
Looking at the 8800 XT series specs,
it's more like 4070S performance +0-5%,
maybe $499.

Folks are thinking too much about nothing right now.

Last I remember hearing was AMD was looking to stay out of the enthusiast end of GPUs and focus on bringing 4090 type performance in the $500 range. Rumors are just that, rumors. Hopefully though, they can hit that mark with their 8800XT. I won't hold my breath waiting, but it's nice to think about.
Not even close to a 4090 with those specs.

The 8800 XT is 4070 Super +5% max.
 
Joined
Nov 13, 2024
Messages
50 (2.27/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5800X
Motherboard ASRock B550 Phantom Gaming 4
Cooling be quiet! Pure Rock 2
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
Looking at the 8800 XT series specs,
it's more like 4070S performance +0-5%,
maybe $499.
Wasn't the 8800 XT's rumored performance somewhere between the 7900 XT and 7900 XTX?

And the 7900 XT is around 18% stronger than the 4070S (relative performance at 4K and 1440p).

Not even close to a 4090 with those specs.

The 8800 XT is 4070 Super +5% max.
Yeah, I don't think there is any way it's as strong as a 4090 AND costs $500-600 (but I can dream about it....)

If it hits 4080S performance for $500, I am happy.
 
Joined
Jan 27, 2024
Messages
227 (0.73/day)
Processor AMD
Motherboard AMD chipset
Cooling Cool
Memory Fast
Video Card(s) AMD/ATi Radeon | Matrox Ultra high quality
Storage Lexar
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Deepcool Gold 750W
Mouse Yes
Keyboard Yes
VR HMD No
Software Windows 10
Benchmark Scores Yes
Wasn't the 8800 XT's rumored performance somewhere between the 7900 XT and 7900 XTX?

Where are the actual samples' performance and benchmark leaks?

 
Joined
Nov 13, 2024
Messages
50 (2.27/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5800X
Motherboard ASRock B550 Phantom Gaming 4
Cooling be quiet! Pure Rock 2
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
Joined
Jun 28, 2019
Messages
104 (0.05/day)
Looking at the 8800 XT series specs,
it's more like 4070S performance +0-5%,
maybe $499.


Do you really think that AMD, after 2 years, would release a card that does +10% over the 7800 XT for $500, when the 7800 XT is already only about equal to the 6800 XT?
It would be DOA!

Rumors and the power target (250~270 W) say that N48 replaces N31 in terms of cost, therefore similar performance in raster but double or more the RT of N32 (2 RAs per CU and BVH traversal in hardware); like a 4080/5070 Ti all around, from AMD.
 
  • Like
Reactions: AcE
Joined
Jan 27, 2024
Messages
227 (0.73/day)
Processor AMD
Motherboard AMD chipset
Cooling Cool
Memory Fast
Video Card(s) AMD/ATi Radeon | Matrox Ultra high quality
Storage Lexar
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Deepcool Gold 750W
Mouse Yes
Keyboard Yes
VR HMD No
Software Windows 10
Benchmark Scores Yes
That's what I am asking... Maybe there was something new (which is highly likely) that I haven't heard/seen.

As far as I know, an RDNA 4 shader is as fast as an RDNA 3 shader, which means zero per-shader performance improvement in raster, and some tweaks for the (for now irrelevant) ray tracing.

4096 RDNA 4 shaders will be around RTX 4070S performance.



 
Joined
Nov 13, 2024
Messages
50 (2.27/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5800X
Motherboard ASRock B550 Phantom Gaming 4
Cooling be quiet! Pure Rock 2
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
As far as I know, an RDNA 4 shader is as fast as an RDNA 3 shader, which means zero per-shader performance improvement in raster, and some tweaks for the (for now irrelevant) ray tracing.

4096 RDNA 4 shaders will be around RTX 4070S performance.


Well, fine.

But in the same article it says:

Performance around a 7900 XT, which is on average 18% stronger than the 4070S in relative performance at 1440p.
 
Joined
Jan 27, 2024
Messages
227 (0.73/day)
Processor AMD
Motherboard AMD chipset
Cooling Cool
Memory Fast
Video Card(s) AMD/ATi Radeon | Matrox Ultra high quality
Storage Lexar
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Deepcool Gold 750W
Mouse Yes
Keyboard Yes
VR HMD No
Software Windows 10
Benchmark Scores Yes
Well, fine.

But in the same article it says:

Performance around a 7900 XT, which is on average 18% stronger than the 4070S in relative performance at 1440p.

That is the fake news part of the article :banghead: Don't trust it.
The RX 7900 XT has 20 GB of VRAM, 5,376 shaders, 336 TMUs, 192 ROPs, and 800 GB/s of memory throughput; you simply won't reach its performance with that mediocre Navi 48.
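(The 800 GB/s figure at least is easy to verify from the 7900 XT's 320-bit bus and 20 Gbps GDDR6; bandwidth is just bus width times per-pin data rate:)

$$\text{bandwidth} = \frac{320\ \text{bit} \times 20\ \text{Gbit/s}}{8\ \text{bit/byte}} = 800\ \text{GB/s}$$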
 
Joined
Nov 13, 2024
Messages
50 (2.27/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5800X
Motherboard ASRock B550 Phantom Gaming 4
Cooling be quiet! Pure Rock 2
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
That is the fake news part of the article :banghead: Don't trust it.
Huh? Why use that article in the first place, then...?

It also says a similar thing in the upper part of the article.


Seems like selective bias to me... (at least in my opinion)
 
Joined
Jan 27, 2024
Messages
227 (0.73/day)
Processor AMD
Motherboard AMD chipset
Cooling Cool
Memory Fast
Video Card(s) AMD/ATi Radeon | Matrox Ultra high quality
Storage Lexar
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Deepcool Gold 750W
Mouse Yes
Keyboard Yes
VR HMD No
Software Windows 10
Benchmark Scores Yes
It's written, please be careful and don't fall for the hype.

Bearing firmly in mind possible issues around the translation from Chinese, and the fact that this is just a rumor anyway, we’re told that RDNA 4 is just a “bug fix” and that these graphics cards are “similar to RDNA 3.”



For now they behave identically to GFX11, which means RDNA 4 is equal to RDNA 3 performance-wise!
 
Joined
Nov 26, 2021
Messages
1,670 (1.51/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
It's written, please be careful and don't fall for the hype.
Similar to RDNA 3 might just mean the ISA, which, at least for RDNA 3.5, is similar. That doesn't say anything about clock speeds. Recall that the 7800 XT has rather low clock speeds; in fact, it runs at a lower frequency than the RX 7600, and its bigger siblings clock about 8% to 11% higher than it in TPU's test suite. N4 will allow slightly higher clock speeds, and that, combined with the slight increase in compute units, should be enough for at least a 20% increase over the 7800 XT.
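As a rough sanity check, treating the rumored Navi 48 figures as assumptions (64 CUs, i.e. 4096 shaders, versus the 7800 XT's 60 CUs, and roughly 2.9 GHz boost versus the 7800 XT's ~2.43 GHz), ideal scaling would give:

$$\frac{64\ \text{CU}}{60\ \text{CU}} \times \frac{2.9\ \text{GHz}}{2.43\ \text{GHz}} \approx 1.07 \times 1.19 \approx 1.27$$

That's about +27% before real-world scaling losses, so a 20%+ uplift over the 7800 XT is at least arithmetically plausible.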
 
  • Like
Reactions: AcE
Joined
Nov 13, 2024
Messages
50 (2.27/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5800X
Motherboard ASRock B550 Phantom Gaming 4
Cooling be quiet! Pure Rock 2
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
It's written, please be careful and don't fall for the hype.
Okay, I can understand that, but aren't they (the dies) fundamentally different?

Chiplet -> monolithic.

What they are talking about is the architecture, not the GPU itself.

My knowledge of GPUs is heavily limited, so I WILL make mistakes.

But you are telling me:

The article you sent me, which says RDNA 4 is similar to RDNA 3 (just a bug fix), is by a Chinese leaker.

This same Chinese leaker also said to expect performance in the ballpark of a 7900 XT.

Trust the leaker on the first part, but not on the second? But why? Because you said so?

Maybe my reading comprehension is just lacking...
 

AcE

New Member
Joined
Dec 3, 2024
Messages
23 (11.50/day)
The 8800 XT (or whatever it will be called) will be at least 20% faster than the 7800 XT; anyone who says otherwise is deluding themselves. This was over 2 years in the works. Performance is very likely to land at 7900 XT-XTX levels. As for RT performance, I honestly won't hold my breath because AMD never made big steps there, but if they are now switching to a proper RT core like Nvidia's, the performance jump will be huge; the question is just "how big".

Intel already proved that by using a proper full RT core, and not just RT "helper units", even smaller companies are able to make strides; their RT performance wasn't half bad, and better than anyone expected back then.
 
Joined
Sep 19, 2014
Messages
54 (0.01/day)
Do you really think that AMD, after 2 years, would release a card that does +10% over the 7800 XT for $500, when the 7800 XT is already only about equal to the 6800 XT?
It would be DOA!

Rumors and the power target (250~270 W) say that N48 replaces N31 in terms of cost, therefore similar performance in raster but double or more the RT of N32 (2 RAs per CU and BVH traversal in hardware); like a 4080/5070 Ti all around, from AMD.
The 6800 XT is 5% slower than the 7800 XT.

The 8700 XT will be 10-15% faster than the 7800 XT,

and looking at the specs it will be around there.
Maybe the price can be $399-449, not $500.
 
Joined
Jan 20, 2019
Messages
1,567 (0.73/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
The RX 8800 XT will be the fastest product from AMD's next-generation, and will be part of the performance segment

Is that 100% confirmed, or will the X800 segment have a couple of additional SKUs thrown into the line-up, e.g. 8800 XTX / 8850 XT/XTX?

It's a damn shame AMD ditched the 8900s this generation - I had finally warmed up to AMD cards with the 7000 series, and just when the temptation was strong, AMD decided to close the door. :banghead:

There are some spicy claims related to the RX 8800 XT being made. Apparently, the card will rival the current GeForce RTX 4080 or RTX 4080 SUPER in ray tracing performance

Forget ray tracing for a moment: if raster falls substantially short of a 7900 XTX/4080 SUPER, I'd be pissed! This is the bar of performance I'm looking to achieve with my next GPU upgrade without shooting north of £800. The 7900 XTX nicely fits the bill, but power consumption is a put-off. Looking forward to seeing how the 8800 XT vs. Nvidia's 50-series mid-tier plays out (with a 16 GB precondition).

£800 is a boatload of cash for a gaming GPU; if I'm not getting my money's worth (my way), I ain't upgrading to the new and fancy.
 

AcE

New Member
Joined
Dec 3, 2024
Messages
23 (11.50/day)
Forget ray tracing for a moment: if raster falls substantially short of a 7900 XTX/4080 SUPER, I'd be pissed!
No, the big question mark is RT performance, not raster; raster is expected at about 7900 XT level minimum, which is enough.
£800 is a boatload of cash for a gaming GPU; if I'm not getting my money's worth (my way), I ain't upgrading to the new and fancy.
It's rumoured to be around $500-600; $800 is way off, you can buy something better for that.
 