
AMD Big Navi GPU Features Infinity Cache?

AleksandarK

News Editor
Staff member
Joined
Aug 19, 2017
Messages
2,591 (0.97/day)
As we near the launch of AMD's highly hyped, next-generation RDNA 2 GPU codenamed "Big Navi", more details are emerging and crawling their way to us. We have already seen rumors suggesting that this card will supposedly be called AMD Radeon RX 6900 and that it will be AMD's top offering. Using a 256-bit bus with 16 GB of GDDR6 memory, the GPU will not use any type of HBM memory, which has historically been rather pricey. Instead, it looks like AMD will compensate for the smaller bus with a new technology it has developed. Thanks to new findings on the Justia Trademarks website by @momomo_us, we have information about the alleged "Infinity Cache" technology the new GPU uses.

VideoCardz reports that the internal name for this technology is not Infinity Cache; however, it seems that AMD could have changed it recently. What exactly does it do, you might wonder? Well, it is a bit of a mystery for now. It could be a new cache technology that allows L1 GPU cache sharing across the cores, or some connection between the caches found across the whole GPU. This information should be taken with a grain of salt, as we have yet to see what this technology does and how it works, when AMD announces its new GPU on October 28th.



 
Joined
May 13, 2015
Messages
632 (0.18/day)
Processor AMD Ryzen 3800X / AMD 8350
Motherboard ASRock X570 Phantom Gaming X / Gigabyte 990FXA-UD5 Revision 3.0
Cooling Stock / Corsair H100
Memory 32GB / 24GB
Video Card(s) Sapphire RX 6800 / AMD Radeon 290X (Toggling until 6950XT)
Storage C:\ 1TB SSD, D:\ RAID-1 1TB SSD, 2x4TB-RAID-1
Display(s) Samsung U32E850R
Case be quiet! Dark Base Pro 900 Black rev. 2 / Fractal Design
Audio Device(s) Creative Sound Blaster X-Fi
Power Supply EVGA Supernova 1300G2 / EVGA Supernova 850G+
Mouse Logitech M-U0007
Keyboard Logitech G110 / Logitech G110
I've been stuck on a 290X for a few years now and I can't wait to get the 6900XT or, if they make the liquid-cooled version, the 6900XTX. Now that AMD has beaten back the anti-capitalist crony Intel and made enough money to really push R&D:
  • The drivers are rumored to be solid for this release.
  • There will actually be stock because unlike Nvidia they're not trying to artificially drive up prices.
  • It's not going to be a watt-sucking heat-producing beast.
  • I'll finally stop running out of video memory (browsers use GPU memory).
 
Joined
Jun 24, 2020
Messages
93 (0.06/day)
1 GB cache = going from 512-bit to 128-bit bus bandwidth?

Wow.

How about 6 GB of cache — then we might not need a bus at all.
 
Joined
Sep 17, 2014
Messages
22,452 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Good comedy, this

Fans desperately searching for some argument to say 256 bit GDDR6 will do anything more than hopefully get even with a 2080ti.

History repeats.

Bandwidth is bandwidth and cache is not new. Also... elephant in the room.... Nvidia needed expanded L2 Cache since Turing to cater for their new shader setup with RT/tensor in them...yeah, I really wonder what magic Navi is going to have with a similar change in cache sizes... surely they won't copy over what Nvidia has done before them like they always have right?! Surely this isn't history repeating, right? Right?!

:lovetpu:



I've been stuck on a 290X for a few years now and I can't wait to get the 6900XT or, if they make the liquid-cooled version, the 6900XTX. Now that AMD has beaten back the anti-capitalist crony Intel and made enough money to really push R&D:
  • The drivers are rumored to be solid for this release.
  • There will actually be stock because unlike Nvidia they're not trying to artificially drive up prices.
  • It's not going to be a watt-sucking heat-producing beast.
  • I'll finally stop running out of video memory (browsers use GPU memory).

Let's revisit those assumptions post launch ;) That'll be fun, too. I'll take a bet... drivers will need hotfixing, which will likely come pretty late or create new issues along the way (note: Nvidia has fallen prey to this just as well, which alone should say enough); things will be out of stock shortly after launch, it's going to suck an easy 250-300W just as well, and yes, you do have 16GB on the top model.

If I'm wrong, I'll buy it :p
 
Joined
Apr 29, 2018
Messages
129 (0.05/day)
Good comedy, this

Fans desperately searching for some argument to say 256 bit GDDR6 will do anything more than hopefully get even with a 2080ti.

History repeats.

Bandwidth is bandwidth and cache is not new. Also... elephant in the room.... Nvidia needed expanded L2 Cache since Turing to cater for their new shader setup with RT/tensor in them...yeah, I really wonder what magic Navi is going to have with a similar change in cache sizes... surely they won't copy over what Nvidia has done before them like they always have right?! Surely this isn't history repeating, right? Right?!

:lovetpu:




Let's revisit those assumptions post launch ;) That'll be fun, too. I'll take a bet... drivers will need hotfixing, which will likely come pretty late or create new issues along the way (note: Nvidia has fallen prey to this just as well, which alone should say enough); things will be out of stock shortly after launch, it's going to suck an easy 250-300W just as well, and yes, you do have 16GB on the top model.

If I'm wrong, I'll buy it :p
You have to be a special kind of stupid to think their top card will only match the 2080 Ti, considering the 2080 Ti is 50% faster than the 5700 XT. It does not take a genius to realize that doubling the cores of the 5700 XT, increasing IPC, and running higher clocks would result in a MUCH higher gain than 50%. FFS, even the Xbox Series X has a GPU as fast as or faster than the 2080 Super, and the 6900 XT will be a hell of a lot bigger GPU.
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,579 (2.86/day)
Location
Piteå
System Name White DJ in Detroit
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
You have to be a special kind of stupid to think their top card will only match the 2080 Ti, considering the 2080 Ti is 50% faster than the 5700 XT. It does not take a genius to realize that doubling the cores of the 5700 XT, increasing IPC, and running higher clocks would result in a MUCH higher gain than 50%. FFS, even the Xbox Series X has a GPU as fast as or faster than the 2080 Super, and the 6900 XT will be a hell of a lot bigger GPU.

It's less about being stupid and more about managing expectations. High tier AMD cards have burned people in the past because they expected too much. The only sensible thing to do is to wait for reviews.
 
Joined
Sep 6, 2013
Messages
3,333 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
I don't think cache can replace bandwidth, especially when games ask for more and more VRAM. I might be looking at it the wrong way and the next example could be wrong, but hybrid HDDs NEVER performed like real SSDs.

I am keeping my expectations really low after reading about that 256bit data bus.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Regardless of the veracity of this, there is definitely something weird about the rumored specifications for these GPUs. 256-bit and 192-bit bus widths for a high-end GPU in 2020 with no new tricks to counteract this would be a significant bottleneck. And AMD obviously knows this. They do, after all, design GPUs for a living. They have the resources to, say, make a 512-bit test chip + PCB and benchmark it with varying numbers of memory controllers enabled, identifying when and how bottlenecks appear. And while 512-bit buses aren't really commercially viable (huge, hot, expensive, and at that point HBM is a better alternative at likely the same price), 384-bit buses are. So if they've chosen to go 256-bit for their highest end GPU, there has to be some reason for it.
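The raw numbers behind that trade-off are easy to sketch. A rough peak-bandwidth calculation, assuming 16 Gbps GDDR6 pins (an assumption — the actual memory speed for these cards is unconfirmed):

```python
# Rough peak-bandwidth arithmetic for the rumored bus widths.
# Assumes 16 Gbps GDDR6 pins; real clocks for these cards are unconfirmed.

def peak_bandwidth_gbs(bus_width_bits: int, pin_speed_gbps: float = 16.0) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate / 8 bits per byte."""
    return bus_width_bits * pin_speed_gbps / 8

for bus in (512, 384, 256, 192):
    print(f"{bus}-bit @ 16 Gbps -> {peak_bandwidth_gbs(bus):.0f} GB/s")
# 256-bit lands at 512 GB/s, versus 768 GB/s for a 384-bit design.
```

Whatever "Infinity Cache" is, it would have to make up roughly that 256 GB/s gap for the 256-bit part to compete at the high end.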
 
Joined
Nov 11, 2016
Messages
3,415 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
You have to be a special kind of stupid to think their top card will only match the 2080 Ti, considering the 2080 Ti is 50% faster than the 5700 XT. It does not take a genius to realize that doubling the cores of the 5700 XT, increasing IPC, and running higher clocks would result in a MUCH higher gain than 50%. FFS, even the Xbox Series X has a GPU as fast as or faster than the 2080 Super, and the 6900 XT will be a hell of a lot bigger GPU.

Let's say the 6900XT is 20-30% faster than the 2080 Ti in "specific" rasterization workloads that don't require massive bandwidth, but slower than the 2080 Ti in ray tracing workloads. Does that make the 6900XT the faster GPU?
"But you don't need ray tracing" is not an excuse for a >500 USD GPU.
Before you say there are other API alternatives for ray tracing: not having dedicated RT cores will just hammer performance. Just look at Crysis Remastered as an example (the game can leverage the RT cores).

 
Joined
Jan 8, 2017
Messages
9,436 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Fans desperately searching for some argument to say 256 bit GDDR6 will do anything more than hopefully get even with a 2080ti.

I've noticed you've been quite dead set on saying some pretty inflammatory and, to be honest, quite stupid things as of late. What's the matter?

A 2080ti has 134% the performance of a 5700XT. The new flagship is said to have twice the shaders, likely higher clock speeds and improved IPC. Only a pretty avid fanboy of a certain color would think that such a GPU could only muster some 30% higher performance with all that. GPUs scale very well, you can expect it to be between 170-190% the performance of a 5700XT.

Bandwidth is bandwidth and cache is not new.

Caches aren't new; caches as big as the ones rumored are. I should also point out that bandwidth and the memory hierarchy are completely hidden away from the GPU cores. In other words, whether it's reading at 100 GB/s from DRAM or at 1 TB/s from a cache, a GPU core doesn't care; as far as it's concerned, it's just operating on memory at an address.

Rendering is also an iterative process where you need to go over the same data many times a second; if you can keep, for example, megabytes of vertex data in some fast memory close to the cores, that's a massive win.

GPUs hide memory bottlenecks very well by scheduling hundreds of threads. Another thing you might have missed is that, over time, the ratio of DRAM GB/s per GPU core has been getting lower and lower. And somehow performance keeps increasing; how the hell does that work if "bandwidth is bandwidth"?

Clearly, there are ways of increasing the efficiency of these GPUs such that they need less DRAM bandwidth to achieve the same performance, and this is another one of those ways. By your logic, we would have needed GPUs with tens of TB/s by now, because otherwise performance couldn't have gone up.
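A toy model of that argument: a large last-level cache filters requests before they hit DRAM, so a narrower bus can feed the same shader array. The hit rates and the 900 GB/s demand figure below are made up for illustration — no real numbers for the rumored cache exist yet:

```python
# Toy model: a cache cuts DRAM traffic in proportion to its hit rate,
# letting a 256-bit bus (~512 GB/s at 16 Gbps GDDR6, assumed) satisfy a
# shader array that would otherwise need a much wider bus.

def dram_traffic_gbs(requested_gbs: float, hit_rate: float) -> float:
    """DRAM bandwidth actually consumed once cache hits are filtered out."""
    return requested_gbs * (1.0 - hit_rate)

requested = 900.0  # hypothetical GB/s the shader array wants to stream
for hit in (0.0, 0.3, 0.5):
    need = dram_traffic_gbs(requested, hit)
    verdict = "fits" if need <= 512.0 else "exceeds"
    print(f"hit rate {hit:.0%}: needs {need:.0f} GB/s from DRAM ({verdict} 512 GB/s)")
```

With a 50% hit rate the DRAM-side demand halves, which is the sense in which a cache "replaces" bus width.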

  • There will actually be stock because unlike Nvidia they're not trying to artificially drive up prices.

They won't have much stock; most wafers are going to consoles.

  • It's not going to be a watt-sucking heat-producing beast.

While performance/watt must have increased massively, perhaps even over Ampere, the highest end card will still be north of 250W.
 
Joined
Mar 13, 2012
Messages
278 (0.06/day)
I don't think cache can replace bandwidth, especially when games ask for more and more VRAM. I might be looking at it the wrong way and the next example could be wrong, but hybrid HDDs NEVER performed like real SSDs.

I am keeping my expectations really low after reading about that 256bit data bus.

Why do you think we have cache in CPU, GPU and SSD + more.

Because it works, and it does replace bandwidth: information that the GPU uses repeatedly is stored in and fetched from cache, and thus does not have to travel over the memory bus each time. The memory bandwidth saved by using cache can instead be used for other data. So a 256-bit bus with a large, very effective cache equals MORE effective MEMORY BANDWIDTH. Nvidia already uses this approach on all their cards.
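The "crosses the bus only once" point can be illustrated with a minimal LRU cache sketch (a simplification — real GPU caches track lines and sectors, not single addresses):

```python
from collections import OrderedDict

# Minimal LRU cache: data touched repeatedly is served from the cache,
# so it is fetched over the memory bus only once.

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.lines = OrderedDict()
        self.bus_fetches = 0  # how many reads actually reached DRAM

    def read(self, addr: int) -> None:
        if addr in self.lines:
            self.lines.move_to_end(addr)        # hit: no bus traffic
        else:
            self.bus_fetches += 1               # miss: fetch over the bus
            self.lines[addr] = True
            if len(self.lines) > self.capacity:
                self.lines.popitem(last=False)  # evict least recently used

cache = LRUCache(capacity=4)
for addr in [0, 1, 2, 3, 0, 1, 2, 3, 0, 1]:  # working set fits in the cache
    cache.read(addr)
print(cache.bus_fetches)  # prints 4: only the first pass touched DRAM
```

Ten reads, four bus transfers — as long as the working set fits, every repeat access is bandwidth the bus never has to carry.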
 
Joined
Feb 3, 2017
Messages
3,756 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
A 2080ti has 134% the performance of a 5700XT.
At 1080p. At 1440p it's 142%, and at 2160p it's 152%.
More notably though, 3080 is twice as fast.
 
Joined
Jan 8, 2017
Messages
9,436 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
At 1080p. At 1440p it's 142%, and at 2160p it's 152%.

You're probably right. I went off the comparison tool thingy when you browse different GPUs; that one says the 2080 Ti is 134% the performance of a 5700 XT.

Based on TPU review data: "Performance Summary" at 1920x1080, 4K for 2080 Ti and faster.
 
Joined
Feb 11, 2009
Messages
5,555 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
It always pains me to see people overhyping products; it can pretty much only lead to disappointment.
That said, let's not forget this GPU was pretty much made with the help of Sony and Microsoft, because their consoles use RDNA2. That is a lot of (smart) people working on a product, so I do have faith that it will be good.

And personally I care little for "beating" Nvidia in "performance".
If it delivers good frames, while going ez on the power consumption and while costing, finally again, a reasonable amount of money and not the obscene prices being asked as of late, it's a winner in my book.

Heck I would REALLY love it if we had a new RX460/470/480 moment, where all games could be lifted up, where everyone could upgrade and get with the times.

This would also be really good for the evolution/implementation of Ray Tracing, the industry can only really make use of that if the world can use it.
 
Joined
Jun 27, 2019
Messages
2,109 (1.06/day)
Location
Hungary
System Name I don't name my systems.
Processor i5-12600KF 'stock power limits/-115mV undervolt+contact frame'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/@950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
And personally I care little for "beating" Nvidia in "performance".
If it delivers good frames, while going ez on the power consumption and while costing, finally again, a reasonable amount of money and not the obscene prices being asked as of late, it's a winner in my book.

Heck I would REALLY love it if we had a new RX460/470/480 moment, where all games could be lifted up, where everyone could upgrade and get with the times.

This would also be really good for the evolution/implementation of Ray Tracing, the industry can only really make use of that if the world can use it.

Yup, this is what I would also love to see and what I mainly care about when upgrading.

Those RX cards were a godsend for me, it was a solid upgrade from my previous card w/o breaking the bank/my wallet.

Looking at the prices lately, most likely my only option will be the second hand market again if I want the same performance uplift as last time. 'went from a GTX 950 to RX 570'
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.10/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
For the sake of comparison, the RTX 2080 Ti has exactly twice as many shaders as the RTX 2060 Super with very similar real-world clocks, and performs about 63.3% better at 4K according to TPU's average framerate in 20+ games.
Based on Xbox Series X performance scaling over the X1X, it doesn't seem like RDNA2 has much in the way of IPC improvements over RDNA.
So with similar clocks I expect the top-end 80CU RDNA2 to be 55-65% faster than the 5700XT, depending on the resolution (assuming there is no bandwidth bottleneck).
But as we all know, RDNA2 will have noticeably higher clocks than RDNA1. I expect the average clocks of the 80CU part to be in the 2-2.1GHz range, which is a decent 10-13% above the 5700XT. Assuming semi-linear scaling, this clock boost alone will put RDNA2 10-13% above RDNA1; now, with the addition of that massive shader count increase, it's probably reasonable to expect the top-end RDNA2 to be 75-85% faster than the 5700XT, as Vya Domus predicted.

Expecting flagship RDNA2 to be only as fast as a 3070/2080Ti is not realistic, as it will probably beat them both comfortably.
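The estimate above can be reduced to back-of-the-envelope arithmetic. All inputs are the post's own assumptions (80 vs 40 CUs, roughly 2.05 GHz vs 1.85 GHz average clocks, and the ~63% gain Turing showed from doubling shaders), not confirmed specifications:

```python
# Back-of-the-envelope GPU scaling estimate. Inputs are assumptions from
# the discussion above, not confirmed specs for any RDNA2 part.

def scaling_estimate(cu_ratio: float, clock_ratio: float,
                     shader_efficiency: float) -> float:
    """Relative performance vs. the baseline card.

    cu_ratio: shader count multiplier (2.0 = double the CUs)
    clock_ratio: average clock multiplier
    shader_efficiency: fraction of the extra shaders that turns into
        performance (0.633 = the 2080 Ti vs 2060 Super data point)
    """
    return (1 + (cu_ratio - 1) * shader_efficiency) * clock_ratio

est = scaling_estimate(cu_ratio=2.0, clock_ratio=2.05 / 1.85,
                       shader_efficiency=0.633)
print(f"~{(est - 1) * 100:.0f}% faster than the 5700 XT")  # lands around +81%
```

The result falls inside the 75-85% range quoted above; the model ignores bandwidth limits entirely, which is exactly what a big cache would have to justify.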
 
Joined
Apr 12, 2013
Messages
7,532 (1.77/day)
When did the X1X have an RDNA-based GPU :wtf:

Also, don't extrapolate RDNA2 performance based on console numbers. They're not exactly comparable, it's more like comparing cashews to figs.
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.10/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
Joined
Apr 12, 2013
Messages
7,532 (1.77/day)
Based on Xbox Series X performance scaling over the X1X it doesn't seem like RDNA2 has much in the way of IPC improvements over RDNA.
You said this, how can it be interpreted any differently?
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.10/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
You said this, how can it be interpreted any differently?

I agree, that part of my comment was a bit confusing, but I didn't mean the X1X has RDNA; just that the real-world performance increase didn't suggest higher IPC than RDNA1 to me, based on how RDNA performs in comparison to the console.
 
Joined
Aug 5, 2019
Messages
808 (0.42/day)
System Name Apex Raptor: Silverback
Processor Intel i9 13900KS Allcore @ 5.8
Motherboard z790 Apex
Cooling LT720 360mm + Phanteks T30
Memory 32GB @8000MT/s CL36
Video Card(s) RTX 4090
Storage 990 PRO 4TB
Display(s) Neo G8 / C1 65"
Case Antec Performance 1
Audio Device(s) DT 1990 Pro / Motu M2
Power Supply Prime Ultra Titanium 1000w
Mouse Scimitar Pro
Keyboard K95 Platinum
You say Infinity Cache, I hear "we have chiplets on GPUs now".
 
Joined
Jan 11, 2005
Messages
1,491 (0.21/day)
Location
66 feet from the ground
System Name 2nd AMD puppy
Processor FX-8350 vishera
Motherboard Gigabyte GA-970A-UD3
Cooling Cooler Master Hyper TX2
Memory 16 Gb DDR3:8GB Kingston HyperX Beast + 8Gb G.Skill Sniper(by courtesy of tabascosauz &TPU)
Video Card(s) Sapphire RX 580 Nitro+;1450/2000 Mhz
Storage SSD :840 pro 128 Gb;Iridium pro 240Gb ; HDD 2xWD-1Tb
Display(s) Benq XL2730Z 144 Hz freesync
Case NZXT 820 PHANTOM
Audio Device(s) Audigy SE with Logitech Z-5500
Power Supply Riotoro Enigma G2 850W
Mouse Razer copperhead / Gamdias zeus (by courtesy of sneekypeet & TPU)
Keyboard MS Sidewinder x4
Software win10 64bit ltsc
Benchmark Scores irrelevant for me
me love to read comments!

 

bug

Joined
May 22, 2015
Messages
13,779 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Ok, who the hell calls Navi2 "Big Navi"?
Big Navi was a pipe dream of AMD loyalists left wanting for a first gen Navi high-end card.
 