It just seems like RDNA2 is RDNA1 with mediocre DXR support, tweaked for higher clocks. The Infinity Cache seems to have no effect at lower resolutions and only makes a difference at 4K, which is a bit silly because the 6700XT already runs at unplayable framerates in several of the tested games at 4K. Who cares if the cache improves performance by 30% when you're still only getting 24.1 fps?
If you look at some of the heavily factory-overclocked 5700XT models with 2150MHz clock speeds, they aren't doing a whole lot worse than the ~2450MHz 6700XT reference. I get the impression that a 1900MHz 6700XT would be close enough to a 5700XT that you'd struggle to see the difference in a side-by-side comparison; you'd question whether there was actually an improvement or whether it was just within margin of error.
People are arguing that RDNA2 is a huge architectural leap forward, but to me the results sure look like most of the performance gained over the 5700XT is proportional to the clock speed, meaning that IPC gains are close to zero and power efficiency takes a massive hit from running at those higher clocks.
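To make the "gains are just clocks" argument concrete, here's a back-of-envelope sketch: if performance scales as clock × IPC, you can back out the implied per-clock gain from a clock ratio and a measured performance ratio. The clock figures (2150MHz OC 5700XT vs ~2450MHz 6700XT) come from the post above; the 14% performance delta is a hypothetical stand-in, not a benchmark result.

```python
def implied_ipc_gain(clock_old_mhz, clock_new_mhz, perf_ratio):
    """Per-clock (IPC) gain implied by a performance ratio, assuming
    performance = clock * IPC (a simplification that ignores memory
    bandwidth, cache effects, and shader-count differences)."""
    clock_ratio = clock_new_mhz / clock_old_mhz
    return perf_ratio / clock_ratio - 1.0

# Hypothetical: suppose the 6700XT were 14% faster overall than a
# 2150MHz 5700XT. The clock uplift alone is 2450/2150 ~= +14%, so:
gain = implied_ipc_gain(2150, 2450, 1.14)
print(f"Implied IPC gain: {gain:+.1%}")  # essentially zero
```

If the whole performance delta is explained by the clock ratio, the implied IPC gain comes out near zero, which is the claim being made here.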
Well, it's true: the RDNA2 architecture enables levels of game performance that you simply can't get from RDNA1.