
AMD Radeon VII Retested With Latest Drivers

Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
Prey (DX11, NVIDIA Bias, 2017)
Oh come on. This game was launched with Vega as a confirmation of AMD-Bethesda cooperation.
Counting Prey as favouring AMD (i.e. it could be even worse) it's:
AMD: 4
Nvidia: 14
undecided: 4

AMD: 4/18 ~= 22%
Nvidia: 14/18 ~= 78%
which is basically the market share these companies have in discrete GPUs.
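For anyone who wants to check the tally, here it is as a quick sketch (counts are the ones from this post; undecided titles are excluded from the percentages, matching the arithmetic above):

```python
# Tally from the list above: 4 AMD-leaning, 14 Nvidia-leaning, 4 undecided.
# Undecided titles are excluded from the percentages.
counts = {"AMD": 4, "Nvidia": 14}
decided = sum(counts.values())  # 18

for vendor, n in counts.items():
    print(f"{vendor}: {n}/{decided} ~= {n / decided:.0%}")
# AMD: 4/18 ~= 22%
# Nvidia: 14/18 ~= 78%
```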

Do you think it should be 50:50? Or what? And why?
 
Joined
Jun 10, 2014
Messages
2,987 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Really, all they should do is focus on optimizing for the major game engines.
No, not at all. They should start focusing on optimizing the driver in general, not doing workarounds to "cheat" benchmarks.

Many have misconceptions about what optimizations really are. Games are rarely specifically optimized for targeted hardware, and likewise drivers are rarely optimized for specific games in their core. The few exceptions to this are cases to deal with major bugs or bottlenecks.

Games should not be written for specific hardware; they are written using vendor-neutral APIs. Game developers should focus on optimizing their engine for the API, and driver developers should focus on optimizing their driver for the API, because when they try to cross over, that's when things start to get messy. When driver developers "optimize" for games, they usually manipulate general driver parameters and replace some of the game's shader code, and in most cases it's not so much optimization as them trying to remove stuff without you seeing the degradation in quality. Games have long development cycles and are developed and tested against API specs, so it's obviously problematic when a driver suddenly diverges from spec and manipulates the game, and generally this causes more problems than it solves. If you have ever experienced a new bug or glitch in a game after a driver update, then you now know why…
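To make that concrete, here is a purely conceptual sketch of what a per-game "optimization" amounts to. The profile table, shader strings, and function names are all made up for illustration; this is not any vendor's actual driver code:

```python
import hashlib

def compile_normally(source: str) -> str:
    """Stand-in for the driver's regular, spec-compliant shader compiler."""
    return f"compiled({source})"

# Hypothetical per-game profile shipped inside the driver: the hash of a
# shader the game is known to submit, mapped to a cheaper substitute.
GAME_SHADER = "water_reflections_full_precision"
PROFILE = {
    hashlib.sha256(GAME_SHADER.encode()).hexdigest():
        "water_reflections_reduced_precision",
}

def compile_shader(source: str) -> str:
    digest = hashlib.sha256(source.encode()).hexdigest()
    if digest in PROFILE:
        # The game never learns its shader was swapped out.
        return compile_normally(PROFILE[digest])
    return compile_normally(source)

print(compile_shader(GAME_SHADER))           # substitute silently used
print(compile_shader(GAME_SHADER + "_v2"))   # a game patch breaks the match
```

Note how a game patch that changes the shader text silently breaks the match, which is one way these driver-side substitutions turn into the post-update glitches described above.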

This game "optimization" stuff is really just about manipulating benchmarks, and have been going on since the early 2000s. If only the vendors spend this effort on actually improving their drivers instead, then we'll be far better off!

Isn't it the case - the problem for AMD - that ALL games are tested and optimized for Nvidia GPUs?
Not at all. Many AAA titles are developed exclusively for consoles and then ported to PC; if anything, there are many more games with a bias favoring AMD than Nvidia.

Most people don't understand what causes games to be biased. First of all, a game is not biased just because it scales better on vendor A than vendor B. Bias is when a game has severe bottlenecks or special design considerations, intentional or "unintentional", that give one vendor a disadvantage it shouldn't have. That some games scale better on one vendor and others on the other isn't a problem by itself; games are not identical, and different GPUs have various strengths and weaknesses, so we should use a wide selection to determine real-world performance. Significant bias happens when a game is designed around one specific feature and scales badly on different hardware configurations. A good example of this is games which are built for consoles but don't really scale well with much more powerful hardware. In general, though, games are much less biased than most people think, and just because a benchmark doesn't confirm your presumptions doesn't mean the benchmark is biased.

AMD perf improves over time; Nvidia falls behind not only AMD, but also its own newer cards.
Over the past 10+ years, every generation has improved ~5-10% within its lifecycle.
AMD is no better at driver improvements than Nvidia; this myth needs to die.

hahaha....
but I hate their (NVIDIA) approach of reducing performance on older GPUs through driver updates.
FUD which has been disproven several times. I don't believe Nvidia has ever intentionally sabotaged older GPUs.
 
Joined
Feb 3, 2017
Messages
3,753 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
The game list is wrong. The actual games tested in Radeon VII review along with year, API and engine name are:
  • 2018 - DX11 - Assassin's Creed Odyssey (AnvilNext 2.0)
  • 2018 - DX11 - Battlefield V (Frostbite 3)
  • 2016 - DX11 - Civilization VI (Firaxis)
  • 2018 - DX11 - Darksiders 3 (Unreal Engine 4)
  • 2016 - DX12 - Deus Ex: Mankind Divided (Dawn)
  • 2017 - DX11 - Divinity Original Sin II (Divinity Engine)
  • 2018 - DX11 - Dragon Quest XI (Unreal Engine 4)
  • 2018 - DX11 - F1 2018 (EGO Engine 4.0)
  • 2018 - DX11 - Far Cry 5 (Dunia)
  • 2017 - DX11 - Ghost Recon Wildlands (AnvilNext)
  • 2015 - DX11 - Grand Theft Auto V (RAGE - Rockstar Advanced Game Engine)
  • 2017 - DX11 - Hellblade: Senua's Sacrifice (Unreal Engine 4)
  • 2018 - DX11 - Hitman 2 (Glacier 2.0)
  • 2018 - DX11 - Just Cause 4 (Apex)
  • 2018 - DX11 - Monster Hunter World (MT Framework)
  • 2017 - DX11 - Middle-earth: Shadow of War (LithTech)
  • 2015 - DX11 - Rainbow Six: Siege (AnvilNext)
  • 2018 - DX12 - Shadow of the Tomb Raider (Foundation)
  • 2018 - DX12 - Strange Brigade (Asura Engine)
  • 2015 - DX11 - The Witcher 3 (REDengine 3)
  • 2017 - Vulkan - Wolfenstein II (idTech6)
 
Last edited:
Joined
Aug 6, 2017
Messages
7,412 (2.78/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
The game list is wrong. The actual games tested in Radeon VII review along with year, API and engine name are:
  • 2018 - DX11 - Assassin's Creed Odyssey (AnvilNext 2.0)
  • 2018 - DX11 - Battlefield V (Frostbite 3)
  • 2016 - DX11 - Civilization VI (Firaxis)
  • 2018 - DX11 - Darksiders 3 (Unreal Engine 4)
  • 2016 - DX12 - Deus Ex: Mankind Divided (Dawn)
  • 2017 - DX11 - Divinity Original Sin II (Divinity Engine)
  • 2018 - DX11 - Dragon Quest XI (Unreal Engine 4)
  • 2018 - DX11 - F1 2018 (EGO Engine 4.0)
  • 2018 - DX11 - Far Cry 5 (Dunia)
  • 2017 - DX11 - Ghost Recon Wildlands (AnvilNext)
  • 2015 - DX11 - Grand Theft Auto V (RAGE - Rockstar Advanced Game Engine)
  • 2017 - DX11 - Hellblade: Senua's Sacrifice (Unreal Engine 4)
  • 2018 - DX11 - Hitman 2 (Glacier 2.0)
  • 2018 - DX11 - Just Cause 4 (Apex)
  • 2018 - DX11 - Monster Hunter World (MT Framework)
  • 2017 - DX11 - Middle-earth: Shadow of War (LithTech)
  • 2015 - DX11 - Rainbow Six: Siege (AnvilNext)
  • 2018 - DX12 - Shadow of the Tomb Raider (Foundation)
  • 2018 - DX12 - Strange Brigade (Asura Engine)
  • 2015 - DX11 - The Witcher 3 (REDengine 3)
  • 2017 - Vulkan - Wolfenstein II (idTech5)
Isn't Wolfenstein idTech 7 or 6+? I remember Doom was idTech 6, and then the devs said Wolfenstein was a big technical advancement over that. It supports half precision and variable rate shading.
 
Last edited by a moderator:
Joined
Mar 31, 2012
Messages
860 (0.19/day)
Location
NL
System Name SIGSEGV
Processor INTEL i7-7700K | AMD Ryzen 2700X | AMD Ryzen 9 9950X
Motherboard QUANTA | ASUS Crosshair VII Hero | MSI MEG ACE X670E
Cooling Air cooling 4 heatpipes | Corsair H115i | Noctua NF-A14 IndustrialPPC Fan 3000RPM | Arctic P14 MAX
Memory Micron 16 Gb DDR4 2400 | GSkill Ripjaws 32Gb DDR4 3400(OC) CL14@1.38v | Fury Beast 64 Gb CL30
Video Card(s) Nvidia 1060 6GB | Gigabyte 1080Ti Aorus | TUF 4090 OC
Storage 1TB 7200/256 SSD PCIE | ~ TB | 970 Evo | WD Black SN850X 2TB
Display(s) 15,5" / 27" /34"
Case Black & Grey | Phanteks P400S | O11 EVO XL
Audio Device(s) Realtek
Power Supply Li Battery | Seasonic Focus Gold 750W | FSP Hydro TI 1000
Mouse g402
Keyboard Leopold|Ducky
Software LinuxMint
Benchmark Scores i dont care about scores
This is an accusation you could take to court and become a millionaire. Can you prove it?

accusation? really?
I tested various driver versions on my GPU and benchmarked them. pfftt..
 
Joined
Feb 3, 2017
Messages
3,753 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Looking at the list of games:
- Assassin's Creed, Battlefield, Civilization, Far Cry, GTA, Just Cause, Tomb Raider, Witcher and Wolfenstein need to be in the list as the latest iterations of long-running game series along with their engines. The same applies to Hitman and possibly F1.
- Strange Brigade as a game is a one-off, but its engine is a newer iteration of the one behind Sniper Elite 4, which is one of the best DX12 implementations to date.
- Divinity: Original Sin 2, Monster Hunter and Shadow of War are a bit of one-offs as relevant and popular games running unique engines.
- Hellblade is a game that is artistically important and actually has a good implementation of Unreal Engine 4.
- R6: Siege is a unique case: despite its release year, it is a current and competitive game that is fairly heavy on GPUs.
- Deus Ex: Mankind Divided is a bit of a concession to AMD and DX12. It runs a modified version of the same Glacier 2 engine that is behind the Hitman games.
- I am not too sure about the choice or relevance of Darksiders 3, Dragon Quest XI and Wildlands. The latest big UE4 releases and one of Ubisoft's non-Assassin's Creed open-world games?

It is not productive to color games based on whether they use Nvidia GameWorks. The main problem with GameWorks, as far as Nvidia vs AMD is concerned, was that it was closed source, making it impossible for AMD to optimize for it if needed. GameWorks has been open source since 2016 or so. AMD does not have a branded program in the same way; GPUOpen and the tools and effects provided in it are non-branded but are present in a lot of games.
Isn't Wolfenstein idTech 7 or 6+? I remember Doom was idTech 6, and then the devs said Wolfenstein was a big technical advancement over that. It supports half precision and variable rate shading.
Wolfenstein II is idTech6. Fixed. Thanks.
 
Last edited:
Joined
Mar 10, 2015
Messages
3,984 (1.12/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
Good to see they sorted things out. Would have been nice to have this upfront considering the architecture isn't exactly new or anything.

it seems to me that Nvidia makes better graphics cards than AMD, which leaves those in the red camp having a difficult time justifying exactly why they are there..

The real problem is that anyone cares. Grow up and move on. (Not directed at you.)

I tested various driver versions on my GPU and benchmarked them.

They also could have introduced an issue accidentally and not circled back, because it is not current gen.
 
Joined
Aug 6, 2017
Messages
7,412 (2.78/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
I don't think there's much in it



* There's no performance degradation for the 780 Ti; there's a very slight improvement.
* 780 Ti vs 290X on 2016 drivers: the 780 Ti wins in 18 runs out of 28, the 290X in 10 out of 28.
 
Last edited:
Joined
Feb 3, 2017
Messages
3,753 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Both AMD and Nvidia have a list of featured games:
https://www.amd.com/en/gaming/featured-games
https://www.nvidia.com/en-us/geforce/games/latest-pc-games/#gameslist

I doubt W1zzard had that in mind or checked it when choosing games, but there are 6 benchmarked games from each vendor's featured list, and the games were not what I really expected:
AMD: Assassin's Creed Odyssey, Civilization VI, Deus Ex: Mankind Divided, Far Cry 5, Grand Theft Auto V, Strange Brigade.
Nvidia: Battlefield V, Darksiders 3, Divinity Original Sin II, Dragon Quest XI, Monster Hunter World, Shadow of the Tomb Raider.
:)
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,042 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
From https://www.techpowerup.com/reviews/AMD/Radeon_VII/

Assassin's Creed Origins (NVIDIA Gameworks, 2017)

Battlefield V RTX (NVIDIA Gameworks, 2018)

Civilization VI (2016)

Darksiders 3 (NVIDIA Gameworks, 2018), old game remaster; where's Titanfall 2?

Deus Ex: Mankind Divided (AMD, 2016)

Divinity Original Sin II (NVIDIA Gameworks, 2017)

Dragon Quest XI (Unreal 4 DX11, large NVIDIA bias, 2018)

F1 2018 (2018), why? Microsoft's Forza franchise is larger than this Codemasters game.

Far Cry 5 (AMD, 2018)

Ghost Recon Wildlands (NVIDIA Gameworks, 2017), missing Tom Clancy's The Division

Grand Theft Auto V (2013)

Hellblade: Senua's Sacrifice (Unreal 4 DX11, NVIDIA Gameworks)

Hitman 2

Monster Hunter World (NVIDIA Gameworks, 2018)

Middle-earth: Shadow of War (NVIDIA Gameworks, 2017)

Prey (DX11, NVIDIA Bias, 2017)

Rainbow Six: Siege (NVIDIA Gameworks, 2015)

Shadow of the Tomb Raider (NVIDIA Gameworks, 2018)

SpellForce 3 (NVIDIA Gameworks, 2017)

Strange Brigade (AMD, 2018),

The Witcher 3 (NVIDIA Gameworks, 2015)

Wolfenstein II (2017, NVIDIA Gameworks), results differ from https://www.hardwarecanucks.com/for...a-geforce-rtx-2080-ti-rtx-2080-review-17.html when a certain Wolfenstein II map exceeded the RTX 2080's VRAM
Except it’s AC Odyssey and it’s AMD, but good effort regardless...
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.10/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
You just need to stop projecting. It isn't hard.



I mean, just how shameless can it become, seriously?
You got what performance upfront, when the 960 beat the 780 Ti ($699)? Come again?

AMD perf improves over time; Nvidia falls behind not only AMD, but also its own newer cards.
As the card you bought gets older, NV doesn't give a flying sex act.
It takes quite some twisting to turn this into something positive.

What's rather unusual this time is AMD being notably worse on the perf/$ front, at least with the game list picked at TPU.

The 290X was slower than the 780 Ti at launch, but it cost $549 vs $699, so there goes "I get 10% at launch" again.

Kepler architecture aged badly for some reason, but Maxwell has aged as it should.
The 1060 and 980 were at the same level of performance back in 2016, and they still are in 2019.
Nothing has changed (except for games that need more than 4GB of VRAM).

I don't think someone who spends $700 on a card even cares about its performance after ~3 years; a high-end owner needs high-end performance and generally upgrades sooner than a mid-range user.
 
Last edited:
D

Deleted member 185088

Guest
I've heard of some sort of undervolting which improves the card's thermals greatly, and the whole process seems very easy. Why is no one bothering to use it?
 
Joined
Mar 10, 2015
Messages
3,984 (1.12/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
While all of this is very interesting, I fail to see how it impacts this GPU. New thread maybe? Perhaps some conspiracy can be brought to light from it.
 
Joined
Sep 17, 2014
Messages
22,439 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Kepler architecture aged badly for some reason, but Maxwell has aged as it should.
The 1060 and 980 were at the same level of performance back in 2016, and they still are in 2019.
Nothing has changed (except for games that need more than 4GB of VRAM).

Kepler aged badly because of VRAM, and that is exactly where AMD had more to offer in the high-end at the time.

They had a 7970 with 3GB VRAM to compete with GTX 670/680 2GB.
And later they had a 290X with 4GB VRAM to compete with GTX 780/780ti 3GB.

At the same time, the mainstream resolution slowly started moving to 1440p as Korean IPS panels were cheap overseas and many enthusiasts imported one. This heavily increased VRAM demands, alongside the consoles being released with 6GB to address, which meant the mainstream would quickly move to higher VRAM demands, and it happened across just 1.5 generations of GPUs; even in the Nvidia camp VRAM almost doubled across the whole stack, and then doubled again with Pascal. That is why people are liable to think AMD cards 'aged well' and Nvidia cards lost performance over time. This culminated in the release of the GTX 970 with its 3.5GB of 'fast' VRAM. That little bit of history ALSO underlines why AMD now releases a 16GB HBM card: they are banking on the idea that people THINK it might double again over time, which is why you can find some people advocating the 16GB as a good thing for gaming. And of course there's the expense of having to alter the card.

If any supporter of Red needed confirmation bias, there it is :toast:. But it doesn't make it any less of an illusion that Nvidia drivers handicap performance over time.
 
Last edited:
Joined
Dec 31, 2009
Messages
19,371 (3.56/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
I've heard of some sort of undervolting which improves the card's thermals greatly, and the whole process seems very easy. Why is no one bothering to use it?
They are... but not in reviews... the majority of people don't bother. There is also the point of, why should anyone have to do this in the first place?
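For intuition on why undervolting helps thermals so much: dynamic power scales roughly with frequency times voltage squared, so a modest voltage drop at the same clock cuts the heat output noticeably. A rough sketch with ballpark numbers (the stock and undervolt voltages below are illustrative, not measured values for any specific card):

```python
# Back-of-envelope: dynamic power ~ C * f * V^2, so at a fixed clock
# the power ratio is simply (V_new / V_old)^2. Voltages are illustrative.
v_stock = 1.09  # ballpark stock voltage
v_uv    = 0.99  # a commonly reported undervolt target

ratio = (v_uv / v_stock) ** 2
print(f"Dynamic power at the same clock: {ratio:.0%} of stock, "
      f"i.e. ~{1 - ratio:.0%} less heat to dissipate")
# -> about 82% of stock, ~18% less
```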
 
Joined
Nov 24, 2017
Messages
853 (0.33/day)
Location
Asia
Processor Intel Core i5 4590
Motherboard Gigabyte Z97x Gaming 3
Cooling Intel Stock Cooler
Memory 8GiB(2x4GiB) DDR3-1600 [800MHz]
Video Card(s) XFX RX 560D 4GiB
Storage Transcend SSD370S 128GB; Toshiba DT01ACA100 1TB HDD
Display(s) Samsung S20D300 20" 768p TN
Case Cooler Master MasterBox E501L
Audio Device(s) Realtek ALC1150
Power Supply Corsair VS450
Mouse A4Tech N-70FX
Software Windows 10 Pro
Benchmark Scores BaseMark GPU : 250 Point in HD 4600

M2B

Joined
Jun 2, 2017
Messages
284 (0.10/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
Kepler aged badly because of VRAM, and that is exactly where AMD had more to offer in the high-end at the time.

They had a 7970 with 3GB VRAM to compete with GTX 670/680 2GB.
And later they had a 290X with 4GB VRAM to compete with GTX 780/780ti 3GB.

At the same time, the mainstream resolution slowly started moving to 1440p as Korean IPS panels were cheap overseas and many enthusiasts imported one. This heavily increased VRAM demands, alongside the consoles being released with 6GB to address, which meant the mainstream would quickly move to higher VRAM demands, and it happened across just 1.5 generations of GPUs; even in the Nvidia camp VRAM almost doubled across the whole stack, and then doubled again with Pascal. That is why people are liable to think AMD cards 'aged well' and Nvidia cards lost performance over time. This culminated in the release of the GTX 970 with its 3.5GB of 'fast' VRAM. That little bit of history ALSO underlines why AMD now releases a 16GB HBM card: they are banking on the idea that people THINK it might double again over time, which is why you can find some people advocating the 16GB as a good thing for gaming. And of course there's the expense of having to alter the card.

If any supporter of Red needed confirmation bias, there it is :toast:. But it doesn't make it any less of an illusion that Nvidia drivers handicap performance over time.

True and untrue.
It's not just VRAM; yeah, VRAM requirements have risen significantly since then, but don't forget you can easily remove the VRAM bottleneck by lowering the texture quality.
The 970 was slower than the 780 Ti at launch, but that's not the case today; it's actually ahead, even in cases where VRAM isn't a limiting factor.
 
Last edited:
Joined
Nov 24, 2017
Messages
853 (0.33/day)
Location
Asia
Processor Intel Core i5 4590
Motherboard Gigabyte Z97x Gaming 3
Cooling Intel Stock Cooler
Memory 8GiB(2x4GiB) DDR3-1600 [800MHz]
Video Card(s) XFX RX 560D 4GiB
Storage Transcend SSD370S 128GB; Toshiba DT01ACA100 1TB HDD
Display(s) Samsung S20D300 20" 768p TN
Case Cooler Master MasterBox E501L
Audio Device(s) Realtek ALC1150
Power Supply Corsair VS450
Mouse A4Tech N-70FX
Software Windows 10 Pro
Benchmark Scores BaseMark GPU : 250 Point in HD 4600
True and untrue.
It's not just VRAM; yeah, VRAM requirements have risen significantly since then, but don't forget you can easily remove the VRAM bottleneck by lowering the texture quality.
The 970 was slower than the 780 Ti at launch, but that's not the case today; it's actually ahead, without VRAM being a limiting factor.
Pay $350+ just to play with lower texture quality!!!!
I have better advice: buy a console (or two).
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.10/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
Pay $350+ just to play with lower texture quality!!!!
I have better advice: buy a console (or two).

What the hell are you smoking?
 
Joined
Sep 17, 2014
Messages
22,439 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
But yeah, it's not just VRAM. If we're talking about aging with respect to the 3GB cards, we've also seen the introduction of delta compression and a much improved GPU Boost for Nvidia starting with Maxwell, while AMD was busy rebranding everything. Maxwell was the crucial moment where they actually gave up and created the gaping hole in performance and perf/watt we're still looking at today.
 
Last edited by a moderator:
Joined
Mar 10, 2014
Messages
1,793 (0.46/day)
Kepler aged badly because of VRAM, and that is exactly where AMD had more to offer in the high-end at the time.

They had a 7970 with 3GB VRAM to compete with GTX 670/680 2GB.
And later they had a 290X with 4GB VRAM to compete with GTX 780/780ti 3GB.

At the same time, the mainstream resolution slowly started moving to 1440p as Korean IPS panels were cheap overseas and many enthusiasts imported one. This heavily increased VRAM demands, alongside the consoles being released with 6GB to address, which meant the mainstream would quickly move to higher VRAM demands, and it happened across just 1.5 generations of GPUs; even in the Nvidia camp VRAM almost doubled across the whole stack, and then doubled again with Pascal. That is why people are liable to think AMD cards 'aged well' and Nvidia cards lost performance over time. This culminated in the release of the GTX 970 with its 3.5GB of 'fast' VRAM. That little bit of history ALSO underlines why AMD now releases a 16GB HBM card: they are banking on the idea that people THINK it might double again over time, which is why you can find some people advocating the 16GB as a good thing for gaming. And of course there's the expense of having to alter the card.

If any supporter of Red needed confirmation bias, there it is :toast:. But it doesn't make it any less of an illusion that Nvidia drivers handicap performance over time.

Not only that, it does not have tiled rasterization either, so memory bandwidth became a problem for it too. I would be interested to see the original Kepler Titan added to the TPU benchmarks; performance should be close to the RX 570/GTX 1060 3GB at 1080p. Does @W1zzard still have one? One interesting bit, though: there were 6GB GTX 780s too, but the GTX 780 Ti was always 3GB, as the 6GB card was the Titan Black... But yeah, we are going way off topic now.

Radeon VII has a lot of VRAM; the only weakness in its memory subsystem is its quite low ROP count. In normal tasks that is more than enough, but using MSAA can really tank performance. Luckily for AMD, MSAA is a dying breed; fewer and fewer games support that AA method anymore.
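A rough fill-rate sketch of that ROP argument (64 ROPs and the ~1.75 GHz clock are the card's published specs; treating 4x MSAA as ~4x ROP work per pixel is a worst-case simplification, since color compression and partial coverage reduce the real cost):

```python
# Each ROP can write roughly one sample per clock, so n-sample MSAA
# divides the effective pixel fill rate by up to n in the worst case.
rops = 64          # Radeon VII ROP count
clock_ghz = 1.75   # approximate boost clock

peak_gpix = rops * clock_ghz  # ~112 Gpixels/s at 1 sample per pixel
for samples in (1, 2, 4, 8):
    print(f"{samples}x MSAA: ~{peak_gpix / samples:.0f} Gpixels/s effective")
# 1x ~112, 2x ~56, 4x ~28, 8x ~14
```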
 
Joined
Jun 18, 2015
Messages
575 (0.17/day)
W1zzard says:
I'd focus on optimizations for UE4 next.

Nvidia is working closely with UE4 right now. I don't think AMD would risk working on UE4 optimizations only to have all that labor undone by a patch that lets AMD cards fall behind again, just like Nvidia did with the DX11 HAWX game.
 
Joined
Aug 6, 2017
Messages
7,412 (2.78/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
god damn, this thread is just too fun of a read for those who enjoy conspiracy theories.
 