Incorrect. Like every new and emerging technology, it takes time for software devs to catch up, take advantage of the new features, and push the new tech to its limits. Ray tracing is no different. Console development may push the tech forward a little faster, but that progress would happen anyway. Console development has always taken its cues from PC development. Even before the x86 trend began in the '80s, the Z80 CPU appeared in microcomputers before it showed up in game consoles, and the 6502 and 68000 followed the same path. Consoles have always followed the trends of the consumer computer sector of the industry.
These facts are not a bad thing. PCs showcase an emerging technology, and gaming companies look for ways to refine it and push it to its limits. Take the Nintendo Famicom/NES: the 6502 CPU gave rise to the custom Ricoh 2A03 (a modified 6502 core) in those systems. Nintendo took an existing, well-understood technology, tweaked it a little, and made something truly remarkable with it. Yet that CPU started in general and consumer computers first. Sega did the same thing with the Master System. Even though it wasn't very popular, its technical specs were customized with great results: the Z80 CPU was repurposed for that game system to solid effect.
This trend has never changed and is unlikely to change, as the existing development model works very well.
As for Doom Eternal, this is a game developed on and for PC as its primary platform. It will very likely get a console release too, once the console market catches up to the PC market with the release of new systems built on more current technology.
And yet the earlier VR systems were console-based, not PC-based. Today PSVR has gained market share faster than the Oculus or the Vive, though across all platforms the market is stalling. Granted, Oculus was first, but it didn't really get beyond dev-kit status until PSVR entered the market proper. I'll predict for you right now: Alyx will flop just as hard as any other PC-based VR content. It's a niche. On the console, it was a simple add-on with a low barrier to entry, and it exploded. Now it's back to gimmick status. Once the consoles pick this up again properly... just watch.
It's not quite as black and white anymore, really, and it's not true that the PC always leads the charge. Another example is the HTPC movement. Did the PC start that? I beg to differ... the TV itself, media/set-top boxes, and the PS3 really pushed forward the idea that the TV can be the screen in the living room: for streamed content, for internet browsing, and for gaming on any platform. That doesn't exclude a few nerds and some open-source apps on the PC years earlier... but it's clear the traction didn't come from the PC but from other segments. In-home streaming? Steam didn't start that... my PSP could connect to home Wi-Fi and play PS3-based content in the garden, pretty cool stuff back then. And it worked, too.
Another one to think of: mobile, and its developments porting over to PC. Another example of changes that don't originate or iterate on the PC platform but elsewhere. In the end they all marry into new products that are, ever more often, platform-independent or cross-platform. It's a bigger movement than you seem to realize. Look at the One Windows/Continuum attempt. Look at cloud services available anywhere. The device is increasingly just a gateway, and it matters less and less which device you actually use. Why do you think Sony and MS are slowly but surely bringing more and more first-party content to the PC? They are afraid they're missing out. Exclusives like Horizon Zero Dawn... those are system sellers. Do the math: selling the game is now deemed more vital than selling consoles.
I think the only area where the PC reliably leads is brute force. And brute force lets us compute things we couldn't before, or without much specific programming effort. Simulations, for example, are easier to brute-force than to code explicitly (compare the Crysis physics engine and graphics with, e.g., Battlefield's Levolution and its fixed levels of destruction, the latter getting a near-simultaneous console release too). The same applies to graphics: we can simply brute-force a higher render resolution, or add visual effects on the fly or with plugins, because the power is there. Consoles lack that power, but they incorporate the same tricks by design shortly after: they apply efficiency to the brute forcing, and the result is often some sort of technique that gets ported back to PC games as well. One generation later, those brute-forced effects are mainstream. Without the consoles, they'd never really get there in many cases.
RT is going to be an interesting one, but you can bet AMD has a more specific and streamlined, maybe less accurate, version of RT happening on the consoles: perhaps a lower resolution, fewer rays or bounces, or a limited set of material masks. After all, the console chip simply isn't as big, and die size matters, it always does... something's gotta give there.
Anyway... Doom Eternal
The GTX 970 is listed above the RX 470 4GB and R9 290 according to the system requirements.
And they also list the RX 480 8GB and the 970 4GB for the same setting (minus the texture resolution). If the RX 480 4GB could run the game on high textures, there would be no reason to list the 8GB version of the RX 480 instead of the 4GB one.
These system requirements make the 6-year-old GTX 970 look too good, if you ask me.
Really ??
according to system requirements
the 5.5-year-old GTX 970 runs it at 1080p, high settings (medium textures), 60 fps
the 3.5-year-old GTX 1060 6GB and RX 480 8GB run it at 1080p, high settings, 60 fps
(assuming these requirements are even accurate, because most of the time they are bullshit)
The only difference between the card with the larger memory and the one with the smaller is a single setting.
The GTX 970 is already outdated, and most people who had one have probably replaced it. Even if the GTX 970 had huge memory, it wouldn't be worth waiting 5 to 6 years just to change one setting. The GTX 970 already struggles to hit 60 fps at higher settings in a lot of modern games due to a lack of GPU power, and more memory won't make much difference. Texture resolution is the only setting that eats memory without having a big impact on GPU speed; all other settings will eat GPU speed before memory becomes an issue.
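To put a rough number on why texture resolution is almost purely a memory cost, here's a back-of-envelope sketch (illustrative only: it assumes uncompressed RGBA8 textures with a full mip chain, while real games use compressed formats like BC7 that shrink these figures considerably):

```python
# Back-of-envelope VRAM estimate for an uncompressed RGBA8 texture
# with a full mip chain. Each mip level is a quarter of the previous,
# so the chain adds roughly one third on top of the base level.

def texture_vram_mb(width, height, bytes_per_texel=4):
    total = 0
    w, h = width, height
    while True:
        total += w * h * bytes_per_texel
        if w == 1 and h == 1:
            break
        w, h = max(w // 2, 1), max(h // 2, 1)
    return total / (1024 * 1024)

# Doubling texture resolution roughly quadruples the memory cost:
for size in (1024, 2048, 4096):
    print(f"{size}x{size}: {texture_vram_mb(size, size):.1f} MB")
```

The point of the sketch: every doubling of texture resolution quadruples VRAM use while barely touching shader workload, which is why this is the one setting that a bigger-memory card can raise "for free."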
Wait. So a 6GB 1060 is still relevant, but a 970 with comparable performance is not, because age replaced it (chicken/egg)? Last I checked, GPUs easily last 7-10 years unless something craps out, and most of that can be repaired (e.g. a fan).
If the 970 had 4GB of fast VRAM it would be more relevant, and if it had 6GB it would just be a slightly slower 980 Ti and definitely relevant. But you're right anyway: most of the time these requirements are indeed bullshit! What we see in practice is that 3GB cards struggle, whereas a 4GB card can run at the same FPS but without sudden frame drops or stutter when VRAM is taxed. So yes, you will see the same FPS, but the delivery will be different on the lower-VRAM card, for sure. Been there, done that, seen it too often; you can throw a hundred benches at me, but what you really need is 95th-percentile figures at the same settings and similar FPS to see the truth. Or better yet, first-hand experience.
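The 95th-percentile point is easy to demonstrate with made-up numbers (illustrative only, not real benchmark data): two frame-time traces can average the same FPS while one hides ugly stutter spikes that only percentiles reveal.

```python
# Two synthetic frame-time traces (in milliseconds) with nearly the same
# average FPS: one smooth, one with occasional VRAM-style stutter spikes.

def percentile(samples, p):
    """Simple nearest-rank percentile over a list of samples."""
    s = sorted(samples)
    idx = min(int(len(s) * p / 100), len(s) - 1)
    return s[idx]

smooth = [16.7] * 100                # steady ~60 FPS
stutter = [14.5] * 95 + [60.0] * 5   # same-ish average, but big spikes

for name, trace in (("smooth", smooth), ("stutter", stutter)):
    avg_fps = 1000 / (sum(trace) / len(trace))
    p95 = percentile(trace, 95)
    print(f"{name}: avg {avg_fps:.0f} FPS, 95th percentile frame time {p95:.1f} ms")
```

Both traces report roughly 60 FPS average, but the stuttering one has a 95th-percentile frame time nearly four times worse; that's the difference an average-FPS bar chart hides.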
Another thing you likely won't get at these settings is a fixed 60 FPS; you get averages. So that muddies things a bit too... you can bet all these cards drop below 50 at times. The question is how badly and how often...
all other settings will eat GPU speed before memory becomes an issue.
Textures are the most elementary way to improve visual quality, though. Another big one is shadow-map resolution; you need VRAM for that too. And distinct assets/actors, like a hundred different outfits on screen, are also mostly a VRAM cost. You'd be surprised how many things are affected. GTA V has a beautiful way of showing you the impact of each setting; check that out sometime.
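For a sense of scale on the shadow-map point, a quick sketch with hypothetical numbers (assuming a 4-cascade cascaded shadow map and a 32-bit depth format; actual engines vary in cascade count and formats):

```python
# Hypothetical VRAM cost of a 4-cascade shadow map at common resolutions,
# assuming a 32-bit depth buffer (4 bytes per texel) for each cascade.

def csm_vram_mb(resolution, cascades=4, bytes_per_texel=4):
    return cascades * resolution * resolution * bytes_per_texel / (1024 * 1024)

for res in (1024, 2048, 4096):
    print(f"{res}x{res} x4 cascades: {csm_vram_mb(res):.0f} MB")
```

Just like textures, each step up in shadow resolution quadruples the memory footprint, so a "shadows: ultra" toggle can quietly claim a quarter gigabyte of VRAM before a single texture is loaded.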