Friday, August 24th 2018
3DMark's Time Spy With Raytracing to be Launched by the End of September
(Update: UL has come forward to clarify the way they're integrating Raytracing into their benchmarking suite. You can read the follow-up article here.)
UL (which acquired 3DMark's developer and is in the process of rebranding the benchmark under its own name) has revealed that the new, raytracing-capable version of its high-performance, high-quality Time Spy benchmark will arrive by the end of September.
The new version of the benchmark will be released around the launch of the next version of Microsoft's Windows 10 operating system, codenamed Redstone 5, and thus will land some time after NVIDIA's RTX 20-series launch on September 20th. Here's hoping it will be available in time for comparative reviews of NVIDIA's new family of products, so that some light can be shed on the new series' actual framerate delivery, and not just its GigaRays/sec figures.
Source:
TechSpot
70 Comments on 3DMark's Time Spy With Raytracing to be Launched by the End of September
That BF V RTX demo showed the wood on the gun giving ray-traced reflections, and that wasn't a very shiny object.
Open water in combo with horrid SSR (screen-space reflections) is my biggest visual distraction in games.
The truth is they want all of it and are sore losers. No one wants them, because they can't produce a worthwhile SoC.
The longer this continues, especially since AMD is getting Zen money, the more likely it becomes they get usurped and have nothing to fall back on.
Terms we apply to people, like "sore losers", don't apply to corporations. It really weakens your own argument when you try to portray them as having human personalities. Just a little tip. :)
"For convexes and trimeshes, it is indeed possible (and recommended) to share the cooked data between all instances."
100 boxes you can destroy with PhysX all share precooked data to speed up the process. It's the same in every other engine, and it's part of why games have gotten bigger: a lot of prerendered items, and you just have to make sure the right colour is applied.
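As a rough illustration of what "sharing the cooked data" means in practice, here is a minimal PhysX-3-style sketch (not from the quoted post; the globals gCooking, gPhysics, gScene, gMaterial and the spawnPos() helper are assumed to exist elsewhere in the application): the convex hull is cooked once, and every destructible box instance then references the same PxConvexMesh.

#include <PxPhysicsAPI.h>   // PhysX 3.x umbrella header
using namespace physx;

// Hypothetical helper: cook once, instance many times.
void SpawnDestructibleBoxes(const PxVec3* vertices, PxU32 vertexCount)
{
    PxConvexMeshDesc desc;
    desc.points.count  = vertexCount;
    desc.points.stride = sizeof(PxVec3);
    desc.points.data   = vertices;
    desc.flags         = PxConvexFlag::eCOMPUTE_CONVEX;

    // The expensive step: cook the hull once (or offline, ahead of time)...
    PxDefaultMemoryOutputStream cooked;
    gCooking->cookConvexMesh(desc, cooked);

    PxDefaultMemoryInputData input(cooked.getData(), cooked.getSize());
    PxConvexMesh* mesh = gPhysics->createConvexMesh(input);

    // ...then every destructible box references the same cooked mesh data.
    for (int i = 0; i < 100; ++i)
    {
        PxRigidDynamic* box = gPhysics->createRigidDynamic(PxTransform(spawnPos(i)));
        PxRigidActorExt::createExclusiveShape(*box, PxConvexMeshGeometry(mesh), *gMaterial);
        gScene->addActor(*box);
    }
}

Only the shape instances are per-object; the cooked hull data itself lives in the one shared PxConvexMesh, which is the sharing the quoted documentation recommends.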
If it performs OK against the "RTX" cards with RT on, it will of course be even more interesting.
As has been mentioned elsewhere, Vega has a lot of horsepower, but it is not well utilised a lot of the time. If RT could be done on, e.g., the 20-25% that goes unused, it might work.
Additionally, the NV presentation showed a new pipeline, a "kind of async", where integer ops are done in parallel with floating-point work. I wonder whether DXR etc. are implemented so that these ops really do run fully in parallel. If that's the case (and the apparent new push for two-card setups hints at it), then with a second card (e.g. a second Vega) the software might be able to utilise it better, for example by putting the required RT/integer calculations onto the second card, or by spreading the various calculations over the total cores to get the right balance for the best performance.
It would seem that's the way things are going.
If this does turn out to be the case, for those wanting more performance: what would a TR system with 4 x V56/64s perform like, and how much would it cost, if a 2080 Ti is 2x the cost of a V64? (By that maths, four V64s would run roughly twice the GPU cost of a single 2080 Ti.) Obviously either option is expensive, and the power bill/cooling could be a factor, but that's less of a concern if you can afford that kind of system.
Wouldn't it be funny if cards could be given a calculation to do with very little data transfer in, and results that also don't need much transfer out? You could run your own supercomputer with 10+ cards on one of those mining boards with 1x slots, or some board made for it with a load of 4x slots. (This last bit is a bit of fun, BTW!)
/ramble
I like it when someone corrects you and you then have to go and brush up on history.
But in this analogy, Tesla would be the Ageia PhysX cards?
edit: Not that anyone's doing this here. It's just a side note… about all of the weird conspiracies surrounding Tesla and people promoting magical Tesla crap. "Woo", I think, is the term on RationalWiki.
www.geeks3d.com/forums/index.php?topic=5077.0
I guess Vega's market penetration was so appalling they chose not to bother.
DX12 is a tech for developers: it changes a lot about how they work, and it generally makes writing games much harder and costlier. But the actual impact on how games look isn't that great, so it didn't get traction in the business.
Ray tracing is a huge improvement for customers. If it makes it to mainstream titles, we'll be in a new era of gaming. Well... at least those of us who care more about visuals than about fps. :)
I mean, any game can "look" great if you don't care about fps. :D
60fps is great, but let's be honest: 30fps is still fine.
fps below 30 takes some fun away, but man... I remember years of Intel HD gaming, when I played games like Oblivion on low settings - with frames dropping to maybe 10 or 15 during flashy fights. And it was *so much fun* nevertheless.
Looking at the RTX samples, it's a huge change. And we know it's the next big thing in gaming, because - let's be honest - 4K@60fps is here and 99% of gamers don't need more. Also bad.
It's not even about the API itself being badly written. It's the concept itself: you have much more direct control over what's going on in the GPU, so if you spend enough time coding, it works better. But it also means way more complicated code, more differences between platforms, worse porting and so on. It's more dependent on GPU firmware as well...
Older DirectX versions were pretty simple, even for programmers who only needed to go 3D for an occasional project or just for fun.
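To make the "more direct control" point concrete, here is a hedged, hypothetical C++ fragment (not taken from the post; it assumes the device, queue, allocator, command list, fence and event have already been created elsewhere): in D3D12 the application records a command list, submits it and then synchronises with the GPU itself via a fence, bookkeeping the D3D11 runtime used to do behind the scenes.

#include <windows.h>
#include <d3d12.h>

// Hypothetical per-frame submit; all objects are assumed to be created elsewhere.
void SubmitFrame(ID3D12CommandQueue* queue,
                 ID3D12CommandAllocator* allocator,
                 ID3D12GraphicsCommandList* cmdList,
                 ID3D12Fence* fence,
                 UINT64& fenceValue,
                 HANDLE fenceEvent)
{
    // In D3D11 the runtime tracked when command memory could be reused;
    // in D3D12 the application must only Reset() once the GPU is done with it.
    allocator->Reset();
    cmdList->Reset(allocator, nullptr);

    // ... record draw/dispatch calls and explicit resource barriers here ...

    cmdList->Close();
    ID3D12CommandList* lists[] = { cmdList };
    queue->ExecuteCommandLists(1, lists);

    // Explicit CPU/GPU synchronisation via a fence; previously this was implicit.
    ++fenceValue;
    queue->Signal(fence, fenceValue);
    if (fence->GetCompletedValue() < fenceValue)
    {
        fence->SetEventOnCompletion(fenceValue, fenceEvent);
        WaitForSingleObject(fenceEvent, INFINITE);
    }
}

Done well, this explicit control is where the extra performance comes from; done badly, it is exactly the "more complicated code, worse porting" cost described above.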