Friday, August 16th 2019

Assetto Corsa Competizione Dumps NVIDIA RTX
Assetto Corsa Competizione, the big AAA racing simulator slated for a September release, will lack support for NVIDIA RTX real-time raytracing technology, not just at launch, but also for the foreseeable future. Responding to a specific question on the Steam Community forums, the Italian game studio Kunos Simulazioni confirmed that the game will not receive NVIDIA RTX support.
"Our priority is to improve, optimize, and evolve all aspects of ACC. If after our long list of priorities the level of optimization of the title, and the maturity of the technology, permits a full blown implementation of RTX, we will gladly explore the possibility, but as of now there is no reason to steal development resources and time for a very low frame rate implementation," said the developer, in response to a question about NVIDIA RTX support at launch. This is significant, as Assetto Corsa Competizione was one of the posterboys of RTX, and featured in the very first list by NVIDIA, of RTX-ready games under development.
Source: Darth Hippious (Steam Community)
"Our priority is to improve, optimize, and evolve all aspects of ACC. If after our long list of priorities the level of optimization of the title, and the maturity of the technology, permits a full blown implementation of RTX, we will gladly explore the possibility, but as of now there is no reason to steal development resources and time for a very low frame rate implementation," said the developer, in response to a question about NVIDIA RTX support at launch. This is significant, as Assetto Corsa Competizione was one of the posterboys of RTX, and featured in the very first list by NVIDIA, of RTX-ready games under development.
91 Comments on Assetto Corsa Competizione Dumps NVIDIA RTX
It was obvious right after RTX launch. Initial comments on forums like this one have shown that gamers don't understand how rendering works and that ray tracing has been around for decades. They blamed Nvidia for inventing it and so on.
And now, almost a year later, people still don't get that RTX is just a hardware RT implementation from Nvidia - just like any GPU architecture is basically a hardware implementation of pixel shading etc. All game graphics can be done on CPUs - using open-source APIs. But I doubt gamers would praise the resulting fps...
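Just to make the "it's only math" point concrete, here's a toy CPU ray/sphere intersection - a purely illustrative sketch, nowhere near a production renderer, but it's exactly the kind of computation RT cores accelerate in hardware:

```cpp
#include <cmath>
#include <cstdio>

// Minimal CPU ray/sphere intersection - the core math any ray tracer evaluates
// millions of times per frame, with or without dedicated hardware.
struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Distance along the ray to the nearest hit, or -1 on a miss.
float hit_sphere(Vec3 origin, Vec3 dir, Vec3 center, float radius) {
    Vec3 oc = sub(origin, center);
    float a = dot(dir, dir);
    float b = 2.0f * dot(oc, dir);
    float c = dot(oc, oc) - radius * radius;
    float disc = b * b - 4.0f * a * c;
    return disc < 0.0f ? -1.0f : (-b - std::sqrt(disc)) / (2.0f * a);
}

int main() {
    // One primary ray from the origin, straight down -z, at a unit sphere.
    float t = hit_sphere({0, 0, 0}, {0, 0, -1}, {0, 0, -3}, 1.0f);
    std::printf("hit at t = %.2f\n", t);  // prints: hit at t = 2.00
}
```

Now multiply that by a few rays per pixel, per bounce, per frame at 60 fps and you see why nobody runs it on a CPU for games.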
Maybe the RTX API will vanish as open APIs catch up. Maybe it will stay and let developers write more optimal code for Nvidia (just like CUDA vs OpenCL). That's really not that important.
RTX cores will stay and AMD's hardware implementation will join soon enough. That's it. 10 years from now no one will remember all this nonsense. RTRT games will become a standard and work on all modern GPUs. CUDA, too, caught on - despite the initial target audience being STEM scientists, who usually advocate open-source solutions. It was that good. And it became widely adopted among millions of coders.
Can you give me one good reason why RTX would fail, when it only needs a handful of game studios to adopt it? And basically all games are closed-source, so it's not like they care about that aspect as much as you do. :-)
There are places in which it does add to the picture and atmosphere. But that's the problem: it feels like a trick, not like something that's supposed to be there.
Anyhow, I was speaking generally. There can and will be exceptions.
Actually you do all the time. It's a game - you don't know if you'll like it. You may read a review and get a general idea whether it's good or not, but your subjective impression may differ.
Same goes for movies, books, food in restaurants etc.
With hardware you have the luxury of having benchmarks that tell you how it performs. And samples give you a very good idea of what to expect in image quality. So before buying you can fairly easily say if RTRT is worth the premium.
Adoption is a different matter. And here's the thing. If you're a casual gamer without brand preference and not involved in brand wars, by now you should be fairly convinced RTRT is going to be mainstream pretty soon. Nvidia can accelerate it. Next-gen consoles will. AMD will as well. Most big game studios either offer/develop it already or openly admit it's coming.
But among hardcore AMD fans (i.e. anti-Nvidia) it's still a common theory that RTRT will fail - just because the other company started the revolution. It was strong initially, but it's still strong now - when we know that in 2-3 years every new gaming PC and console will offer hardware acceleration. That's not true either. Again: a very biased opinion, but this time not because of brand loyalty but because of your general approach to gaming.
People on this forum tend to game a lot and play many titles. I expect you have a few dozen in your Steam library, right? :)
This is NOT how mainstream gaming looks. The whole gaming business is focused around a small set of very popular franchises: GTA, TES, Diablo, Warcraft, FIFA, CoD, Battlefield, Assassin's Creed, Tomb Raider, Fallout, The Witcher, Civilization, Gran Turismo - things like that.
Nvidia doesn't need to win all game developers. It's enough to get RTRT support in ~half of the top series. At that point almost every gamer will have at least one game that benefits from this tech - often the one he plays the most.
Next year consoles will start supporting RTRT so this will really become a standard. Nvidia's strategy assumes that RTRT will become a key selling point in GPUs and AMD's first implementation will be rushed and lacking. This will let them dominate sales for another 2-3 years - even if AMD catches up on performance and efficiency (thanks to Arcturus, 7nm EUV or divine intervention).
Where did you get that "50%" from?
CUDA is the standard and it dominates GPGPU. There's basically nothing else. The only reason for people to use OpenCL in scientific / engineering GPGPU is when the program has to run on CPUs as well. And that's fairly rare.
Gamers underestimate CUDA's popularity - likely simply because they have no idea what they're talking about (why would they?).
Ask a scientist. You'll see I'm right. :-)
Something's gotta give in this balancing act and it's not always going to be pretty. So you can safely say truly reliable RT that truly adds to the experience is quite far away still.
I don't think people (me included, and I stand / stood corrected on that one) are saying RT is never going to happen. I think many people have a bit of a rosy outlook on how these technologies evolve. RT right now is a complete black box wrt perf/dollar and performance within the product stacks. Nobody has a clue what ten gigarays even means for their image quality. Even AMD hasn't said a single thing about what's to be expected, and Nvidia's implementations are all over the place, between tech demo and in-game. We also merely think a separate box of transistors and the whole denoising affair (there is still quite a bit of 'brute forcing' involved, more is rendered than we are seeing) is the best practice right now, but it really doesn't have to be. We've seen many technologies slimmed down or altered radically to make them 'fit' in the pipeline. And that pipeline is still raster-based. As long as that is the case, it will be a slow, uphill climb.
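For a bit of back-of-the-envelope context on the gigarays figure (assumed, best-case numbers - which is exactly why the raw number tells you nothing about image quality):

```cpp
#include <cstdio>

int main() {
    // Assumed, best-case marketing numbers - real throughput depends heavily
    // on the scene, ray coherence, and how much the denoiser has to clean up.
    const double rays_per_second  = 10e9;             // "ten gigarays"
    const double pixels_per_frame = 1920.0 * 1080.0;  // ~2.07 million at 1080p
    const double frames_per_second = 60.0;

    double rays_per_pixel = rays_per_second / (pixels_per_frame * frames_per_second);
    std::printf("~%.0f rays per pixel per frame\n", rays_per_pixel);  // ~80
    // That budget is shared between reflections, shadows and GI rays, and in
    // practice games cast only a few rays per pixel and denoise the rest.
}
```

Even that division hides everything that matters: a peak number from a simple scene says nothing about what the denoised result looks like in an actual game.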
Microsoft made very ambiguous claims about RT and the only thing Sony was explicit about is using ray tracing for "sound". None of this sounds particularly exciting; if they were really RT-capable this stuff would have been plastered all over, but instead they are awfully quiet and vague.
They had the time, the money and the R&D potential to develop an in-house solution (likely based on FPGA).
It's really not that hard. Students and scientists design simple RT FPGA accelerators for research. You can even download ready-to-go solutions, like this one: github.com/justingallagher/fpga-trace
Why would they?
Most tech companies are fairly reserved when it comes to future products. RTX was a surprise as well. We knew Nvidia is working on it, but nothing else.
In fact: what do we really know about PS5 (expected in under 12 months)? It'll use AMD. It'll game in 4K and maybe scale to 8K. Anything else?
IMO there's a very good reason for this: console buyers don't give a f... They just want to play games. They don't even care about fps as long as the game looks smooth. And they surely understand ray tracing even less than PC gamers.
Shiny rays will definitely sell and would be much more of a selling point for young gamers than 4K or 8K. That resolution depends on the telly mom and dad / the kid have in the living room - or upstairs, which is even less likely.
Last console gen was a real battle comparing the actual specs, and it's the reason the PS4 came out swinging right from the start - it was faster. It played BF in a higher res than the X1. Etc.
Yes, and I can write my own OS. Now you're just grasping at straws, because neither company has said anything about planning to do its own chip design for either console. That's the whole point of working with AMD: the company can make them a custom SoC that has everything they need.
Navi's RT for consoles will be an afterthought, just like the timing of its announcement.
A completely different solution could be implemented, but given that performance is not sufficient for full RT and hybrid solutions still need to be integrated into the rasterization pipeline, there is a considerable technical challenge here, both in hardware and in software. Nvidia opened this up with what seems to be a sensible starting solution, this approach is standardized as DXR (notably, a Microsoft thing), and creating a custom alternative for the sake of it does not sound like the best of ideas. At least for Microsoft; Sony consoles are more into proprietary APIs.
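For what it's worth, on the PC side DXR support is just a capability tier an engine can query before deciding between the hybrid RT path and plain raster. A minimal sketch of that check using the public D3D12 feature query (device creation and everything around it is assumed):

```cpp
#include <d3d12.h>

// Assumes an already-created D3D12 device; all other setup is omitted.
bool SupportsDXR(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    // Tier 1.0 or better means the driver/GPU exposes DXR; otherwise the
    // engine falls back to its regular rasterization path.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

Whether a console exposes something equivalent is a different question, but either way the raster pipeline stays the baseline and RT effects get layered on top.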
Plus, blocky graphics + realistic lighting = :love:
However, support is one thing; doing worthwhile things with it is another. Minecraft looks brilliant to me, because it manages to add some realism into a world that's made out of blocks. I can't put my finger on it, but the whole thing just works. At least for me. And that's what's going to make RTRT take off, not just the ability to run RTRT at a constant 60fps.
docs.unrealengine.com/en-US/Engine/Rendering/RayTracing/index.html