Friday, August 16th 2019

Assetto Corsa Competizione Dumps NVIDIA RTX

Assetto Corsa Competizione, the big AAA racing simulator slated for a September release, will lack support for NVIDIA RTX real-time ray tracing technology, not just at launch but for the foreseeable future. Italian game studio Kunos Simulazioni confirmed, in response to a specific question on the Steam Community forums, that the game will not receive NVIDIA RTX support.

"Our priority is to improve, optimize, and evolve all aspects of ACC. If after our long list of priorities the level of optimization of the title, and the maturity of the technology, permits a full blown implementation of RTX, we will gladly explore the possibility, but as of now there is no reason to steal development resources and time for a very low frame rate implementation," said the developer, in response to a question about NVIDIA RTX support at launch. This is significant, as Assetto Corsa Competizione was one of the posterboys of RTX, and featured in the very first list by NVIDIA, of RTX-ready games under development.
Source: Darth Hippious (Steam Community)

91 Comments on Assetto Corsa Competizione Dumps NVIDIA RTX

#76
Apocalypsee
RTRT is still in a very early stage of its life in gaming. This happened before, during the early days of pixel/vertex shading, when Microsoft introduced it in DirectX. As long as there is segregation in the product line (Ti = DX8, MX = DX7; RTX = accelerated RTRT, GTX = no accelerated RTRT), this won't get proper traction. I still remember the early days of pixel shading in games; the difference with it on and off was very little. Look at Metro Exodus below: can you tell which shot has RTRT on and which has it off? This is, IMO, THE BEST RTRT currently on offer.
[Metro Exodus screenshots: RTRT on vs. off]
Posted on Reply
#77
eidairaman1
The Exiled Airman
Apocalypsee: RTRT is still in a very early stage of its life in gaming. This happened before, during the early days of pixel/vertex shading, when Microsoft introduced it in DirectX. As long as there is segregation in the product line (Ti = DX8, MX = DX7; RTX = accelerated RTRT, GTX = no accelerated RTRT), this won't get proper traction. [...]

Hence it's a waste of money, and an RTX 2080 Ti at $1,200 is not worth it at all.
Posted on Reply
#78
notb
Steevo: There are other forms of open-source ray tracing that give superior results with lower computational overhead.
Name one. ;-)
ZoneDymo: I just don't get it. Haven't we all wanted real-time ray tracing for, like, two decades now? And it's finally being worked on, with a good shot at becoming a thing... and people laugh when a company drops support?
I think people are still confused about what's happening.
It was obvious right after the RTX launch. Initial comments on forums like this one showed that gamers don't understand how rendering works, or that ray tracing has been around for decades. They blamed Nvidia for inventing it, and so on.

And now, almost a year later, people still don't get that RTX is just a hardware RT implementation from Nvidia - just like any GPU architecture is basically a hardware implementation of pixel shading and so on. All game graphics can be done on CPUs, using open-source APIs. But I doubt gamers would praise the resulting fps...

Maybe the RTX API will vanish as open APIs catch up. Maybe it will stay and let developers write more optimal code for Nvidia (just like CUDA vs. OpenCL). That's really not that important.
RT cores will stay, and AMD's hardware implementation will join soon enough. That's it. Ten years from now, no one will remember all this nonsense. RTRT games will become the standard and work on all modern GPUs.
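To make the "ray tracing is decades old and runs on CPUs" point concrete, here is a minimal, hypothetical sketch of the core operation, a ray-sphere intersection test, as one CUDA/C++ source file. Nothing in it depends on RTX hardware; RT cores merely accelerate this same arithmetic.

```cuda
// Minimal sketch (hypothetical, not from any engine): the heart of ray
// tracing is plain geometry that has run on CPUs since the 1980s. RT cores
// accelerate exactly this kind of arithmetic; they did not invent it.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

__host__ __device__ float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
__host__ __device__ Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// Distance t along the ray to the first hit on a sphere, or -1 on a miss.
__host__ __device__ float hitSphere(Vec3 center, float radius, Vec3 orig, Vec3 dir) {
    Vec3 oc = sub(orig, center);
    float a = dot(dir, dir);
    float b = 2.0f * dot(oc, dir);
    float c = dot(oc, oc) - radius * radius;
    float disc = b * b - 4.0f * a * c;
    return disc < 0.0f ? -1.0f : (-b - sqrtf(disc)) / (2.0f * a);
}

int main() {
    // One ray, one sphere: every ray tracer repeats this test millions of
    // times per frame, whether on a CPU, on shader cores, or on RT cores.
    float t = hitSphere({0.0f, 0.0f, -3.0f}, 1.0f, {0.0f, 0.0f, 0.0f}, {0.0f, 0.0f, -1.0f});
    std::printf("hit at t = %.2f\n", t); // expect 2.00
    return 0;
}
```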
R-T-B: Anything with a brand attached to its name is a dead end. Open, brand-neutral standards are the way forward.
And yet CUDA caught on - despite the initial target audience being STEM scientists, who usually advocate open-source solutions. It was that good, and it became widely adopted among millions of coders.

Can you give me one good reason why RTX would fail, when it only needs a handful of game studios to adopt it? And basically all games are closed-source, so it's not like they care about that aspect as much as you do. :-)
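For context on why CUDA spread so quickly among coders: a GPU kernel is barely harder to write than a C loop. A minimal, self-contained SAXPY sketch (illustrative only; nothing here is tied to RTX or to any game code):

```cuda
// A minimal sketch of why CUDA won over STEM coders: a GPU kernel is just a
// C function plus a launch line.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));   // unified memory: no manual copies
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);  // grid of 256-thread blocks
    cudaDeviceSynchronize();

    std::printf("y[0] = %f\n", y[0]);           // expect 4.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```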
Posted on Reply
#79
Vayra86
Apocalypsee: RTRT is still in a very early stage of its life in gaming. [...] Look at Metro Exodus below: can you tell which shot has RTRT on and which has it off? This is, IMO, THE BEST RTRT currently on offer.

Well, to be fair to Metro's RT implementation, I don't expect much global illumination indoors ;)

There are places where it does add to the picture and the atmosphere. But that's the problem: it feels like a trick, not like something that's supposed to be there.
Posted on Reply
#80
R-T-B
notb: And yet CUDA caught on
Sort of. It's got the performance lead on its side, and it's still only got about 50% of scientific applications.

Anyhow, I was speaking generally. There can and will be exceptions.
Posted on Reply
#81
notb
Totally: We don't preorder games on promises of what's to come. Why should that notion not apply to hardware as well?
Funny argument. ;-)
Actually, you do it all the time. It's a game - you don't know if you'll like it. You may read a review and get a general idea of whether it's good or not, but your subjective impression may differ.
The same goes for movies, books, food in restaurants, and so on.

With hardware, you have the luxury of benchmarks that tell you how it performs, and samples give you a very good idea of what to expect in image quality. So before buying, you can fairly easily decide whether RTRT is worth the premium.

Adoption is a different matter. And here's the thing: if you're a casual gamer without brand preference and not involved in brand wars, by now you should be fairly convinced that RTRT is going to be mainstream pretty soon. Nvidia can accelerate it. Next-gen consoles will. AMD will as well. Most big game studios either already offer or are developing it, or openly admit it's coming.

But among hardcore AMD fans (i.e. anti-Nvidia), it's still a common theory that RTRT will fail - just because the other company started the revolution. That belief was strong initially, and it's still strong now, when we know that in 2-3 years every new gaming PC and console will offer hardware acceleration.
Totally: As for things being newsworthy or not, until devs start getting games with RTX out the door, it doesn't matter who pledges to support it. There has been a large number of pledges for quite some time already, but the games that have shipped with RTX can be counted on one hand.
That's not true either. Again, a very biased opinion - but this time not because of brand loyalty, rather because of your general approach to gaming.
People on this forum tend to game a lot and play many titles. I expect you have a few dozen games in your Steam library, right? :)

This is NOT how mainstream gaming looks. The whole gaming business is focused on a small set of very popular franchises: GTA, TES, Diablo, Warcraft, FIFA, CoD, Battlefield, Assassin's Creed, Tomb Raider, Fallout, The Witcher, Civilization, Gran Turismo - things like that.
Nvidia doesn't need to win over all game developers. It's enough to get RTRT support into roughly half of the top series. At that point almost every gamer will have at least one game that benefits from this tech - often the one they play the most.

Next year consoles will start supporting RTRT, so this will really become a standard. Nvidia's strategy assumes that RTRT will become a key selling point in GPUs and that AMD's first implementation will be rushed and lacking. This would let them dominate sales for another 2-3 years - even if AMD catches up on performance and efficiency (thanks to Arcturus, 7 nm EUV, or divine intervention).
R-T-B: Sort of. It's got the performance lead on its side, and it's still only got about 50% of scientific applications.
Where did you get that 50% from?
CUDA is the standard, and it dominates GPGPU. There's basically nothing else. The only reason to use OpenCL in scientific or engineering GPGPU is when the program has to run on CPUs as well, and that's fairly rare.

Gamers underestimate CUDA's popularity - likely because they have no idea what they're talking about (why would they?).
Ask a scientist; you'll see I'm right. :-)
Posted on Reply
#82
Vayra86
notb: Funny argument. ;-) Actually, you do it all the time. [...] By now you should be fairly convinced that RTRT is going to be mainstream pretty soon. Nvidia can accelerate it. Next-gen consoles will. AMD will as well. [...] Next year consoles will start supporting RTRT, so this will really become a standard. [...] Ask a scientist; you'll see I'm right. :)
'Pretty soon' and 'next-gen consoles will accelerate it' don't mix very well. As of now there is no traction whatsoever, and the next generation will only be the early beginning of the real RT push, because only then will everyone be able and ready to play along. Even then, games will still need to be developed with RT from the ground up, not as a lazy add-on that you can just as well turn off. That is where the problem lies, and where the time will really be consumed in the coming years. Devs will look for a balance, also in dev time, between raster and RT - and neither of those contributes to core gameplay ;) In the end you still need a product that's viable beyond the graphics you see.

Something's gotta give in this balancing act, and it's not always going to be pretty. So you can safely say that truly reliable RT, the kind that genuinely adds to the experience, is still quite far away.

I don't think people (me included - and I stand corrected on that one) are saying RT is never going to happen. I think many people have a bit of a rosy outlook on how these technologies evolve. RT right now is a complete black box with respect to perf/dollar and performance within the product stacks. Nobody has a clue what ten gigarays even means for their image quality. Even AMD hasn't said a single thing about what to expect, and Nvidia's implementations are all over the place, somewhere between tech demo and in-game. We also merely assume that a separate box of transistors plus the whole denoising affair (there is still quite a bit of brute forcing involved; more is rendered than we actually see) is the best practice right now, but it really doesn't have to be. We've seen many technologies slimmed down or altered radically to make them 'fit' in the pipeline. And that pipeline is still raster-based. As long as that is the case, it will be a slow, uphill climb.
Posted on Reply
#83
Vya Domus
There is a good chance that the next generation of consoles actually won't support RT properly. I find it hard to believe there was a different iteration of Navi with RT cores being developed alongside what we got on desktop. We have to keep in mind that Sony and MS aren't screwing around; by now they have likely finalized the hardware, meaning whatever AMD made for them was finalized at the very least before that. I believe there was simply no room for AMD to have made multiple designs of Navi in all this time.

Microsoft made very ambiguous claims about RT, and the only thing Sony was explicit about is using ray tracing for "sound". None of this sounds particularly exciting. If they were really RT-capable, this stuff would have been plastered all over; instead, they are awfully quiet and vague.
Posted on Reply
#84
notb
Vya Domus: There is a good chance that the next generation of consoles actually won't support RT properly. I find it hard to believe there was a different iteration of Navi with RT cores being developed alongside what we got on desktop. [...]
Sony and MS could develop their own RT chips. It doesn't have to be put on the GPU die. They surely knew how advanced Nvidia's development was (most likely Nvidia simply told them during negotiations).
They had the time, the money, and the R&D potential to develop an in-house solution (likely based on an FPGA).

It's really not that hard. Students and scientists design simple RT FPGA accelerators for research. You can even download ready-to-go solutions, like this one:
github.com/justingallagher/fpga-trace
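For a sense of what such an accelerator actually hardwires: the inner loop of any ray tracer, FPGA or RT core alike, is a ray-triangle intersection test driven by BVH traversal. Below is a sketch of the textbook Möller-Trumbore test (illustrative only; this is not code from the linked repo):

```cuda
// The textbook Möller-Trumbore ray-triangle test: the operation an RT
// accelerator (FPGA or RT core) evaluates for millions of rays per frame.
// Sketch only; not taken from the fpga-trace repo linked above.
#include <cstdio>

struct Vec3 { float x, y, z; };

__host__ __device__ Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
__host__ __device__ float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
__host__ __device__ Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// True (with hit distance *t) if the ray (orig, dir) crosses triangle v0-v1-v2.
__host__ __device__ bool intersect(Vec3 orig, Vec3 dir,
                                   Vec3 v0, Vec3 v1, Vec3 v2, float* t) {
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 p = cross(dir, e2);
    float det = dot(e1, p);
    if (det > -1e-7f && det < 1e-7f) return false;  // ray parallel to triangle
    float inv = 1.0f / det;
    Vec3 s = sub(orig, v0);
    float u = dot(s, p) * inv;                      // first barycentric coordinate
    if (u < 0.0f || u > 1.0f) return false;
    Vec3 q = cross(s, e1);
    float v = dot(dir, q) * inv;                    // second barycentric coordinate
    if (v < 0.0f || u + v > 1.0f) return false;
    *t = dot(e2, q) * inv;                          // distance along the ray
    return *t > 1e-7f;
}

int main() {
    float t;
    // Unit triangle in the z = -2 plane, ray shooting straight down -z.
    bool hit = intersect({0.2f, 0.2f, 0.0f}, {0.0f, 0.0f, -1.0f},
                         {0.0f, 0.0f, -2.0f}, {1.0f, 0.0f, -2.0f},
                         {0.0f, 1.0f, -2.0f}, &t);
    std::printf("hit=%d t=%.2f\n", hit, t); // expect hit=1 t=2.00
    return 0;
}
```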
Vya Domus: Microsoft made very ambiguous claims about RT, and the only thing Sony was explicit about is using ray tracing for "sound". [...]
Why would they?
Most tech companies are fairly reserved when it comes to future products. RTX was a surprise as well; we knew Nvidia was working on it, but nothing else.
In fact, what do we really know about the PS5 (expected in under 12 months)? It'll use AMD. It'll game in 4K and maybe scale to 8K. Anything else?

IMO there's a very good reason for this: console buyers don't give a f... They just want to play games. They don't even care about fps as long as the game looks smooth. And they surely understand ray tracing even less than PC gamers do.
Posted on Reply
#85
Vayra86
notb: In fact, what do we really know about the PS5 (expected in under 12 months)? It'll use AMD. It'll game in 4K and maybe scale to 8K. Anything else? [...] IMO there's a very good reason for this: console buyers don't give a f... They just want to play games. They don't even care about fps as long as the game looks smooth. And they surely understand ray tracing even less than PC gamers do.
And yet, this exists.

Shiny rays will definitely sell, and they would be much more of a selling point for young gamers than 4K or 8K. That resolution depends on the telly mom and dad, or the kid, have in the living room. Or upstairs, which is even less likely.

The last console generation was a real battle over actual specs, and it's the reason the PS4 came out swinging right from the start: it was faster. It played BF at a higher resolution than the X1. Etc.
notb: Sony and MS could develop their own RT chips.
Yes, and I can write my own OS. Now you're just grasping at straws, because neither company has said the slightest thing about even planning to do chip design for either console. That's the whole point of working with AMD: the company can make them a custom SoC that has everything they need.

Navi's RT for consoles will be an afterthought, just like the timing of its announcement.
Posted on Reply
#86
londiste
notb: Sony and MS could develop their own RT chips. It doesn't have to be put on the GPU die.
Not likely. What Nvidia does, and what AMD's patent similarly shows, is that if RT units are to be effective (primarily in terms of timely results), they rely heavily on the same data the shaders use (more cache, etc.). A separate chip for the RT calculations is trivial; feeding the data into that separate chip is not. This remains the case even with the RT chip on the same package, and a separate-chip solution would also introduce area and power inefficiency due to communication.

A completely different solution could be implemented, but given that performance is not sufficient for full RT, and hybrid solutions still need integration into the rasterization pipeline, there is a considerable technical challenge here, both in hardware and in software. Nvidia opened this up with what seems to be a sensible starting solution, the approach is standardized as DXR (notably, a Microsoft thing), and creating a custom alternative for the sake of it does not sound like the best of ideas. At least for Microsoft; Sony consoles lean more toward proprietary APIs.
Posted on Reply
#87
jabbadap
notb: Sony and MS could develop their own RT chips. It doesn't have to be put on the GPU die. [...] They had the time, the money, and the R&D potential to develop an in-house solution (likely based on an FPGA). [...]
Interesting thought - they could even buy a hardware implementation from imgtec... Albeit I'm quite sure that RDNA 2 will include a hardware RT implementation (source), and the upcoming consoles will most likely use that. It will be hybrid RT anyway from now on; consoles won't be doing any sort of full path tracing in the near future.
Posted on Reply
#88
bug
And now we have RTRT coming to Minecraft. That's quite an install base, isn't it?
Plus, blocky graphics + realistic lighting = :love:
Posted on Reply
#89
Vayra86
bug: And now we have RTRT coming to Minecraft. That's quite an install base, isn't it? Plus, blocky graphics + realistic lighting = :love:
Imagine if Nvidia managed to get a foot in the door at Fortnite. But Minecraft is a good start, and it's the right type of game for it too: lots of staring at the same world.
Posted on Reply
#90
bug
Vayra86: Imagine if Nvidia managed to get a foot in the door at Fortnite. But Minecraft is a good start, and it's the right type of game for it too: lots of staring at the same world.
Well, when Unreal Engine picks up RTRT, then we'll be talking install base.
However, support is one thing; doing worthwhile things with it is another. Minecraft looks brilliant to me because it manages to add some realism to a world that's made of blocks. I can't put my finger on it, but the whole thing just works - at least for me. And that's what's going to make RTRT take off, not just the ability to run it at a constant 60 fps.
Posted on Reply