Wednesday, November 14th 2018

Battlefield V with RTX Initial Tests: Performance Halved
Having survived an excruciatingly slow patch update, we are testing "Battlefield V" with DirectX Raytracing (DXR) and NVIDIA RTX enabled across the GeForce RTX 2070, RTX 2080, and RTX 2080 Ti, adding RTX-on test data to our Battlefield V Performance Analysis article. We began testing with a GeForce RTX 2080 Ti graphics card using the GeForce 416.94 WHQL drivers on Windows 10 1809. Our initial test results are shocking: with RTX enabled at the "ultra" setting, frame rates drop by close to 50% at 1080p.
These may look horrifying, given that at its highest setting even an RTX 2080 Ti isn't able to manage 1080p at 120 Hz. But all is not lost: DICE added granularity to RTX. Under the "DXR ray-traced reflections quality" setting you can toggle between off, low, medium, high, and ultra as "degrees" of RTX level of detail. We are currently working on 27 new data-points (each of the RTX 20-series graphics cards, at each level of RTX, at each of the three resolutions we test at).
Update: Our full performance analysis article is live now, including results for RTX 2070, 2080, 2080 Ti, each at RTX off/low/medium/high/ultra.
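For readers who want to sanity-check the roughly-50% figure against their own logs, here is a minimal C++ sketch (not our actual benchmarking tooling) that derives average FPS from per-frame times and computes the relative drop. The frame-time samples are hypothetical and only there for illustration.

// Minimal sketch (not the article's benchmark tooling): derive average FPS
// from logged per-frame times and compute the relative drop with DXR enabled.
#include <iostream>
#include <numeric>
#include <vector>

// Average frames per second from per-frame times in milliseconds.
double averageFps(const std::vector<double>& frameTimesMs)
{
    const double totalMs = std::accumulate(frameTimesMs.begin(), frameTimesMs.end(), 0.0);
    return frameTimesMs.size() * 1000.0 / totalMs;
}

int main()
{
    // Hypothetical frame-time samples (ms), for illustration only.
    const std::vector<double> rtxOff   = {7.1, 6.9, 7.3, 7.0};      // ~141 FPS
    const std::vector<double> rtxUltra = {13.8, 14.2, 14.0, 14.1};  // ~71 FPS

    const double off = averageFps(rtxOff);
    const double on  = averageFps(rtxUltra);
    std::cout << "RTX off:   " << off << " FPS\n";
    std::cout << "RTX ultra: " << on  << " FPS\n";
    std::cout << "Drop:      " << (1.0 - on / off) * 100.0 << " %\n";
    return 0;
}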
180 Comments on Battlefield V with RTX Initial Tests: Performance Halved
Turing isn't that bad a value by itself. Except for the 2080 Ti, which is flagship money, Turing cards offer slightly better perf/$ than Pascal. I know everyone expected the usual thing of getting previous high-end performance for midrange money, but isn't it time to get over that already?
Especially since pretty much everyone doing the constant whining isn't running a Turing card anyway.
/rant
The 2080 Ti is the most advanced; it has RTRT capability.
Everyone knew from day one that DXR would come with a considerable performance hit.
All the threads that are even remotely related to RTX - or even any GPU lately - devolve into this useless crap these days.
Edit:
A larger problem is that DXR is standardized in DX12, and there are very few DX12 games to begin with. The Battlefield series has its own set of problems with DX12, with microstutters and it being slow across the board. Shadow of the Tomb Raider might be more interesting: its DX12 implementation is one of the best (next to Sniper Elite 4 and possibly Hitman), and it is a single-player game where the eye candy could be justifiable.
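For context, the DXR capability discussed above is exposed through the standard D3D12 feature-check API. A minimal sketch of how a DX12 title can probe for it at runtime (not DICE's actual code; it assumes a Windows 10 1809 / Windows SDK 17763 or newer toolchain and linking against d3d12.lib) looks like this:

// Minimal sketch: query whether the device/driver exposes DXR (raytracing tier 1.0+).
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Create a D3D12 device on the default adapter.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device))))
    {
        std::puts("No D3D12 device available.");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))
        && opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
    {
        std::puts("DXR is supported on this device and driver.");
    }
    else
    {
        std::puts("DXR is not exposed; a game would fall back to raster-only effects.");
    }
    return 0;
}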
;)
/s
The GTX 1080 Ti was released at the same MSRP as the GTX 980 Ti, or close to it, and offered some 60% better performance.
The Titan V is like a Quadro, not meant for gaming.
People keep forgetting that the GTX 1080 Ti launched at $699, the same as the GTX 980 Ti, while offering some 60% better performance;
the RTX 2080 Ti only offers a 30% improvement while launching with an MSRP 85% higher than its predecessor's.
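As a rough sanity check of that perf-per-dollar argument, here is a tiny sketch using only the figures quoted above (60% faster at the same price, versus 30% faster at an 85% higher MSRP); it works out to roughly a 30% drop in perf/$ for the 2080 Ti.

// Rough sanity check of the perf/$ argument above, using only the figures
// quoted in the comments (relative performance gain and MSRP change).
#include <cstdio>

// Relative perf/$ change of a new card versus its predecessor,
// given performance gain and price increase as fractions (0.60 = +60%).
double perfPerDollarChange(double perfGain, double priceIncrease)
{
    return (1.0 + perfGain) / (1.0 + priceIncrease) - 1.0;
}

int main()
{
    // GTX 1080 Ti vs GTX 980 Ti: ~60% faster at roughly the same $699 MSRP.
    std::printf("1080 Ti vs 980 Ti:  %+.0f%% perf/$\n",
                perfPerDollarChange(0.60, 0.00) * 100.0);

    // RTX 2080 Ti vs GTX 1080 Ti: ~30% faster at an ~85% higher MSRP (per the comment).
    std::printf("2080 Ti vs 1080 Ti: %+.0f%% perf/$\n",
                perfPerDollarChange(0.30, 0.85) * 100.0);
    return 0;
}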
By the way, what about the future? Say 1-2 years from now the next release, an RTX 3080 Ti, is another +30% better than the RTX 2080 Ti and priced at $2,000, but who cares, it is 30% better, so it is a good product, period, and we should be happy about it? I would be, if I could walk out of the store with that beast for $0.00. :D
Price is a primary concern, despite all the marketing and hype we let ourselves be fooled by. An enthusiast forum is the worst possible place to gauge sentiment with regard to price.
Ironically, an enthusiast forum is also the only place you will find apologists for this horrible business practice NVIDIA is employing with Turing. We're paying a high price for RTX, and if we want to upgrade, we literally have no way to avoid it. And then people here happily say 'look, they're selling cards, it's going well!'... :roll: A typical case of 'in the land of the blind'.
I've said this before, and I will say it again: you have a real option of NOT BUYING something. Skip a gen and you will find RTX far more reasonably priced, either this gen or the next one. That is my main motivation for resisting this, and believe it or not, that sentiment does matter, even if some fools buy it anyway. Another possible advantage is that more effort will be put into optimization and the quality of the experience, because that is another way to increase value.
If you really want RTRT to happen, now is the time to step on the brakes hard and force an adjustment. Turing's implementation is simply not viable.
My takeaway from the performance numbers is that it's very clear RT simply isn't ready for prime time yet. For me, 2K at 60 FPS is the minimum acceptable threshold for PC gaming, and I'd bet the majority of PC gamers would say the same or set an even higher minimum. Looking at the numbers, the only 20-series card that can even stay above 60 FPS at 2K with RT on is the 2080 Ti, and even that is only with RT set to low. ANY other scenario at 2K is below 60 FPS with RT enabled. Basically, this is a repeat of NVIDIA pushing 4K gaming a few years ago with the 980 Ti, when the tech really wasn't there yet, so people spent lots of money only to find out that 4K gaming then meant about 20 FPS, which wasn't practical at all. And at 4K, even with RT on low, the 2080 Ti can't crack 50 FPS. Cool tech, but it's definitely a generation or maybe even two away from being worth considering.
Edit: Not judging, just curious as to what would drive $6k into a system mostly used for driving to and from work.
So the prices are still ridiculous, over $500.
While the RTX effects are impressive, to me it looks more like a tech demo. NVIDIA will have to considerably increase the RT core count (like double or triple it), or the technique has to get optimized a lot, before this becomes a real game changer. Also, as a buyer of an RTX 2070 (which I am not), I would be rather disappointed, since those meager 36 RT cores make DXR almost useless.
Still, I am excited to see what other upcoming games or DXR patches will bring to the table.
(Please ignore this post if all of this has already been discussed before; I did not read all 7 pages before posting.)
www.techpowerup.com/reviews/AMD/R9_295_X2/
And as that was basically CrossFire on a stick, poor or lacking CrossFire support meant the card couldn't shine in every title either.
Just some perspective on the current prices.
Until people decide when too much is too much, prices will continue to rise.
The optimization has to come from making those RT cores better and adding more of them. Just adding more RT cores right now would take up too much die space for not much benefit, like the scaling from the 2070 to the 2080 Ti. They need to be multiple times better, 10x or more, so they won't be bogged down if games use more than one RTX effect.