# Battlefield V with RTX Initial Tests: Performance Halved



## btarunr (Nov 14, 2018)

Having survived an excruciatingly slow patch update, we are testing "Battlefield V" with DirectX Raytracing and NVIDIA RTX enabled across the GeForce RTX 2070, RTX 2080, and RTX 2080 Ti, adding the RTX-on test data to our Battlefield V Performance Analysis article. We began testing with a GeForce RTX 2080 Ti graphics card with GeForce 416.94 WHQL drivers on Windows 10 1809. Our initial test results are shocking: with RTX enabled at the "ultra" setting, frame rates dropped by close to 50% at 1080p.

These results may look horrifying, given that at its highest setting even an RTX 2080 Ti can't manage 1080p at 120 Hz. But all is not lost: DICE added granularity to RTX. You can toggle between off, low, medium, high, and ultra as "degrees" of RTX level of detail, under the "DXR ray-traced reflections quality" setting. We are currently working on 27 new data points (each of the RTX 20-series graphics cards, at each level of RTX, at each of the three resolutions we tested).

*Update*: Our full performance analysis article is live now, including results for RTX 2070, 2080, 2080 Ti, each at RTX off/low/medium/high/ultra.








----------



## INSTG8R (Nov 14, 2018)

PhysX 2.0...


----------



## Vayra86 (Nov 14, 2018)

INSTG8R said:


> PhysX 2.0...



Actually I'm more inclined to say this is Hairworks 2.0. Barely visible improvement for a major performance hit, but don't worry, you can set it to super low quality to get decent performance. PhysX on GPU has almost always been pretty flawless and barely impacted FPS, if at all.


----------



## INSTG8R (Nov 14, 2018)

Vayra86 said:


> Actually I'm more inclined to say this is Hairworks 2.0. Barely visible improvements at major performance hit, but don't worry, you can make it super low quality to have decent performance. PhysX on GPU has almost always been pretty flawless and barely or didn't impact FPS at all.


Eventually. The 2nd generation is where it became viable; it will be the same case here. Out of the gate it killed everything.


----------



## Hellfire (Nov 14, 2018)

Vayra86 said:


> PhysX on GPU has almost always been pretty flawless and barely or didn't impact FPS at all.





Omg, the initial PhysX launch and its performance issues were very noticeable....


----------



## londiste (Nov 14, 2018)

> Our initial test results are shocking. With RTX enabled in the "ultra" setting, frame-rates dropped by close to 50% at 1080p.
> These may look horrifying, given that at its highest setting, even an RTX 2080 Ti isn't able to manage 1080p 120 Hz.


Horrifying?
So you mean it does 80+FPS at 1080p with DXR at highest possible setting?
Assuming the same (less than) 50% hit, it'll do 60+FPS at 1440p, and 40+FPS at UHD 4K.
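The extrapolation above is just a flat percentage hit applied across resolutions; a minimal sketch (the RTX-off baseline frame rates are assumed for illustration, not TechPowerUp measurements):

```python
# Back-of-the-envelope math: apply a flat ~50% DXR hit to assumed
# RTX-off frame rates at each resolution. Baselines are illustrative
# placeholders, not measured data.
baselines = {"1080p": 160, "1440p": 120, "UHD 4K": 80}  # assumed FPS, DXR off
dxr_hit = 0.50  # "close to 50%" drop reported at 1080p

for res, fps_off in baselines.items():
    fps_on = fps_off * (1 - dxr_hit)
    print(f"{res}: {fps_off} FPS off -> ~{fps_on:.0f} FPS with DXR ultra")
```

With these assumed baselines, the same arithmetic reproduces the 80+/60+/40+ FPS figures in the post.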


----------



## btarunr (Nov 14, 2018)

Update: we are experiencing a lot of system instability, including CTDs, screen blackouts (requiring reboots), etc. None of our cards are overclocked beyond whatever factory OC.


----------



## Vayra86 (Nov 14, 2018)

Hellfire said:


> Omg the initial PhysX launch and performance issues was very noticeable....



I guess I clearly missed out on that 

Apparently not just TPU is having issues.

https://www.pcgamesn.com/battlefield-5-ray-tracing-dx12-performance


----------



## INSTG8R (Nov 14, 2018)

Vayra86 said:


> I guess I clearly missed out on that


Well if you remember people used to buy low end cards and used them just for dedicated PhysX


----------



## mnemo_05 (Nov 14, 2018)

b..b-b-bu..but.. but.. muh gigareeeeeeeeeyzzz!!! cool rays bro..

on a serious note, this real-time raytracing shenanigans is pretty much DOA right from the start. ever wonder why no gaming performance was included on the rtx card announcement? now you know..


----------



## INSTG8R (Nov 14, 2018)

btarunr said:


> Update: we are experiencing a lot of system instability, including CTDs, screen blackouts (requiring reboots), etc. None of our cards are overclocked beyond whatever factory OC.


Well that doesn’t bode well as a product showcase.


----------



## Vayra86 (Nov 14, 2018)

INSTG8R said:


> Well if you remember people used to buy low end cards and used them just for dedicated PhysX



Yes but I recalled that as something mostly AMD users did. Not sure why. Anyway...


----------



## btarunr (Nov 14, 2018)

Vayra86 said:


> Apparently not just TPU is having issues.



All my contacts who are awake and testing report tremendous instability. It's almost unplayable and stutters terribly. At some point you're no longer playing a game but watching a PowerPoint presentation.


----------



## Hellfire (Nov 14, 2018)

But yeah, seriously, it doesn't bode well.


----------



## medi01 (Nov 14, 2018)

Well, not far from 120 Hz ain't bad, although I'd expect a resolution higher than 1080p having paid $1.3k+ for a GPU.



Vayra86 said:


> PhysX on GPU has almost always been pretty flawless


In a green galaxy far far away.


----------



## FreedomEclipse (Nov 14, 2018)

INSTG8R said:


> Well that doesn’t bode well as a product showcase.



Well, EA aren't really known for bug-free day-0 releases.. Maybe they were some 10-15 years ago, but not now.... Having an attack boat spawn inland on the Shanghai BF3 map still gives me PTSD. And oohhhh the glitches...

Don't worry... We only need to wait another 8 months or so for Battle Royale mode to calm the pitchfork-wielding crowds. BR will make everyone forget about RTX not working in no time.


----------



## stimpy88 (Nov 14, 2018)

I love to say it...   I told you so!

nGreedia...  The way you're meant to be played! Fools.

Hope you 2080 Ti owners enjoy the 1080p 30 to 60 FPS gaming on your shiny 4K 144Hz monitors!  And what about the experts here saying what value for money the 2070 is?  Yeah, real value, right there!  A 2070 at 15-35 FPS 1080p sounds great for a $600+ “value”.

Can't wait to hear how wrong I am!  nGreedia will be sending the software update out to the green NPC army right about now, so that should make this fun.


----------



## INSTG8R (Nov 14, 2018)

Vayra86 said:


> Yes but I recalled that as something mostly AMD users did. Not sure why. Anyway...


Nah man, we couldn't even do that. I mean, seeing as NV is still using the same ancient CP, you can still select what runs PhysX even now


----------



## Tomgang (Nov 14, 2018)

With PhysX you could at least have a second, slower dedicated card for PhysX so that performance did not drop.

Wonder if that will become a thing with ray tracing as it gets implemented over time. So far I am glad that I did not wait for an RTX card. I'm happy with my GTX 1080 Ti.


----------



## Tsukiyomi91 (Nov 14, 2018)

not a surprise, considering this is partially EA's fault for setting the RT bar too high... on top of being an early public release.


----------



## Stry (Nov 14, 2018)

If it's DXR, is it not the DirectX 12 implementation, and therefore platform-agnostic? Are there drivers pushing nVidia's RT cores to utilize this? I mean, we know AMD can do this, hence Radeon Rays, but DXR is not RTX. I'm sure RTX can do DXR, but if performance is halved, does that mean it's not using the actual hardware for it? Like how nVidia had to emulate asynchronous compute shaders? Correct me if I'm wrong, I didn't design any of this, so it's just my understanding, and if anyone has more information I'd be happy to add it to the pile.


----------



## INSTG8R (Nov 14, 2018)

Tsukiyomi91 said:


> not a surprise, considering this is partially EA's fault for setting the RT bar too high... on top of being an early public release.


Yeah I mean EVERYBODY is an early adopter at this point including DICE.


----------



## PerfectWave (Nov 14, 2018)

the most failure in recent years LUL!


----------



## Prima.Vera (Nov 14, 2018)

1300 €  card to play games in 5-->50fps @ 1080p... Awesome fking job nGreedia.


----------



## L33t (Nov 14, 2018)

@btarunr

Please check if Explicit Multi-GPU is supported and, if so, whether RTX will work.

BF1 had Explicit Multi-GPU added in the CTE update, and it worked better than SLI/DX11.


----------



## trog100 (Nov 14, 2018)

this will put a few smiles on the faces of those who opted for a new 1080ti or those who already own one.. he he

trog


----------



## LightningJR (Nov 14, 2018)

I won't knock Nvidia for some brand-new tech not being perfect. There's a lot to hate about Nvidia, but new tech is not one of them. Push the envelope, Nvidia; never stop.

Please lower your prices............ we can't all afford $1k GPUs.

and all the other BS you do........


----------



## btarunr (Nov 14, 2018)

Added a teaser for our data. We're writing a new review-article on RTX performance impact. Nightmare.


----------



## Ravenmaster (Nov 14, 2018)

stimpy88 said:


> I love to say it...   I told you so!
> 
> nGreedia...  The way you're meant to be played! Fools.
> 
> ...



No one is denying RTX is bad but what does AMD have? A 2080ti owner switches off this RTX crap and suddenly it’s back up to 144fps+ @ 1440p or 60-100fps @ 4K on ultra. AMD can’t get anywhere near that right now, which is where the real problem lies. Because if AMD could match that performance then nGreedia would have to lower their prices to something more reasonable.


----------



## rgravine (Nov 14, 2018)

How do I enable RTX? I have the latest Nvidia drivers, Windows update, and Battlefield V update, and I'm running 2 2080s with NVLink. I have restarted several times and do not see the options shown in your screenshots.


----------



## M2B (Nov 14, 2018)

Just look at these butthurt AMD fanboys.
I thought TechPowerUp community was better than this, guess I was wrong.

The current hardware is just way too slow for RTRT + we obviously have some terrible implementation here.
Ray Tracing is not gonna be a game changer anytime soon.


----------



## L33t (Nov 14, 2018)

rgravine said:


> How do I enable RTX as I have the latest Nvidia Drivers, Windows Update, and Battlefield V Update. Running 2 2080s in with NVLINK. I Have restarted several times and do not see the options like your screenshots.



Try disabling SLI. I have a feeling RTX is not working with SLI, especially since DX12 never worked with SLI and DX12 is a requirement for RTX.

Dice will have to add support for DX12 Explicit Multi-GPU..


----------



## rgravine (Nov 14, 2018)

L33t said:


> Try disabling SLI. I have a feeling RTX is not working with SLI, especially since DX12 never worked with SLI in this game, and is a requirement for RTX.
> 
> Dice will have to add support for DX12 Explicit Multi-GPU..


I have disabled it and still no DXR in BFV options. I'm sure I'm missing something somewhere.


----------



## Ravenmaster (Nov 14, 2018)

PerfectWave said:


> the most failure in recent years LUL!



Lol dude you sound like Ralph Wiggum. 






I think you meant 'The *biggest* failure in recent years.'


----------



## [XC] Oj101 (Nov 14, 2018)

The negativity surrounding RTX is unbelievable. Technologically, this is probably the biggest step up we’ve had in the history of the GeForce cards.

When the Voodoo was released was there widespread support for Glide? No. When the GeForce 256 was released was there widespread support for hardware T&L? No. When the GeForce 3 came out was there widespread support for programmable shaders? No. Why do you expect any different from this?

So far we do not have any games using an engine built from the ground up for ray tracing support. Ray tracing has been patched onto rasterization engines which were never designed for ray tracing.

Software needs to catch up to the hardware. Look at any launch title on console and compare the graphics to something a few years down the line. The hardware didn’t change, but the software caught up.

A friend of mine gave me a brilliant analogy last night. When a baby takes its first steps, do the parents say “oh that's crap, he's so slow and unstable”, or are they blown away that their little one is making such good progress?

The same will be true of ray tracing. For the first time we are looking at image generation in game differently and progress can only go one way. Wait until we have engines written with ray tracing in mind from the get go.

If you wanted all-wheel ABS in 1978, Mercedes offered it for around $32,000 BACK THEN. That was your only option. Now it's found on almost every entry-level car regardless of price. Give ray tracing a bit of time to mature. There will never be a “right time” for the initial release, as without the initial release there will be no further progress.

Appreciate the technology for what it is and the revolutionary (as opposed to evolutionary) change it can bring.

So what if the RTX 2080 Ti gets 400 FPS instead of 180 FPS using traditional render methods? Both are beyond the level of perception of the human eye so it becomes an arbitrary figure. What we need is a way to drastically increase image quality without the performance hit we would have had prior to the RTX cards, which is now a reality. Don’t base the small increase in quality on a badly patched rasterization engine.

We now have the computational power that until not very long ago required a render farm packaged into a single GPU with a price tag that high end enthusiasts can afford. Show me one other graphics card that offers that?


----------



## londiste (Nov 14, 2018)

rgravine said:


> I have disabled it and still no DXR in BFV options. I'm sure I'm missing something somewhere.


Are you running BF5 with DX12?


----------



## rgravine (Nov 14, 2018)

londiste said:


> Are you running BF5 with DX12?


Yes


----------



## Valantar (Nov 14, 2018)

@btarunr An important question here: what are core clocks like while having RTX enabled? Given that the cards all max out their power budgets running standard rendering loads, I'm very interested in seeing just how power hungry the RT cores are and how much of the performance drop is due to this. If the RT cores pull 100W, it's no wonder that the performance drops dramatically ...


----------



## HD64G (Nov 14, 2018)

Based on the test results for 1080p Ultra in BF5, RTX reduces FPS to about 40% of the non-RTX performance. We are talking about a 60% reduction there. And until it gets stable and playable we will probably need many patches of the game and updates of the nVidia drivers.
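The relationship between "FPS drops to 40% of baseline" and "a 60% reduction" is just a ratio; a minimal sketch (the FPS figures below are hypothetical examples, not measured results):

```python
def rtx_reduction(fps_off: float, fps_on: float) -> float:
    """Fractional FPS reduction from enabling RTX (0.6 means a 60% drop)."""
    return 1.0 - fps_on / fps_off

# Hypothetical example: 140 FPS without RTX, 56 FPS with it.
# 56/140 = 40% of baseline, i.e. a 60% reduction.
print(f"{rtx_reduction(140, 56):.0%}")  # -> 60%
```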


----------



## rgravine (Nov 14, 2018)

HD64G said:


> Based on the test results for 1080p Ultra in BF5, RTX reduces FPS to about 40% of the non-RTX performance. We are talking about a 60% reduction there. And until it gets stable and playable we will probably need many patches of the game and updates of the nVidia drivers.


Why am I not able to see the DXR and RTX options in Battlefield? I have all the latest updates and 2 2080s


----------



## Hellfire (Nov 14, 2018)

rgravine said:


> Why am I not able to see the DXR and RTX options in Battlefield? I have all the latest updates and 2 2080s



Rather than hijacking this topic, would it not be better to create your own thread to get specific support?


----------



## HD64G (Nov 14, 2018)

rgravine said:


> Why am I not able to see the DXR and RTX options in Battlefield? I have all the latest updates and 2 2080s


I would verify the game files just to be sure.


----------



## rgravine (Nov 14, 2018)

Hellfire said:


> Rather than hijacking this topic? would it not be better to create your own to get specific support?


This is related to RTX, is it not?


----------



## INSTG8R (Nov 14, 2018)

[XC] Oj101 said:


> The negativity surrounding RTX is unbelievable. Technologically, this is probably the biggest step up we’ve had in the history of the GeForce cards.
> 
> When the Voodoo was released was there widespread support for Glide? No. When the GeForce 256 was released was there widespread support for hardware T&L? No. When the GeForce 3 came out was there widespread support for programmable shaders? No. Why do you expect any different from this?
> 
> ...


While I agree with your “evolution” point, this is not a software thing but a hardware thing. RTRT until now, in the professional setting, has been done by render farms because of the sheer complexity. This is the first time it's in the consumer space, and it's the bare minimum of RT coupled with the bare minimum of hardware to get it done. While the technique can be refined, it still needs the hardware to leverage it. We can clearly see that we are not there yet. I would expect it to take at least a generation of hardware before it becomes viable for the masses.


----------



## Hellfire (Nov 14, 2018)

rgravine said:


> This is related to RTX is it not?



But it's a discussion topic in the news section; the Support section is that way.... ---->


----------



## mouacyk (Nov 14, 2018)

haha


----------



## ShurikN (Nov 14, 2018)

[XC] Oj101 said:


> Technologically, this is probably the biggest step up we’ve had in the history of the GeForce cards.


Unplayable stuttering and 50% FPS drop on a $1200 card. Truly a technological marvel.


----------



## SIGSEGV (Nov 14, 2018)

haha....
oh my oh my..
oh look, it turns out to be very optimized for nvidia rtx lol

/pathetic


----------



## FreedomEclipse (Nov 14, 2018)

trog100 said:


> this will put a few smiles on the faces of those who opted for a new 1080ti or those who already own one.. he he
> 
> trog



you mean it's time to void your 2080 Ti purchase by switching back to your 1080 Ti, or switching RTX off??


----------



## Aldain (Nov 14, 2018)

[XC] Oj101 said:


> The negativity surrounding RTX is unbelievable. Technologically, this is probably the biggest step up we’ve had in the history of the GeForce cards.
> 
> When the Voodoo was released was there widespread support for Glide? No. When the GeForce 256 was released was there widespread support for hardware T&L? No. When the GeForce 3 came out was there widespread support for programmable shaders? No. Why do you expect any different from this?
> 
> ...



So much f*** damage control in one post, it is unbelievable. Dude, give it a rest; ray tracing is shat in BF5, and Turing is nothing more than a half-baked beta-testing arch with ONE ELEMENT of ray tracing.. Calling Turing a revolution is an insult to the entire PC gaming and tech spectrum..


----------



## ikeke (Nov 14, 2018)

btarunr said:


> Added a teaser for our data. We're writing a new review-article on RTX performance impact. Nightmare.



Could the RT cores be memory-limited? They seem to be, judging from the 1080p teaser image.


----------



## Shatun_Bear (Nov 14, 2018)

What did the Tom's Hardware editor say again? 

'Just buy one, why wait to experience ray tracing when you can have it now before all your friends'. Presumably he was talking about barely noticeable ray tracing 1080p gaming at 30 fps....


----------



## [XC] Oj101 (Nov 14, 2018)

ShurikN said:


> Unplayable stuttering and 50% FPS drop on a $1200 card. Truly a technological marvel.



Maybe you should read my full post before replying, such as:



> So far we do not have any games using an engine built from the ground up for ray tracing support. Ray tracing has been patched onto rasterization engines which were never designed for ray tracing.





> Wait until we have engines written with ray tracing in mind from the get go.





> Don’t base the small increase in quality on a badly patched rasterization engine.





Aldain said:


> So much F*** damage control in one post it is unbelievable. Dude give it a rest , ray tracing is shat in bf5 and Turing is nothing more than a half baked beta testing arch with ONE ELEMENT ray tracing.. Calling Turing a revolution is an insult to the entire PC gaming and tech spectrum..



You too are judging an entire technology based on one game which was not designed for ray tracing.

Calling RTX anything less than revolutionary is an insult to every engineer who worked on it. Just because you don't understand it or we don't have the software available to take full advantage of it doesn't take anything away from what it is.


----------



## Hellfire (Nov 14, 2018)

Shatun_Bear said:


> What did the Tom's Hardware editor say again?
> 
> 'Just buy one, why wait to experience ray tracing when you can have it now before all your friends'. Presumably he was talking about barely noticeable ray tracing 1080p gaming at 30 fps....



People still read the trash from Toms Hardware?


----------



## RH92 (Nov 14, 2018)

[XC] Oj101 said:


> The negativity surrounding RTX is unbelievable. Technologically, this is probably the biggest step up we’ve had in the history of the GeForce cards.



I would like to be surprised by the negativity surrounding RTX, but the truth is 99% of those people are simply ignorant of what this technology represents, or simply find whatever reason to hate on Nvidia because it's trending, so yeah, I'm not that surprised after all!

Real-time ray tracing is so computationally expensive that anyone with half a brain cell will understand that the mere fact Nvidia has managed to put the computational power of a render farm into a single GPU is already a HUGE achievement!

Anyone raising the graphical quality in a given game won't be surprised to see their FPS drop, so why are people surprised when this happens with RTX, considering we are talking about a massive leap forward in graphical fidelity? This is a revolution, and as with every revolution you have to give things time to settle down and mature. Things will only get better!


----------



## londiste (Nov 14, 2018)

RH92 said:


> Anyone raising the graphical quality in a given game won't be surprised to see their FPS drop, so why are people surprised when this happens with RTX, considering we are talking about a massive leap forward in graphical fidelity?


@btarunr, @W1zzard, please include screenshots of different DXR settings highlighting (or at least trying to highlight) differences in the eventual article about DXR and its performance.


----------



## INSTG8R (Nov 14, 2018)

RH92 said:


> I would like to be surprised by the negativity surrounding RTX, but the truth is 99% of those people are simply ignorant of what this technology represents, or simply find whatever reason to hate on Nvidia because it's trending, so yeah, I'm not that surprised after all!
> 
> Real-time ray tracing is so computationally expensive that anyone with half a brain cell will understand that the mere fact Nvidia has managed to put the computational power of a render farm into a single GPU is already a HUGE achievement!
> 
> Anyone raising the graphical quality in a given game won't be surprised to see their FPS drop, so why are people surprised when this happens with RTX, considering we are talking about a massive leap forward in graphical fidelity? This is a revolution, and as with every revolution you have to give things time to settle down and mature. Things will only get better!


When it turns a 4K-capable card into a barely-1080p card to use it, it's not really impressive, and it shows it's not ready for prime time.


----------



## Vya Domus (Nov 14, 2018)

[XC] Oj101 said:


> Calling RTX anything less than revolutionary is an insult to every engineer who worked on it. Just because you don't understand it or we don't have the software available to take full advantage of it doesn't take anything away from what it is.



Calling the work their engineers have put into this useless would indeed be an insult.

Calling out the uselessness of the technology in our current context, however, would be a perfectly reasonable thing to do.


----------



## Hellfire (Nov 14, 2018)

Jesus, with how some of y'all are reacting you'd think the US was seceding from the UK again.


----------



## Frick (Nov 14, 2018)

I thought it was understood RTX would make for horrible performance?


----------



## RH92 (Nov 14, 2018)

londiste said:


> @btarunr, @W1zzard, please include screenshots of different DXR settings highlighting (or at least trying to highlight) differences in the eventual article about DXR and its performance.



I'm talking about RTX as a technology. Showing screenshots of BF5 will only tell you how well (or badly) DICE implemented this technology in their game.


----------



## [XC] Oj101 (Nov 14, 2018)

INSTG8R said:


> When it turns a 4K-capable card into a barely-1080p card to use it, it's not really impressive, and it shows it's not ready for prime time.





> There will never be a “right time” for the initial release as without the initial release there will be no further progress.





Vya Domus said:


> Calling out the uselessness of the technology in our current context would be however a perfectly reasonable thing to do.



There is no current context as we do not have a SINGLE native ray tracing engine.

I really feel nobody replying read my *full* post with the way I'm repeating myself here.


----------



## Aldain (Nov 14, 2018)

[XC] Oj101 said:


> Maybe you should read my full post before replying, such as:
> 
> 
> 
> ...



First off, Ray Tracing is not an NVIDIA THING; ray tracing has been around for years now. Second, Turing as an ARCH is nothing more than a beta test, a very expensive beta test, if you can get it not to DIE after a week or two, depending on your luck. Thirdly, stop calling it RAY TRACING; it is ONE ELEMENT of it, which BTW was DOWNGRADED by DICE to get this SHAT performance in the first place. A DOWNGRADED ONE ELEMENT of ray tracing turned a 4K/60 card into a 1080p/60 one.. (The 2080 and 2070 are DEAD FOR RAY TRACING.) Enough with the apologetic behavior for Turing; it is what it is... Shat!


----------



## londiste (Nov 14, 2018)

Aldain said:


> which BTW was DOWNGRADED by DICE to get this SHAT performance in the first place. A DOWNGRADED ONE ELEMENT of ray tracing turned a 4K/60 card into a 1080p/60 one.. (The 2080 and 2070 are DEAD FOR RAY TRACING.) Enough with the apologetic behavior for Turing; it is what it is... Shat!


Downgraded? Their tech demo ran reflections at 4 times the resolution of the previous rasterized implementation.


----------



## INSTG8R (Nov 14, 2018)

[XC] Oj101 said:


> There is no current context as we do not have a SINGLE native ray tracing engine.
> 
> I really feel nobody replying read my *full* post with the way I'm repeating myself here.


But you’re talking in what-ifs and possibles. We are seeing it in action, and it wasn’t like any of the previous showcases/demos were any different. The sacrifice in performance is far greater than the benefits right now. When the most powerful solution available can’t handle the simplest form of RT, we can’t really call it a success, can we?


----------



## FreedomEclipse (Nov 14, 2018)

Aldain said:


> Calling Turing a revolution is an insult to the entire PC gaming and tech spectrum..



Yes and no.... To have a revolution you have to start one, but revolutions can be crushed.

You can't snap your fingers and instantly have everything the way you want it to be. Change within the industry takes time, and that has been proven many times by developers who have not even bothered making new games for newer DX versions....

You must have a person or a group that desires change and pushes the initiative. Otherwise what's the point of technology if it's not moving forward?? We would still be going about our business on horse-drawn carts, with people running up and down streets delivering telegrams, instead of being where we are today.

But once the movement gains traction and the industry starts to change, only then will we see the benefits of such technology....

And it's not as if Nvidia is *forcing* anyone to buy an RTX card; you have a choice. You can either support them by buying one, or grab one of their older cards and sit on your hands till the next generation is ready and the industry has moved to adopt RTX features further.

I opted for a pre-owned 1080 Ti. Nvidia aren't going to send an elite hit squad after me because of the decision I made. I think the price of the 2080 and anything above it is ludicrous, because Nvidia are trying to sell people a feature that doesn't actually exist yet, and even if it does exist in what little form it may take, it runs terribly.

Nvidia can put all its money behind RTX and try to get the industry to adopt it, but if the industry says no, there's not a lot they can do to influence change beyond maybe buying or creating their own studios.

Nobody can predict if RTX will be a big thing in 5 years' time. But there is no smoke without fire...

so shut up, sit down, strap in and put your helmet on.


----------



## Vya Domus (Nov 14, 2018)

Ravenmaster said:


> AMD can’t get anywhere near that right now, which is where the real problem lies.



Nice way of trying to turn this into an AMD issue.

Everyone grab your pitchforks; it's because of that damn AMD that you get to enjoy a cinematic sub-60 FPS experience.



[XC] Oj101 said:


> There is no current context as we do not have a SINGLE native ray tracing engine.



Poor performance doesn't need much context. It's poor performance.


----------



## Frick (Nov 14, 2018)

Hellfire said:


> Jesus, with how some of y'all are reacting you'd think the US was seceding from the UK again.



THEY DID WHAT


----------



## btarunr (Nov 14, 2018)

Close to finishing the article. Much of the instability is gone now. When it does work, RTX is magic. Look at the screenshot we added.


----------



## londiste (Nov 14, 2018)

btarunr said:


> Much of the instability is gone now.


Any thoughts on what caused the instability?
Not that Battlefield V is the most stable thing in DX12 anyway.


----------



## [XC] Oj101 (Nov 14, 2018)

Aldain said:


> First off, Ray Tracing is not an NVIDIA THING; ray tracing has been around for years now. Second, Turing as an ARCH is nothing more than a beta test, a very expensive beta test, if you can get it not to DIE after a week or two, depending on your luck. Thirdly, stop calling it RAY TRACING; it is ONE ELEMENT of it, which BTW was DOWNGRADED by DICE to get this SHAT performance in the first place. A DOWNGRADED ONE ELEMENT of ray tracing turned a 4K/60 card into a 1080p/60 one.. (The 2080 and 2070 are DEAD FOR RAY TRACING.) Enough with the apologetic behavior for Turing; it is what it is... Shat!



I never said ray tracing is an NVIDIA thing. I've been using it for more than a decade, but not with gaming. Real time ray tracing as presented by RTX is very much an NVIDIA thing, however.

Once again, we don't have an engine written with native support for ray tracing, so stop calling it ---- when we don't even have a semi-decent way of judging it. As a technology it is absolutely incredible; we just need the implementation.


----------



## Vya Domus (Nov 14, 2018)

btarunr said:


> When it does work, RTX is magic. Look at the screenshot we added.



To be honest, had I not known that it is ray traced, I would have believed it's a simple screen-space effect.


----------



## RH92 (Nov 14, 2018)

INSTG8R said:


> When it turns a 4K capable card into a barely 1080p card to use it. It’s not really impressive and shows it’s not ready for prime time.



First of all, to draw this conclusion you have to compare the graphical quality of 1080p with RTX enabled against 4K without RTX. As far as I know we don't have the material to make such a comparison, so you can't draw such a conclusion. Don't get me wrong: maybe in some early games the difference won't be worth the hassle, maybe in others it totally will, but as time goes on, real-time ray tracing will surely show superior results to rasterized rendering, since the technology is simply superior.

You say that it is not ready for prime time, but how do you make it ready for prime time if nobody owns such a product? Do you think game developers will develop technology for a phantom product?
The answer is obvious!


----------



## PerfectWave (Nov 14, 2018)

so playing an fps game with a 4k monitor at 1080p at 40-60fps LUL


----------



## R0H1T (Nov 14, 2018)

Shatun_Bear said:


> What did the *Tom's Hardware* editor say again?
> 
> 'Just buy one, why wait to experience ray tracing when you can have it now before all your friends'. Presumably he was talking about barely noticeable ray tracing 1080p gaming at 30 fps....


He also said something like ~ *whether you'd prefer last moments (of life) flashing before your eyes in RT RT or not *
I mean seriously he must've been zombified at that moment


----------



## stimpy88 (Nov 14, 2018)

[XC] Oj101 said:


> The negativity surrounding RTX is unbelievable. Technologically, this is probably the biggest step up we’ve had in the history of the GeForce cards.
> 
> When the Voodoo was released was there widespread support for Glide? No. When the GeForce 256 was released was there widespread support for hardware T&L? No. When the GeForce 3 came out was there widespread support for programmable shaders? No. Why do you expect any different from this?
> 
> ...


Wow, you really like to write essays, don't you!  Or is it a copy and paste from the nGreedia NPC software update?

Maybe you should work on trying to comprehend for yourself why people don’t like this situation before you write another marketing rant?


----------



## Aldain (Nov 14, 2018)

londiste said:


> Downgraded? Their tech demo ran reflections at 4 times the resolution of the previous rasterized implementation.




yes downgraded from the demo

https://www.tomshardware.com/news/battlefield-v-ray-tracing,37732.html

*Battlefield V Creators: We Toned Down Ray Tracing for Performance, Realism*


----------



## btarunr (Nov 14, 2018)

ikeke said:


> Could the RT cores be memory limited? They seem to be, from the 1080p teaser image..



Don't know about that yet. The teaser graph contains data from the RTX 2080 Ti, which already has the highest bandwidth on the market. After all this is over, we'll see how memory OC affects this.


----------



## moproblems99 (Nov 14, 2018)

Forget the performance, I don't even like how it looks in many areas.  Anyone with half a brain saw this coming, because it was reported that it killed FPS, so no surprise.  RTX is still some time away, but at least that $1300 bought some 30% performance increase without the RTX that will have to be disabled, so it wasn't a complete waste of people's money.


----------



## PerfectWave (Nov 14, 2018)

"The more you buy, the more you save "......


----------



## Gasaraki (Nov 14, 2018)

I hate these "patches" that add technology to a game. If the game was not designed with this built in, the performance is not going to be good. Just look at all the games that have DX12 as a "patch" vs. games that are developed with DX12 natively.


----------



## INSTG8R (Nov 14, 2018)

RH92 said:


> First of all, to make this conclusion you have to compare the graphical quality of 1080p with RTX enabled and 4K without RTX enabled. As far as I know we don't have such a comparison, so you can't make such a conclusion. Don't get me wrong, maybe in some early games the difference won't be worth the hassle, maybe in others it totally will, but as time goes on, real-time ray tracing will surely show superior results to rasterized rendering, since the technology is simply superior.
> 
> You say that it is not ready for prime time, but how do you make it ready for prime time if nobody owns such a product? Do you think game developers will develop technology for a phantom product?
> The answer is obvious!


Well, “regular” BF5 performance numbers are readily available, so I'm not sure why you're trying to split hairs. The 2080 Ti gets 83 FPS at 4K Ultra vs. the initial numbers of 65 FPS at 1080p Ultra, the only graphical difference being the addition of RT. How can I not form a conclusion from that? Do you not think the DICE dev team are all using RTX products for dev work? This is as good as it can be in its current form with the hardware available.
Let's be clear: I'm not hating on RT, I'm saying it's not ready for the consumer space with current hardware. Like all new tech, it takes at least a generation before it becomes readily accessible.


----------



## stimpy88 (Nov 14, 2018)

[XC] Oj101 said:


> Maybe you should read my full post before replying, such as:
> 
> 
> 
> ...


So we wait 3 years, or maybe 5, for a completely new game engine to be built from the ground up, then for the game itself to be created using this engine?  Yeah, the nGreedia is strong with this one.

Let’s face it, nGreedia knew this was going to happen, and made sure that there was no RTX content for just long enough to get a few tens of thousands of very expensive cards out there to people with more money than sense.  They must have been rolling around on their office floors pissing themselves with laughter while they were watching the sales figures counting up minute by minute.

And to help your comprehension, nobody has ever said that RTX technology is a bad idea.  It’s the way nGreedia have gone about introducing it that has caused all this hatred.  You see, people are not all totally stupid, some of us know when we are being manipulated and lied to.


----------



## btarunr (Nov 14, 2018)

RTX 2080 and 2070 data is a bloodbath. They post even bigger %losses than 2080 Ti. ETA on our review is ~1 hr.


----------



## [XC] Oj101 (Nov 14, 2018)

stimpy88 said:


> So we wait 3 years, or maybe 5, for a completely new game engine to be built from the ground up, then for the game itself to be created using this engine?  Yeah, the nGreedia is strong with this one.
> 
> Let’s face it, nGreedia knew this was going to happen, and made sure that there was no RTX content for just long enough to get a few tens of thousands of very expensive cards out there to people with more money than sense.



Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.



stimpy88 said:


> And to help your comprehension, nobody has ever said that RTX technology is a bad idea.  It’s the way nGreedia have gone about introducing it that has caused all this hatred.  You see, people are not all totally stupid, some of us know when we are being manipulated and lied to.



So how exactly did NVIDIA introduce this that was so wrong in your opinion?


----------



## Aldain (Nov 14, 2018)

[XC] Oj101 said:


> Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.
> 
> 
> 
> So how exactly did NVIDIA introduce this that was so wrong in your opinion?



"It just works" a blatant marketing HUANG lvl of LIE


----------



## ikeke (Nov 14, 2018)

btarunr said:


> RTX 2080 and 2070 data is a bloodbath. They post even bigger %losses than 2080 Ti. ETA on our review is ~1 hr.


That seems to suggest there's a memory bottleneck....


----------



## Bytales (Nov 14, 2018)

stimpy88 said:


> I love to say it...   I told you so!
> 
> nGreedia...  The way you're meant to be played! Fools.
> 
> ...



Yep, we are still 1 to 2 generations away from a ray tracing implementation across the whole range of cards doing 1080p@60 with RT enabled even on the lowest-end cards. AMD knows this; that's why they said they're going to take care of ray tracing only when all their cards support it.

And this is not everything, since you can increase or decrease the number of rays that are to be processed. So even later on, with 5 times the performance we have now, put in enough rays and performance will still be 30 fps.

Question is, how many rays are enough? Perhaps an adaptive algorithm that raises and lowers the ray count to maintain a target frame rate would be in order? A lower-end card would use fewer rays, and a higher-end card more rays, for the same performance level. I would rather have something smart and done right like this than the shitshow we have now. 1400-euro cards doing 30 fps at 1080p is pretty much outrageous. You can get an Xbox, a PS4, a Nintendo Switch, and maybe have leftovers for a laptop for that money.
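Something like this per-frame feedback loop, purely as a sketch (the names are made up, there is no real engine API like this):

```python
def adjust_ray_budget(rays_per_pixel, frame_ms, target_ms,
                      min_rays=0.25, max_rays=4.0, step=0.9):
    """Nudge the ray budget toward whatever the target frame time allows."""
    if frame_ms > target_ms * 1.05:    # frame too slow: shed rays
        rays_per_pixel *= step
    elif frame_ms < target_ms * 0.90:  # clear headroom: spend it on more rays
        rays_per_pixel /= step
    return max(min_rays, min(max_rays, rays_per_pixel))

# A slow frame (22 ms against a 60 fps / 16.7 ms target) lowers the budget:
budget = adjust_ray_budget(1.0, frame_ms=22.0, target_ms=16.7)
```

Run every frame, a slow card settles near the low clamp and a fast card near the high clamp, so both hold the same frame rate at different ray counts. That's the same idea as the dynamic-resolution scaling consoles already use, just applied to rays instead of pixels.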


----------



## stimpy88 (Nov 14, 2018)

[XC] Oj101 said:


> Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.
> 
> 
> 
> So how exactly did NVIDIA introduce this that was so wrong in your opinion?


I stopped spoon feeding my daughter when she was 2, tell me why I should be spoon feeding you?

Stop making arguments up.  I never said anything of the sort.  You really are not good with this, you really need to connect to the nGreedia server, and get your software updated...  You said that we have to wait for game devs to write an engine that supports RTX from the ground up.  I told you the timeframe that would happen in.

I guess you have AMD and Trump to bring up next, followed by a third course of white male privilege and homophobia baiting?


----------



## [XC] Oj101 (Nov 14, 2018)

stimpy88 said:


> I stopped spoon feeding my daughter when she was 2, tell me why I should be spoon feeding you?
> 
> Stop making arguments up.  I never said anything of the sort.  You really are not good with this, you really need to connect to the nGreedia server, and get your software updated...  You said that we have to wait for game devs to write an engine that supports RTX from the ground up.  I told you the timeframe that would happen in.
> 
> I guess you have AMD and Trump to bring up next, followed by a third course of white male privilege and homophobia baiting?





> and made sure that there was no RTX content for just long enough to get a few tens of thousands of very expensive cards out there to people with more money than sense.



So then what *exactly* are you saying here?


----------



## Pruny (Nov 14, 2018)

Bytales said:


> Yep, we are still 1 to 2 generations away from a ray tracing implementation across the whole range of cards doing 1080p@60 with RT enabled even on the lowest-end cards. AMD knows this; that's why they said they're going to take care of ray tracing only when all their cards support it.
> 
> And this is not everything, since you can increase or decrease the number of rays that are to be processed. So even later on, with 5 times the performance we have now, put in enough rays and performance will still be 30 fps.
> 
> Question is, how many rays are enough? Perhaps an adaptive algorithm that raises and lowers the ray count to maintain a target frame rate would be in order? A lower-end card would use fewer rays, and a higher-end card more rays, for the same performance level. I would rather have something smart and done right like this than the shitshow we have now. 1400-euro cards doing 30 fps at 1080p is pretty much outrageous. You can get an Xbox, a PS4, a Nintendo Switch, and maybe have leftovers for a laptop for that money.


For me 0 rays are enough, make separate cards for rtx only tensors.


----------



## pky (Nov 14, 2018)

btarunr said:


> RTX 2080 and 2070 data is a bloodbath. They post even bigger %losses than 2080 Ti. ETA on our review is ~1 hr.


I was thinking... if you're using Ray Tracing, do the "regular" shadows/lighting/reflections settings affect visual quality, and do they affect FPS? If they don't affect quality (considering RT takes care of that), but they do affect FPS - maybe lowering them/turning them off would boost FPS without quality loss?


----------



## Nioktefe (Nov 14, 2018)

ikeke said:


> That seems to suggest there's a memory bottleneck....


The RTX 2070 has more bandwidth per TFLOP than the RTX 2080 Ti.

It would be interesting to see clocks and power consumption along with performance
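A rough back-of-the-envelope with NVIDIA's published reference specs (FP32 at reference boost clock; real cards boost higher, so treat the numbers as illustrative only):

```python
cards = {
    # name: (shader cores, reference boost clock MHz, memory bandwidth GB/s)
    "RTX 2070":    (2304, 1620, 448),
    "RTX 2080 Ti": (4352, 1545, 616),
}

for name, (shaders, mhz, bw) in cards.items():
    # 2 FLOPs per core per clock (fused multiply-add)
    tflops = shaders * 2 * mhz * 1e6 / 1e12
    print(f"{name}: {tflops:.1f} TFLOPS, {bw / tflops:.1f} GB/s per TFLOP")
```

The 2070's 256-bit bus feeds far fewer shaders than the 2080 Ti's 352-bit bus does, so its bandwidth-to-compute ratio comes out well ahead, which is why a pure memory bottleneck would be expected to hit the 2080 Ti harder, not the smaller cards.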


----------



## INSTG8R (Nov 14, 2018)

[XC] Oj101 said:


> Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.
> 
> 
> 
> So how exactly did NVIDIA introduce this that was so wrong in your opinion?





[XC] Oj101 said:


> So then what *exactly* are you saying here?


Well, you're the one trying to justify it in its current state despite the fact that it's not the amazing miracle tech you continue to claim it is. What part of "it's not ready" do you need explained to you again? The numbers don't lie, and it's only going to get worse with the 2080 and 2070. How are you going to spin those numbers favourably?
Bottom line: the hardware DOESN'T exist.


----------



## londiste (Nov 14, 2018)

https://www.hardwareluxx.de/index.p...benchmarks-battlefield-v-mit-dxr-im-test.html


----------



## AMX85 (Nov 14, 2018)

Looking at performance across all the settings, it looks like a GPU engine bug, maybe it isn't using all the cores or something. Drivers and game patches will fix it, but I still expect a performance loss, maybe 20-30%.


greetings


----------



## GFalsella (Nov 14, 2018)

Vayra86 said:


> Actually I'm more inclined to say this is Hairworks 2.0. Barely visible improvements at major performance hit, but don't worry, you can make it super low quality to have decent performance. PhysX on GPU has almost always been pretty flawless and barely or didn't impact FPS at all.


Are you stupid, legally blind or just trolling? I get all the skepticism towards ray tracing, I think it's way too early for it too, but comparing a totally game-changing tech with Hairworks and saying it's barely visible is not a statement one can take seriously. This is more like 2D->3D than Hairworks, but OK, I guess hating is easier. (Don't get me wrong, it's right to hate on Nvidia given their shitty business practices, but RTX != Nvidia.)


----------



## kastriot (Nov 14, 2018)

How about performance @720p? It should be OK, I guess, and next-gen cards will do 1080p, and after that 3rd-gen cards will do 1440p, and then..


----------



## Steevo (Nov 14, 2018)

Why didn't you buy two cards? Seriously, did you peons expect this to work with just one? If you can afford one $1300 card that means you can afford 2, and what's money anyway when it comes to having the best of the best so your epeen is huge....


----------



## BigMack70 (Nov 14, 2018)

I feel bad for anyone who got suckered by Nvidia's marketing and bought 20 series specifically for ray tracing. 

Only one card in the 20 series - the 2080 Ti - has a real reason to be purchased, and that is specifically for maximum possible 4k rasterized gaming performance. All the 20 series cards are (and always were going to be) hot garbage for ray tracing.  

If you want ray tracing, wait 1-3 years for 2nd and 3rd generation GPUs designed for it.


----------



## SIGSEGV (Nov 14, 2018)

PerfectWave said:


> "The more you buy, the more you save "......


quote of the day. 

it's hilarious, seriously. TBH, I don't have any reason to buy such a... they said a 'luxury card' with, let's say, 'RTX' technology (just sounds oh wow~ to me) that sacrifices almost half the performance. Really?
It just doesn't make sense. I am so happy and glad about my second-hand 1080 Ti purchase.


----------



## Vayra86 (Nov 14, 2018)

GFalsella said:


> Are you stupid, legally blind or just trolling? I get all the skepticism towards ray tracing, I think it's way too early for it too, but comparing a totally game-changing tech with Hairworks and saying it's barely visible is not a statement one can take seriously. This is more like 2D->3D than Hairworks, but OK, I guess hating is easier. (Don't get me wrong, it's right to hate on Nvidia given their shitty business practices, but RTX != Nvidia.)



Got some in-game comparison screens to make your point, or did you specifically make an account here to shitpost without substance?

As always, I'm open to any source material you can provide, and ready to change my opinion. Proof is in the pudding and so far, I can't say it is more than I said it was.

So far what I have seen from live RTRT is that I really don't prefer the ray traced image to the 'old' rasterized image. I've seen Metro with scenes where you get blinded by sunlight and can't see half of the scene, I've seen some zoomed in RTX ON showcase material of a reflection in a car, a low poly-man on the moon techdemo, and I've seen some ultra-low FPS BFV content with low quality RTRT in it. If you have other examples, go at it.

Also... RTX != Nvidia?! I think you're confused with DXR? RTX is literally the most blatantly visible shitty business practice Nvidia has employed in the last ten years.


----------



## OneMoar (Nov 14, 2018)

wow shocking COMPLETELY Shocked didn't see this coming at all .... 

just buy it ... smh


----------



## [XC] Oj101 (Nov 14, 2018)

INSTG8R said:


> Well, you're the one trying to justify it in its current state despite the fact that it's not the amazing miracle tech you continue to claim it is. What part of "it's not ready" do you need explained to you again? The numbers don't lie, and it's only going to get worse with the 2080 and 2070. How are you going to spin those numbers favourably?
> Bottom line: the hardware DOESN'T exist.



So please let me hear how you propose it would "be ready"?


----------



## OneMoar (Nov 14, 2018)

[XC] Oj101 said:


> So please let me hear how you propose it would "be ready"?


not crash the system
not halve frame-rates
work as advertised ?

just those 3 ...


----------



## INSTG8R (Nov 14, 2018)

And also, let's be clear, this is DXR, not any NV proprietary stuff. So it's the DX12 “Standard” RT that AMD could make use of. I wonder if my Vega could leverage its Compute Units, for example.


----------



## Aldain (Nov 14, 2018)

[XC] Oj101 said:


> So please let me hear how you propose it would "be ready"?




Because HUANG said

"It just works"

Sorry m8 you have no ground to stand on


----------



## R0H1T (Nov 14, 2018)

Vayra86 said:


> Got some in-game comparison screens to make your point, or did you specifically make an account here to shitpost without substance?
> 
> As always, I'm open to any source material you can provide, and ready to change my opinion. Proof is in the pudding and so far, I can't say it is more than I said it was.
> 
> ...


Bling, bling, bling 

It's obvious no one has ever thought of the downsides of blinding shiny new god rays, yeah let's all wear Gunnar shades while gaming. Next thing you know, we'll have to turn the brightness down to zero on all our shiny HDR monitors


----------



## [XC] Oj101 (Nov 14, 2018)

OneMoar said:


> not crash the system
> not halve frame-rates
> work as advertised ?
> 
> just those 3 ...



1. That's on DICE, not NVIDIA.
2. We don't have a single native ray traced game available. How do you propose we get there without the hardware to back it?
3. It does - it allows real time ray tracing.



Aldain said:


> Because HUANG said
> 
> "It just works"
> 
> Sorry m8 you have no ground to stand on



Are you purposefully being obtuse or do you genuinely not understand that the developers still have to implement it?


----------



## INSTG8R (Nov 14, 2018)

[XC] Oj101 said:


> So please let me hear how you propose it would "be ready"?


I have said it many times: the hardware needs to be improved, because this iteration clearly isn't capable of the simplest implementation. DX12 DXR requires more horsepower than the current hardware can provide, period, end of story. I just got an HDR-capable monitor, I'm quite enjoying the lighting differences I've seen, and it doesn't require me to sacrifice 40% of my performance to do it.


----------



## Vayra86 (Nov 14, 2018)

R0H1T said:


> Bling, bling, bling
> 
> It's obvious none have ever thought of the downsides of blinding shiny new god rays, yeah let's all wear gunnar shades while gaming. Next thing you know, we'll have to turn the brightness down to zero on all our shiny HDR monitors



1000 nits in your face biatch! This takes griefing to a whole new level  Flashlights are going to be top of the food chain.


----------



## mouacyk (Nov 14, 2018)

Please add some salt in the TPU review, in the form of relative 1080 TI raster performance.


----------



## [XC] Oj101 (Nov 14, 2018)

INSTG8R said:


> I have said it many times: the hardware needs to be improved, because this iteration clearly isn't capable of the simplest implementation. DX12 DXR requires more horsepower than the current hardware can provide, period, end of story. I just got an HDR-capable monitor, I'm quite enjoying the lighting differences I've seen, and it doesn't require me to sacrifice 40% of my performance to do it.



I have to ask which natively ray tracing capable engine you’re using to come to this conclusion? The hardware is capable, the software needs to catch up. This has happened with every generation of console, where the launch games look nowhere near as good as the games further down the line. The hardware doesn’t change, the software does. Or are you blaming NVIDIA for the developers not having the experience yet? You’re bashing NVIDIA’s technical merits based on the ineptitude of the developers.


----------



## Aldain (Nov 14, 2018)

[XC] Oj101 said:


> 1. That's on DICE, not NVIDIA.
> 2. We don't have a single native ray traced game available. How do you propose we get there without the hardware to back it?
> 3. It does - it allows real time ray tracing.
> 
> ...



Nope.. It was marketing hype that fell flat on its face. There is no saving grace for Turing..


----------



## ikeke (Nov 14, 2018)

[XC] Oj101 said:


> 1. That's on DICE, not NVIDIA.
> 2. We don't have a single native ray traced game available. How do you propose we get there without the hardware to back it?
> 3. It does - it allows real time ray tracing.
> 
> ...


Regarding #2, Claybook has been out for some time. So actually one can do RT without Nvidia magic sauce...


__ https://twitter.com/i/web/status/976132464337764352


----------



## lexluthermiester (Nov 14, 2018)

Seems like a bunch of cheese needs to be passed around..


Aldain said:


> Nope.. It was marketing hype that fell flat on its face. There is no saving grace for Turing..


For the record, the problems being experienced with BFV *are exclusively EA's fault*, not NVidia's. Like anyone should be surprised that EA FUBAR's a game release...


----------



## INSTG8R (Nov 14, 2018)

[XC] Oj101 said:


> I have to ask which natively ray tracing capable engine you’re using to come to this conclusion? The hardware is capable, the software needs to catch up. This has happened with every generation of console, where the launch games look nowhere near as good as the games further down the line. The hardware doesn’t change, the software does. Or are you blaming NVIDIA for the developers not having the experience yet? You’re bashing NVIDIA’s technical merits based on the ineptitude of the developers.


You keep saying software, but you've got it backwards. This DX12 DXR is a new "standard" API that anyone can leverage. This is a pure hardware limitation. They've already had different levels of DXR, and even the lowest still shows severe performance limitations. There's nothing wrong with the software when you can turn it all the way down and still see subpar performance. It's ALWAYS been the hardware that catches up to the software. Remember Tessellation? How about the DX jumps 9-10-11? The APIs haven't changed; the hardware has been improved to leverage the new APIs/features. It's always been that way, and it hasn't changed just because you say so.
Why you keep getting it wrong as an excuse for its current failure is getting quite humorous at this point


----------



## Aldain (Nov 14, 2018)

lexluthermiester said:


> Seems like a bunch of cheese needs to be passed around..
> 
> For the record, the problems being experienced with BFV *are exclusively EA's fault*, not NVidia's. Like anyone should be surprised that EA FUBAR's a game release...



Wrong... nvidia has a major role to play in this.. it was nvidia that worked CLOSELY with DICE on this.. this is as much an nvidia FAIL as DICE's.. Stop with the meat-shield mentality


----------



## Vya Domus (Nov 14, 2018)

[XC] Oj101 said:


> The hardware is capable



Clearly it's not. The RT cores are capable of X amount of rays, no software will ever change that, it's a hardware limit.

You seriously need to back down on the Nvidia koolaid.


----------



## Vayra86 (Nov 14, 2018)

lexluthermiester said:


> Seems like a bunch of cheese needs to be passed around..
> 
> For the record, the problems being experienced with BFV *are exclusively EA's fault*, not NVidia's. Like anyone should be surprised that EA FUBAR's a game release...



Wow wow wow. Hold on. DICE is still not EA. Dev versus publisher. The problems experienced are entirely creditable to DICE's programming work and/or collaborative work with Nvidia engineers, of which I'm sure they have a couple walking around.

Let's be honest here, DICE has borked the last four Battlefield releases. BF3, BF4, Hardline and this one. Only BF1 had a somewhat smooth launch. The pattern here is with DICE, not EA. EA's only influence might have been planning/release date, but then again, wasn't it already delayed once?

Time will tell if this was purely a time constraint or a talent/coding problem. The DX12 build for BFV was already shaky as hell and DICE needed years to fix their netcode.

It's also too simple to say it's not Nvidia's fault, given their engineers do help out with these projects. Apparently what Nvidia provides in terms of *tooling* is far from bug-free.


----------



## GFalsella (Nov 14, 2018)

Vayra86 said:


> Got some in-game comparison screens to make your point, or did you specifically make an account here to shitpost without substance?
> 
> As always, I'm open to any source material you can provide, and ready to change my opinion. Proof is in the pudding and so far, I can't say it is more than I said it was.
> 
> ...


There would be no need for proof if you were more informed about the subject. I suggest you read something about the different techniques of rendering, the difference between rasterization/ray tracing/path tracing, and see not only what it means for the end customer (which, even now with the sad tech demos we were shown, is more than visible) but also what it means for the creators. This is truly game-changing and will have huge effects on the industry. The only problem is that Nvidia pushed for it way too early in order to make up for the lack of a traditional performance upgrade, and many people now think it's just a gimmick (because let's be honest, that's what it is for now). Give it 5 more years and even your untrained eye will see the difference. Sorry if I was so aggressive in my formulation; in retrospect that was uncalled for. And by RTX I mean ray tracing, which is a technique that has existed for decades, not Nvidia's GPU series or their temporary exclusive support.


----------



## Vayra86 (Nov 14, 2018)

GFalsella said:


> There would be no need for proof if you were more informed about the subject. I suggest you read something about the different techniques of rendering, the difference between rasterization/ray tracing/path tracing, and see not only what it means for the end customer (which, even now with the sad tech demos we were shown, is more than visible) but also what it means for the creators. This is truly game-changing and will have huge effects on the industry. The only problem is that Nvidia pushed for it way too early in order to make up for the lack of a traditional performance upgrade, and many people now think it's just a gimmick (because let's be honest, that's what it is for now). Give it 5 more years and even your untrained eye will see the difference. Sorry if I was so aggressive in my formulation; in retrospect that was uncalled for. And by RTX I mean ray tracing, which is a technique that has existed for decades, not Nvidia's GPU series or their temporary exclusive support.



Oh but I have no doubts about ray tracing itself, it is like you say ALREADY an established tech for content creators. So then I say, what's new here? Nothing. RTX is vaporware and content creators get Quadro. Don't assume I don't know what it is...

What I have doubts about is doing it in a live scene and in fast paced gameplay where performance is key, and I have doubts about the timing and way RTX is implemented. There is no outlook on any kind of reasonable adoption rate in the near future. Consoles don't have it. Low-mid range doesn't have it. Only grossly overpriced, underperforming high end cards have it, from one single company. De facto monopolized tech with no market share is not something to put your bets on. This isn't CUDA or something.

Even still, RTRT in this implementation is just some low-quality passes tacked on to rasterized rendering. It is nowhere near the realism you'd want to extract from it. It's just a minor graphical upgrade with a massive performance hit, much like Hairworks.


----------



## Slizzo (Nov 14, 2018)

Vayra86 said:


> Wow wow wow. Hold on. DICE is still not EA. Dev versus publisher. The problems experienced are entirely creditable to DICE's programming work and/or collaborative work with Nvidia engineers, of which I'm sure they have a couple walking around.
> 
> Let's be honest here, DICE has borked the last four Battlefield releases. BF3, BF4, Hardline and this one. Only BF1 had a somewhat smooth launch. The pattern here is with DICE, not EA. EA's only influence might have been planning/release date, but then again, wasn't it already delayed once?
> 
> ...



Wait, how was BFV's launch not smooth? I've been playing since last Thursday and it's been smooth as all hell for me. Haven't had any crashes, haven't had any bugs.

Granted I haven't enabled DX12, but I haven't for any game yet as it always has lowered performance on games, especially BF1 and now BFV (since it's using the same engine).


----------



## Vayra86 (Nov 14, 2018)

Slizzo said:


> Wait, how was BFV's launch not smooth? I've been playing since last Thursday and it's been smooth as all hell for me. Haven't had any crashes, haven't had any bugs.
> 
> Granted I haven't enabled DX12, but I haven't for any game yet as it always has lowered performance on games, especially BF1 and now BFV (since it's using the same engine).



DX11 is smooth, indeed. But for RT you do need DX12. I'm not surprised they finally got a DX11 release right; it's about time, isn't it? Even so, it was postponed at the very last minute.


----------



## GFalsella (Nov 14, 2018)

BigMack70 said:


> I feel bad for anyone who got suckered by Nvidia's marketing and bought 20 series specifically for ray tracing.
> 
> Only one card in the 20 series - the 2080 Ti - has a real reason to be purchased, and that is specifically for maximum possible 4k rasterized gaming performance. All the 20 series cards are (and always were going to be) hot garbage for ray tracing.
> 
> If you want ray tracing, wait 1-3 years for 2nd and 3rd generation GPUs designed for it.


Actually the 2080 makes a lot of sense to buy even for traditional purposes. Outside of the US, 1080 Tis go for around 100 bucks more than a 2080. It makes sense for the wrong reason, but it does.


----------



## lexluthermiester (Nov 14, 2018)

Aldain said:


> wrong... nvidia has a major role to play in this..


They provided the documentation, limited dev tools and consultation, sure.


Aldain said:


> it was nvidia that worked CLOSELY with dice on this..


But it's still the full responsibility for DICE to get things right. They are the dev house, not NVidia.


Aldain said:


> this is as much as nvidia FAIL as DICE-S.. Stop with the meat-shield mentality


Wrong. This is exclusively the failure of DICE and EA because..


Vayra86 said:


> Wow wow wow. Hold on. DICE is still not EA. Dev versus publisher.


EA still green-lit the release before proper testing was done or completed.


Vayra86 said:


> The problems experienced are entirely creditable to DICE's programming work and/or collaborative work with Nvidia engineers


That first part is right. However, if NVidia's engineers were present, they were there only in an advisory capacity. 

NVidia's engineers are like teachers, they can only teach so much, it's up to the students to pay attention, take the time to practice and work hard to get things right. DICE and EA didn't do so and it shows. This unpleasantness is not a failure of NVidia's tech at all. The ultimate accountability is on DICE and EA and *ONLY* those two.


----------



## Armagg3don (Nov 14, 2018)

L33t said:


> Try disabling SLI. I have a feeling RTX is not working with SLI. Specially since *DX12 never worked with SLI* and DX12 is a requirement for RTX.
> 
> Dice will have to add support for DX12 Explicit Multi-GPU..


You are wrong, SLI support depends on the developer and not on the API (in this case DX12).
There are DX12 games that do support SLI, like GoW4 and RoTTR, and I must say that the support is quite good.


----------



## Vayra86 (Nov 14, 2018)

lexluthermiester said:


> They provided the documentation, limited dev tools and consultation, sure.
> 
> But it's still the full responsibility for DICE to get things right. They are the dev house, not NVidia.
> 
> ...



Let's not split hairs; at least you already backed down to EA + DICE. I'm good with that. No matter the accountability, though, these early tech projects are still collaborative projects. I'm quite sure even Nvidia's engineers take away valuable info from these field tests and push improvements based on them. And as long as they provide a codebase, they have a responsibility, and there is no way it's perfect from the get-go.

I work with software suppliers too (SaaS applications) and I've also seen the first implementations of it in the field, and this is always how it works out. There is no single company arrogant enough to say their code is perfect, and that alone prompts change.


----------



## Slizzo (Nov 14, 2018)

Armagg3don said:


> You are wrong, SLI support depends on the developer and not on the API (in this case DX12).
> There are DX12 games that do support SLI, like GoW4 and RoTTR, and I must say that the support is quite good.



He may have been directly referencing Battlefield with his comment.


----------



## TheoneandonlyMrK (Nov 14, 2018)

[XC] Oj101 said:


> Maybe you should read my full post before replying, such as:
> 
> 
> 
> ...


You're sounding like you don't understand it. This is not full ray tracing; it's lighting and shadows based on ray paths, and an engine made for RTX will still be a rasterizing engine.
One game doesn't write it off, I agree.

The fact it costs so much does. Most people buy a balanced PC, so a high-end card normally sits in a pretty decent PC attached to a decent monitor, able to play any game at 1440p or 4K; that PC was not made to show 1080p in most owners' eyes.


----------



## btarunr (Nov 14, 2018)

Article now posted. https://www.techpowerup.com/reviews/Performance_Analysis/Battlefield_V_RTX_DXR_Raytracing/

We will add more details in the next few hours, such as memory usage, and performance scaling with OC.

We'd love your comments.


----------



## INSTG8R (Nov 14, 2018)

lexluthermiester said:


> They provided the documentation, limited dev tools and consultation, sure.
> 
> But it's still the full responsibility for DICE to get things right. They are the dev house, not NVidia.
> 
> ...


Except you're missing the fact that DXR is not an Nvidia tech, just a new DX12 feature, so of course it's all exposed to devs and anyone who cares to use it. AMD could use it too and have access to all the same libraries. It's a "standard" DX12 feature now; maybe DICE didn't get it perfect, but it's not Nvidia "special sauce".


----------



## lexluthermiester (Nov 14, 2018)

Vayra86 said:


> Let's not split hairs, at least you already backed down to EA + DICE.


You know what I meant, though.


Vayra86 said:


> I'm quite sure even Nvidia's engineers still take away valuable info from these field tests and push improvements based on it.


Very likely. Every failure can be a great opportunity to learn!



INSTG8R said:


> Except you're missing the fact that DXR is not an Nvidia tech, just a new DX12 feature, so of course it's all exposed to devs and anyone who cares to use it. AMD could use it too and have access to all the same libraries. It's a "standard" DX12 feature now; maybe DICE didn't get it perfect, but it's not Nvidia "special sauce".


That's a very good point actually.


----------



## GFalsella (Nov 14, 2018)

Vayra86 said:


> Oh but I have no doubts about ray tracing itself, it is like you say ALREADY an established tech for content creators. So then I say, what's new here? Nothing. RTX is vaporware and content creators get Quadro. Don't assume I don't know what it is...
> 
> What I have doubts about is doing it in a live scene and in fast paced gameplay where performance is key, and I have doubts about the timing and way RTX is implemented. There is no outlook on any kind of reasonable adoption rate in the near future. Consoles don't have it. Low-mid range doesn't have it. Only grossly overpriced, underperforming high end cards have it, from one single company. De facto monopolized tech with no market share is not something to put your bets on. This isn't CUDA or something.
> 
> Even still, RTRT in this implementation is just some low quality passes tacked on to rasterized rendering. It is nowhere near the realism you'd want to extract from it. Its just a minor graphical upgrade with a massive performance hit  - much like Hairworks


Ok, if you put it that way I can really agree with you. Too many people don't understand and start to treat it like "silly rays, nice reflections" without knowing what they are talking about. I think you touched on a very important point: consoles. I really doubt the PS5 and others will be capable of ray tracing, and that will nullify all the advantages for software houses (turning it into extra work for PC ports), so we won't see the actual advantages I'm hoping for during this decade.


----------



## Armagg3don (Nov 14, 2018)

Slizzo said:


> He may have been directly referencing Battlefield with his comment.


Maybe, but then it's too soon to talk about this game not supporting SLI in DX12, because it's only been a few days since it was released.


----------



## INSTG8R (Nov 14, 2018)

Armagg3don said:


> Maybe, but then it's too soon to talk about this game not supporting SLI in DX12, because it's only been a few days since it was released.


Apparently it’s coming.


----------



## btarunr (Nov 14, 2018)

ikeke said:


> That seems to suggest there's a memory bottleneck....



Memory overclock improves RTX performance more than non-RTX performance. We saw a 5% improvement for RTX-on vs. 2% improvement for RTX-off at a given overclocked memory speed. However, we can't go too far with RTX 2080 Ti memory OC without having to play with advanced settings/cooling.
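A back-of-the-envelope way to read those numbers (a sketch only; the 8% memory overclock below is an assumed, illustrative figure, and only the 5%/2% gains come from our testing): if a given bandwidth increase buys proportionally more FPS with DXR on than off, the DXR path is the more bandwidth-starved one.

```python
# Bandwidth-sensitivity sketch. The 5%/2% FPS gains are the figures reported
# above; the +8% memory overclock is an assumed, purely illustrative value.

def bandwidth_sensitivity(fps_gain, bw_gain):
    """Fraction of the extra memory bandwidth that shows up as extra FPS."""
    return fps_gain / bw_gain

MEM_OC = 0.08  # assumed memory clock / bandwidth increase

rtx_on = bandwidth_sensitivity(0.05, MEM_OC)
rtx_off = bandwidth_sensitivity(0.02, MEM_OC)
print(f"DXR on: {rtx_on:.2f}, DXR off: {rtx_off:.2f}")
```

The higher ratio with DXR on is consistent with the RT workload being more bandwidth-limited than plain rasterization.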


----------



## B-Real (Nov 14, 2018)

Funny, this is. Barely 60 fps with RTX from a $1200 card in probably the best-optimized game on the market.


----------



## TheoneandonlyMrK (Nov 14, 2018)

Armagg3don said:


> Maybe, but then it's too soon to talk about this game not supporting SLI in DX12, because it's only been a few days since it was released.


Head over to the Nvidia forums and you will learn just how great SLI on DX12 is; i.e., it is not really supported, aggressively or properly, and I doubt it's really SLI and not explicit multi-GPU as it should be, but what's in a name after all.

Why everyone defends this is beyond me.

And it is not too soon: they worked on this for years (3-5), not days. Whether it works or not is of no matter to the 99% of RTX owners who are not going to buy two.

How many devs work on a thing for approximately 0.0002% of their audience?

RTX just died for three more years, IMHO, at best.


----------



## Konceptz (Nov 14, 2018)

INSTG8R said:


> Nah Man we couldn’t even do that. I mean seeing as NV is still using the same ancient CP you can still select what runs PhysX even now
> View attachment 110500




I"m sorry but this is friggin hilarious LOL


----------



## ikeke (Nov 14, 2018)

btarunr said:


> Memory overclock improves RTX performance more than non-RTX performance. We saw a 5% improvement for RTX-on vs. 2% improvement for RTX-off at a given overclocked memory speed. However, we can't go too far with RTX 2080 Ti memory OC without having to play with advanced settings/cooling.


Thanks!

Is there any difference in temperatures or power consumption with RTX on/off?

Any changes to core frequency? 

Could it be power limited - so that in order to run RTX cores other areas of chip are power throttled or lose some headroom?


----------



## mouacyk (Nov 14, 2018)

[XC] Oj101 said:


> Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.
> 
> So how exactly did NVIDIA introduce this that was so wrong in your opinion?


I honestly believe that customers shouldn't be paying for games that aren't made yet, either.


----------



## londiste (Nov 14, 2018)

ikeke said:


> Is there any difference in temperatures or power consumption with RTX on/off?
> Any changes to core frequency?
> Could it be power limited - so that in order to run RTX cores other areas of chip are power throttled or lose some headroom?


Same temperatures, same power consumption, same frequency.
It sounds logical that using the RT cores throttles other parts of the GPU, but it's not directly visible in clocks or power.


----------



## Majikaru (Nov 14, 2018)

londiste said:


> Horrifying?
> So you mean it does 80+FPS at 1080p with DXR at highest possible setting?
> Assuming the same (less than) 50% hit, it'll do 60+FPS at 1440p, and 40+FPS at UHD 4K.



What do you mean? There's a chart showing that turning RTX on goes from 158 FPS to 65 FPS at 1080p. You definitely aren't getting 60+ FPS at 1440p.


----------



## londiste (Nov 14, 2018)

Majikaru said:


> What do you mean? There's a chart showing that turning RTX on goes from 158 FPS to 65 FPS at 1080p. You definitely aren't getting 60+ FPS at 1440p.


Well, the initial news post has a unique understanding of "close to 50%" in light of the results that were updated and published later.
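For the record, the gap is easy to quantify from the figures quoted above (a quick sketch using the 158/65 FPS chart numbers):

```python
# FPS figures as quoted above: 1080p, DXR off vs. DXR ultra.
fps_off = 158
fps_on = 65

drop = 1 - fps_on / fps_off
print(f"DXR ultra costs {drop:.1%} of the frame rate")  # ~58.9%
```

A 58.9% drop is closer to "well over half" than to "close to 50%".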


----------



## xorbe (Nov 14, 2018)

Personally I don't think this feature is going to win over users in a fast-paced shoot-'em-up game. It needs a slow-paced single-player game, like a Portal 3 or something.


----------



## DirtbagDave (Nov 14, 2018)

I'm just curious if anyone was actually going to use this tech to begin with. I got myself an RTX card but was not even remotely looking forward to this technology, we all knew the performance in actual gameplay was going to be horrible.


----------



## Xuper (Nov 14, 2018)

Told ya. It's still too early to process DXR; probably 2025 or even later. We really need to bring DXR to mid-range cards with something more like the 2080 Ti's performance. I think 2025 is the year to go. I saw some screenshots; it's really nice.


----------



## Prima.Vera (Nov 15, 2018)

Just a thought: nobody would have bitched about the crappy RTRT performance of the cards in any of the games if the cards had been 40% cheaper, like they should normally be priced. If not 50% less. Asking a PREMIUM for something that doesn't work properly is completely disrespectful and insulting. But we already know we cannot expect anything better from a typically greedy and unscrupulous corporation like nGreedia.


----------



## moproblems99 (Nov 15, 2018)

Prima.Vera said:


> Just a thought: nobody would have bitched about the crappy RTRT performance of the cards in any of the games if the cards had been 40% cheaper, like they should normally be priced. If not 50% less. Asking a PREMIUM for something that doesn't work properly is completely disrespectful and insulting. But we already know we cannot expect anything better from a typically greedy and unscrupulous corporation like nGreedia.



Sadly, it didn't stop anyone from buying them.  And honestly, the people that are complaining probably don't own one.  And really, no one should be complaining because it was widely reported pre-launch that performance was going to tank hard.

Even if performance didn't fall through the floor, I am not sure I like how it looks.  Perhaps with MOAR rays it will look better?  Everything looks like it is made of glass and sort of comes out less realistic...

*Edited for grammar


----------



## Th3pwn3r (Nov 15, 2018)

It's as if the majority of people crapping on Turing don't know that you don't have to use ray tracing. You can still have a greatly improved 4K gaming experience by paying the premium price. If I bought a 2080 Ti I wouldn't care about ray tracing, but the slight bump up in FPS @ 4K resolution.


----------



## eidairaman1 (Nov 15, 2018)

Th3pwn3r said:


> It's as if the majority of people crapping on Turing don't know that you don't have to use ray tracing. You can still have a greatly improved 4K gaming experience by paying the premium price. If I bought a 2080 Ti I wouldn't care about ray tracing, but the slight bump up in FPS @ 4K resolution.



This is the big feature nvidia has been trying to push.

turing peh, more like Turding.


----------



## INSTG8R (Nov 15, 2018)

Th3pwn3r said:


> It's as if the majority of people crapping on Turing don't know that you don't have to use ray tracing. You can still have a greatly improved 4K gaming experience by paying the premium price. If I bought a 2080 Ti I wouldn't care about ray tracing, but the slight bump up in FPS @ 4K resolution.


Yeah, if we take RTRT out of the equation it's a good card, but still overpriced, and more so now that we've seen its party piece in action.


----------



## eidairaman1 (Nov 15, 2018)

INSTG8R said:


> Yeah, if we take RTRT out of the equation it's a good card, but still overpriced, and more so now that we've seen its party piece in action.



Yup nvidia more like ngreedia


----------



## londiste (Nov 15, 2018)

@eidairaman1 , @INSTG8R , you know this is getting really old already. You don't have to buy a Turing card nor play around with DXR.

Turing is not that bad of a value in itself. Except for the 2080 Ti, which is flagship money, Turings have a bit better perf/$ than Pascals. I know everyone expected the usual thing of getting previous high-end performance for midrange money, but isn't it time to get over it already?

Especially since pretty much everyone with constant whining is not running Turing cards anyway.

/rant


----------



## INSTG8R (Nov 15, 2018)

londiste said:


> @eidairaman1 , @INSTG8R , you know this is getting really old already. You don't have to buy a Turing card nor play around with DXR.
> 
> Turings are not that bad of a value by itself. Except for 2080Ti which is flagship money, Turings are a bit better perf/$ than Pascals. I know everyone expected the usual thing of getting previous high end performance for midrange money but isn't it time to get over it already?
> 
> ...


Look, it's the most expensive flagship they've ever released, advertised as the most advanced they've ever done, with the capability to give us RTRT. Well, it's expensive and advanced. I'm failing to see how me not running one has anything to do with that. Defending it is no better when its "big feature" has proven to be just a proof of concept. That is where I take issue.


----------



## londiste (Nov 15, 2018)

Titan V was $2999.
The 2080 Ti is the most advanced; it has RTRT capability.
Everyone knew DXR comes with a considerable performance hit from day 1.

All the threads that are even remotely related to RTX - or even any GPU lately - will devolve into this useless crap these days.

Edit:
A larger problem is that DXR is standardized on DX12, and there are very few DX12 games to begin with. The Battlefield series has its own set of problems with DX12, with microstutters and it being slow across the board. Shadow of the Tomb Raider might be more interesting; its DX12 implementation is one of the best (next to Sniper Elite 4 and possibly Hitman), and it is a single-player game where eye candy could be justifiable.


----------



## ikeke (Nov 15, 2018)

But..."it just works"..?


/s


----------



## INSTG8R (Nov 15, 2018)

londiste said:


> Titan V was $2999
> 2080Ti is most advanced, it has RTRT capability.
> Everyone knew DXR comes with a considerable performance hit from day 1.
> 
> ...


Let's just leave the Titan out of it. Turing has achieved proof of concept. It can only get better with better hardware, like any new tech.


----------



## intelzen (Nov 15, 2018)

It is not just halved, it is halved on LOW! On ultra it is ×0.3...


----------



## bajs11 (Nov 15, 2018)

londiste said:


> Titan V was $2999
> 2080Ti is most advanced, it has RTRT capability.
> Everyone knew DXR comes with a considerable performance hit from day 1.
> 
> ...



You do know that the RTX 2080 Ti, the card that was supposed to replace the GTX 1080 Ti, was released with a 60% higher MSRP?
The GTX 1080 Ti was released at the same MSRP as the GTX 980 Ti, or close to it, and offered some 60% better performance.
The Titan V is like a Quadro, not meant for gaming.


----------



## Th3pwn3r (Nov 15, 2018)

INSTG8R said:


> Look, it's the most expensive flagship they've ever released, advertised as the most advanced they've ever done, with the capability to give us RTRT. Well, it's expensive and advanced. I'm failing to see how me not running one has anything to do with that. Defending it is no better when its "big feature" has proven to be just a proof of concept. That is where I take issue.



Let's all be honest. If the 2080 Ti was selling at $800, everyone would be going insane and buying them all up because the performance is THE BEST. The only real problem is they're not $800. They're ridiculously overpriced. People are acting like this is a bigger failure than Vega, which in my opinion it is not. However, I am in agreement that the marketing of Turing was awful at best. The hate here is comical though; if the 2080 Ti drops under $950, I bet all these haters will be running to buy them up. I would upgrade in all honesty; 5-10 FPS @ 4K is worth it to me.


----------



## eidairaman1 (Nov 15, 2018)

Th3pwn3r said:


> Let's all be honest. If the 2080ti was selling at $800 everyone would be going insane and buying them all up because of the performance is THE BEST. The only real problem is they're not $800. They're ridiculously overpriced. People are acting like this is a bigger failure than Vega which in my opinion it is not. However, I am in agreement that marketing of Turing was awful at best. The hate here is comical though, if the 2080ti drops under $950 I bet all these haters will be running to buy them up. I would upgrade in all honesty, 5-10FPS @ 4k is worth it to me



800 is still ridiculously priced...


----------



## bajs11 (Nov 15, 2018)

agreed.
People keep forgetting that the GTX 1080 Ti was launched at $699, the same as the GTX 980 Ti, while offering some 60% better performance.
The RTX 2080 Ti only offers a 30% improvement while launched with an MSRP 85% higher than its predecessor.
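Taking those figures at face value (the +30% performance and +85% price numbers are claims from this post, not verified), the generational perf-per-dollar works out roughly as:

```python
# Generational value sketch from the figures claimed above; both inputs are
# the poster's claims, used here purely for illustration.
perf_vs_1080ti = 1.30   # claimed +30% performance
price_vs_1080ti = 1.85  # claimed +85% launch MSRP

value_ratio = perf_vs_1080ti / price_vs_1080ti
print(f"perf/$ vs. 1080 Ti: {value_ratio:.2f}")  # ~0.70
```

In other words, on these numbers the 2080 Ti delivers about 70% of its predecessor's performance per dollar, a generational regression rather than the usual improvement.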


----------



## intelzen (Nov 15, 2018)

Th3pwn3r said:


> Let's all be honest. If the 2080ti was selling at $800 everyone would be going insane and buying them all up because of the performance is THE BEST. The only real problem is they're not $800. They're ridiculously overpriced. People are acting like this is a bigger failure than Vega which in my opinion it is not. However, I am in agreement that marketing of Turing was awful at best. The hate here is comical though, if the 2080ti drops under $950 I bet all these haters will be running to buy them up. I would upgrade in all honesty, 5-10FPS @ 4k is worth it to me


You are thinking as if the huge price is only a minor inconvenience to us? As in: if it is a good product, it is a good product, period, no ifs or buts, and no one should care about the price? Using your logic, can I walk into a store, pick out some good products, and walk out with them without paying? I do not want to pay, but that is only a minor inconvenience here. I mean, those are good products and I will make good use of them, so why would anyone else see a problem with that?
Btw, what about the future, say 1-2 years forward? The next release, an RTX 3080 Ti, is, say, another +30% better than the RTX 2080 Ti, priced at $2000. But who cares, it is 30% better, so it's a good product, period, and we should be happy about it? I would be, because I would walk out of the store with that beast for $0.00.


----------



## Vayra86 (Nov 15, 2018)

Th3pwn3r said:


> Let's all be honest. If the 2080ti was selling at $800 everyone would be going insane and buying them all up because of the performance is THE BEST. The only real problem is they're not $800. They're ridiculously overpriced. People are acting like this is a bigger failure than Vega which in my opinion it is not. However, I am in agreement that marketing of Turing was awful at best. The hate here is comical though, if the 2080ti drops under $950 I bet all these haters will be running to buy them up. I would upgrade in all honesty, 5-10FPS @ 4k is worth it to me



Yeah, no, those ideas only work on the internet, where everyone and his dog is telling the world they are throwing money at their screen. It's the same as that TPU poll where '25% was either gaming on or going to game on 4K'. They aren't, and they aren't going to anytime soon.

Price is a primary concern despite all the marketing and hype we let ourselves get fooled with. An enthusiast forum is the worst possible place to gauge the sentiment with regard to price.

Ironically, an enthusiast forum is also the only place you will find apologists for this horrible business practice Nvidia is employing with Turing. We're paying a high price for RTX, and if we want to upgrade, we literally have no way to avoid it. And then people here happily say 'look, they're selling cards, it's going well!'... A typical 'in the land of the blind' situation.

I've said this before, and I will say it again: you have a real option of NOT BUYING something. Skip a gen, and you will find RTX far more reasonably priced either this gen or the next one. That is my main motivation for resisting this, and believe it or not, that sentiment does matter, even if some fools buy it anyway. Another possible advantage is that more effort will be put into optimization and quality of the experience, because that is another way to increase value.

If you really want RTRT to happen, now is the time to step on the brakes hard and force an adjustment. Turing's implementation is simply not viable.


----------



## dozenfury (Nov 15, 2018)

Thanks to TPU for the performance #'s.  This is a great example of why unbiased performance benchmarks are so important.

My takeaway from the performance numbers is that it's very clear RT simply isn't ready for prime-time use yet. For me, 2K at 60 fps is the minimum acceptable threshold for PC gaming, and I'd bet the majority of PC gamers would say the same or set an even higher minimum. Looking at the numbers, the only 20xx card that can even manage above 60 fps at 2K with RT on is the 2080 Ti, and even that is only with RT set to low. ANY other scenario at 2K is below 60 fps with RT enabled. Basically, this is a repeat of NV pushing 4K gaming a few years ago with the 980 Ti, when the tech really wasn't there yet, so people spent lots of money only to find out that 4K gaming then meant about 20 fps, which wasn't practical at all. And at 4K, even with RT on low, the 2080 Ti can't crack 50 fps. Cool tech, but it's definitely a generation or maybe even two away from being worth considering.


----------



## Th3pwn3r (Nov 15, 2018)

eidairaman1 said:


> 800 is still ridiculously priced...



I disagree; $800 isn't a ridiculous price, and plenty of people paid a lot more than that for 1080 Ti cards. If you really feel that way, you can just buy older generations of cards for less. Unless you must have the absolute best, in which case you have to pay a premium. There are plenty of people who spare no expense. One example being audiophiles: $1300 for a set of CAR SPEAKERS. It's silly, but the companies making them know their market exists, and Nvidia banks on those kinds of customers right now, it seems.


intelzen said:


> you are thinking as if the huge price is only minor inconvenience to us? like - if there are a good product - it is good product - period, no ifs or buts, and no one should cares about price? using your logic - can I walk in a store and select some good products and walk out with those, without paying? I do not want to pay, but that is only minor inconvenience here. I mean those are good products and I will make good use of those, why would anyone else see problem with that?
> btw what about future, lets say 1-2 years forward, next release - rtx 3080 ti lets say another +30% better than rtx 2080 ti, and price 2000$, but who cares it is 30% better - so a good product period and we should be happy about it? I would, because I would walk out of store for 0,00 with that beast



You might want to read my post again several times.


Vayra86 said:


> Yeah no, those ideas only work on the internet where everyone and his dog is telling the world they are throwing money at their screen. Its the same as that TPU poll where '25% was either gaming on or going to game on 4K'. They aren't, and they aren't going to anytime soon.
> 
> Price is a primary concern despite all the marketing and hype we let ourselves get fooled with. An enthusiast forum is the worst possible place to gauge the sentiment with regards to price.
> 
> ...



I totally agree. I play in 4K off my 1080 and also have a 1080 Ti. I would like more FPS, BUT the price has to come down a bit for me; I'm not desperate, and I'm not wanting things for free either. I have $6,000 in my car audio system, but even I have my boundaries.


----------



## moproblems99 (Nov 15, 2018)

I don't have any intent on 4K either; I just don't game enough to justify all the expense.



Th3pwn3r said:


> I totally agree, I play in 4k off of my 1080 and also have a 1080ti, I would like more FPS BUT the price has to come down a bit for me, I'm not desperate and I'm not wanting things for free either. I have $6000 into my car audio system but even I have my boundaries.



Do you have a long commute?  How much time do you spend in your car?

Edit: Not judging, just curious as to what would drive $6k into a system mostly used for driving to and from work.


----------



## eidairaman1 (Nov 15, 2018)

Th3pwn3r said:


> I disagree, $800 isn't a ridiculous price, plenty of people paid a lot more than that for 1080ti cards. If you really feel that way you can just buy older generations of cards for less. Unless you must have the absolute best then you have to pay a premium. There are plenty of people who spare no expense. An example being audiophiles, $1300 for a set of CAR SPEAKERS. It's silly but those companies making them know their market exists, Nvidia banks on those kind of customers right now it seems.
> 
> 
> You might want to read my post again several times.
> ...



My 290 Vapor-X (bear in mind, second to the best card, the 290X Vapor-X/8GB) was $460 in 2014. My 9700 Pro All-In-Wonder in 2002 was $400.

So the prices are still ridiculous over $500.


----------



## OfficerTux (Nov 15, 2018)

Many people said "it was to be expected that the RTX effects would have a high performance hit." To be honest, I did not expect it. Since those RT cores are there just to process the RTX effects, I would have expected a few percent lost to overhead, but not double-digit numbers. To me it looks like either DICE did a very bad job on the DXR optimization, or NVIDIA has balanced those RTX cards completely wrong. As Techspot points out:



> It is interesting to note across these tests that we are being RT core limited here. The higher the resolution, the higher the performance hit using the Ultra DXR mode, to the point where playing at 4K is more than 4x faster with DXR off. This also plays out when we spot checked power consumption: the cards were running at consistently lower power with DXR on, *because the regular CUDA cores are being underutilized* at such a low framerate.


 Source

While the RTX effects are impressive, to me it looks more like a tech demo. NVIDIA will have to considerably increase the RT core count (double or triple it), or the technique will have to be optimized a lot, before this becomes a real game changer. Also, as a buyer of an RTX 2070 (which I am not), I would be rather disappointed, since those meager 36 RT cores make DXR almost useless.

Still I am excited to see what other upcoming games or DXR patches will bring to the table.

(Please ignore this post, if all of this has already been discussed before, I did not read all 7 pages before posting)


----------



## Fluffmeister (Nov 16, 2018)

londiste said:


> Titan V was $2999
> 2080Ti is most advanced, it has RTRT capability.
> Everyone knew DXR comes with a considerable performance hit from day 1.
> 
> ...



Hey, at the end of the day it's still cheaper than the R9 295X2, which launched at a whopping $1500:

https://www.techpowerup.com/reviews/AMD/R9_295_X2/

And as that was basically CrossFire on a stick, poor or lacking CrossFire support meant the card couldn't shine in every title either.

Just some perspective for the current prices.


----------



## Ravenmaster (Nov 16, 2018)

Vya Domus said:


> Nice way of trying to turn this into an AMD issue.
> 
> Everyone get your pitch and fork, it's because of that damn AMD you get to enjoy a cinematic sub 60 fps experience.
> 
> ...



Indirectly, it is. If AMD had a faster card than the RTX 2080, and another one that could beat the RTX 2080 Ti, then Nvidia would be forced to sell cheaper. I don't know why you can't grasp that. Nvidia is price gouging because they have no competition.


----------



## Th3pwn3r (Nov 16, 2018)

moproblems99 said:


> I don't have any intent on 4k either, I just don't game enough to justify all the expense.
> 
> 
> 
> ...



My commute right now is about an hour each way. My system price also includes the proper charging system, 270 amp alternator, 2/0 awg power cables and Digital Designs products, basically higher end stuff but you can do it for less. Of course I did all the work, labor would have been thousands of dollars.



eidairaman1 said:


> My 290 VaporX bear in mind second to the best card (290X VaporX/8GB) was 460 in 2014. My 9700 pro all in wonder in 2002 was 400.
> 
> So the prices are still ridiculous over 500



Your argument is moot, in my opinion. I drive an older Honda Civic at times but could buy a brand-new Corvette; both will get me to work.



Ravenmaster said:


> Indirectly it is. If AMD did have a faster card than the RTX 2080 and another one that could beat the RTX 2080Ti then nvidia would be forced to sell cheaper. I don't know why you can't grasp that. Nvidia is price gouging because they have no competition.



A lot of us get that. People want Nvidia to forget that business comes first. It would be nice if the prices were better, but they don't care about us that much. If you buy something from a friend you probably get a good deal; we're not friends with Nvidia, so we get charged whatever they can come up with that will work for them.


----------



## moproblems99 (Nov 16, 2018)

Ravenmaster said:


> I don't know why you can't grasp that. Nvidia is price gouging because they have no competition



That is simply not true.  A GPU is not a required purchase.  Nvidia is charging that much because people are buying them.  Even if no other manufacturer produced GPUs, if no one bought them, the price would go down.  The fact of the matter is, as long as people are buying, Nvidia is going to test how far they can go.  Frankly, I hope AMD slots in just under and starts raking in the money to fill their coffers as well.

Until people decide when too much is too much, prices will continue to rise.


----------



## Xzibit (Nov 17, 2018)

Blo3der-Kuh said:


> Many people said "it was to be expected that RTX effects would have a high performance hit". To be honest, I did not. Because those RT cores are just there to process those RTX effects, I would have expected a few percent lost to overhead, but not double-digit numbers. To me it looks like either DICE did a very bad job at the DXR optimization, or NVIDIA has balanced those RTX cards completely wrong. As Techspot points out:
> 
> Source
> 
> ...



RTX itself (a very low sample rate + a denoiser) is the optimization that is required for "Real-Time".  It's the same as if you took your favorite renderer (Blender) and rendered at 1 spp with a denoiser. It's very low quality, but it gets the job done *as quickly as possible*.

The optimization has to come from making those RT cores better and adding more of them. Just adding more RT cores right now would take up too much die space for not much benefit, as the scaling from the 2070 to the 2080 Ti shows. They need to be multiple times better, 10x or more, so they won't be bogged down if games use more than one RTX effect.
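The trade-off Xzibit describes can be illustrated with a toy 1-D sketch in plain Python (this is only an illustration of the general low-spp + denoise idea, not DICE's or NVIDIA's actual pipeline): a 1 spp Monte Carlo "render" is extremely noisy, but even a crude spatial denoiser recovers most of the quality of a much larger sample budget.

```python
import random

W = 64  # 1-D "scanline" scene: left half dark, right half bright
truth = [0.0] * (W // 2) + [1.0] * (W // 2)

def render(spp, rng):
    # Monte Carlo estimate: average `spp` noisy samples per pixel.
    return [sum(t + rng.gauss(0.0, 0.5) for _ in range(spp)) / spp
            for t in truth]

def box_denoise(img, radius=2):
    # Crude spatial denoiser: mean over a (2*radius + 1)-pixel window.
    out = []
    for i in range(len(img)):
        lo, hi = max(0, i - radius), min(len(img), i + radius + 1)
        out.append(sum(img[lo:hi]) / (hi - lo))
    return out

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

rng = random.Random(0)
noisy_1spp = render(1, rng)    # "real-time" sample budget
clean_64spp = render(64, rng)  # offline-quality sample budget
denoised = box_denoise(noisy_1spp)

# The denoised 1 spp image lands much closer to the 64 spp result
# than the raw 1 spp image does, at a tiny fraction of the cost.
print(mse(noisy_1spp, truth), mse(denoised, truth), mse(clean_64spp, truth))
```

The catch, of course, is the same as in BF V: the denoiser trades noise for blur, so fine detail in the reflections suffers at the lowest sample rates.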


----------



## VPII (Nov 18, 2018)

This is a case of software catching up with technology and not the other way around.  DICE will eventually get it to work as it should.  I mean, go and look at the Star Wars demo using the Unreal Engine with real-time ray tracing, running on a single RTX 2080.


----------



## londiste (Nov 18, 2018)

RT cores do not seem to be that expensive in terms of die space: about 10% or a little less compared to Volta, and ~15% compared to Pascal.
Tensor cores and the other Volta-derived hardware are what take up a lot of space on Turing.


----------



## Xzibit (Nov 18, 2018)

londiste said:


> RT cores do not seem to be that expensive in terms of die space: about 10% or a little less compared to Volta, and ~15% compared to Pascal.
> Tensor cores and the other Volta-derived hardware are what take up a lot of space on Turing.



Turing has a 1:1 SM:RT ratio; they need to make the RT cores better or increase the ratio to 1:2 or more.


----------



## VPII (Nov 18, 2018)

In all honesty, as I stated before, I do not think the problem is that the current RTX cards cannot handle real-time ray tracing; it is more a case of software not using the hardware properly, and that will be ironed out going forward.


----------



## londiste (Nov 21, 2018)

https://www.eurogamer.net/articles/digitalfoundry-2018-battlefield-5-rtx-ray-tracing-analysis
Eurogamer's Digital Foundry has a story on BF5's ray tracing, including an interview with a DICE developer and some good technical details.


----------

