
Battlefield V with GeForce RTX DirectX Raytracing

Well I have one final thing to say since I am being shouted down for having an opinion...

You ALL should be praising the people who are buying these expensive cards for the ray tracing tech. It is THOSE PEOPLE who spend their hard-earned money on what you call "a poor experience" or "not worth the value" who are the reason this new tech is developed AT ALL. The early adopters, as dumb as many of you seem to think they are, PAY FOR THE ridiculous up-front R&D costs of this tech. If you consider yourself a tech enthusiast, then you know ray tracing is the future. If you want to one day afford a ray tracing card that performs THE WAY "YOU" think it should, then you should be applauding all of the people paying the high price now. Because without them you WILL NEVER own a ray tracing card with your current attitude. Rhino out!
 
those that think people are buying the latest generation nvidia cards because they are dumb enough to fall for the ray tracing hype have it wrong.. most will be just like me and simply buying a simple upgrade to their gpu systems.. at this moment in time i dont give a f-ck about ray tracing.. i dont think i am on my own..

their only crime is they can afford to do it.. he he.

trog
 
Well I have one final thing to say since I am being shouted down for having an opinion...
Or you could pay shareholders and executives FAR LESS, that's always the better option. Also accept far less profit.
Oh, I know, of course I know this goes against the core tenets of the capitalism religion, but I really don't care, to be honest.
 
Well I have one final thing to say since I am being shouted down for having an opinion...

That is a valid argument, IF you are convinced the path chosen to get RT going is the right one.

Right now, there is overwhelming evidence (data!) that it is a dead end. We are looking at massive dies and massive shortcomings. This can't scale as it is without severely handicapping performance.

So I don't have anyone to praise, quite the opposite: the message we send out by buying at these prices is that we will accept anything, like gullible fools. And Nvidia knows this and capitalizes on it. Praise them for playing you instead.
 
I'm sure you are, you bought a 2080 after all. :rolleyes:

Your militant praise is staggering; after entire pages you're still here replying to everyone with the same standard response, that RTX is great and we aren't enlightened enough to realize this astonishing feat that Nvidia has brought to us.

Whatever floats your boat, but you're wasting your time doing that; literally no one believes you. Not that your words surprise me, few would have the boldness required to admit that the rather expensive product they bought isn't as stellar as they initially thought. Your determination to defend your purchase to the bitter end is admirable, though.
Are you done with the personal insults and virtue signaling? Just FYI, I bought my 2080 AFTER all the reviews had been done and everyone's perspectives, assessments and opinions were declared. I made an informed choice. Wrap your head around that one.

Well I have one final thing to say since I am being shouted down for having an opinion...
Not everyone is shouting you down..

Or you could pay shareholders and executives FAR LESS, that's always the better option. Also accept far less profit.
Oh, I know, of course I know this goes against the core tenets of the capitalism religion, but I really don't care, to be honest.
Or we bought a new card because it is the new hotness, does something completely new and performs better than the previous gen cards.. Hmmm..

Right now, there is overwhelming evidence (data!) that proves it is a dead end.
Citation required.
 
Citation required.

Is that your one-liner now? When you are asked for a source you tell people 'Google is your friend'. The evidence is right in front of you on this website, dummy.
 
Are you done with the personal insults and virtue signaling?

If you want to one day afford a ray tracing card that performs THE WAY "YOU" think it should, then you should be applauding all of the people paying the high price now. Because without them you WILL NEVER own a ray tracing card with your current attitude. Rhino out!

This is actual virtue signaling.
 
Is that your oneliner now? When you are asked for a source you tell ppl 'Google is your friend'. The evidence is right in front of you on this website
This very review shows RTRT is a viable thing. The PCWorld Youtube video showed it to be a viable thing. Regardless of opinions, everything shown so far about RTRT on RTX shows it is progressing well. Your comment said; "Right now, there is overwhelming evidence (data!) that proves it is a dead end." Just not seeing that. Prove up. Let's see your logic and make sure it doesn't contradict what is already known.
Enough with the insults or someone might give you some time-out.
This is actual virtue signaling.
You might want to refresh your understanding of what the phrase means.
 
This very review shows RTRT is a viable thing... Prove up. Let's see your logic and make sure it doesn't contradict what is already known.

Enough with the insults or someone might give you some time-out.

You might want to refresh your understanding of what the phrase means.

I'll leave it to you to connect the dots, since you've got that bigger picture and all

Dot one: https://www.techpowerup.com/247005/nvidia-geforce-rtx-2080-ti-tu102-die-size-revealed

Dot two: [attached image]


Dot three: https://www.techpowerup.com/reviews/Performance_Analysis/Battlefield_V_RTX_DXR_Raytracing/4.html

Dot four: https://semiengineering.com/more-nodes-new-problems/

I'm sure you won't manage, and will resort to repeating everything you've said up till now. But try it anyway.

Key word and hint: SCALING.
 
I'll leave it to you to connect the dots, since you've got that bigger picture and all
And how do those points equate to "dead end"? Please spell it out for this particular dullard, because your logic isn't exactly flowing like a river..
 
And how do those points equate to "dead end"? Please spell it out for this particular dullard, because your logic isn't exactly flowing like a river..

No, no I won't. Ignorance is bliss and you're convinced it's awesome.

I've added a fourth dot and a hint for you.
 
No, no I won't. Ignorance is bliss and you're convinced its awesome.

I've added a fourth dot and a hint for you.
Yeah, let's leave the engineering to the engineers. Scaling won't be a problem for long, it rarely is. Die/wafer yield is a common problem that happens with almost every new advancement in IC development. They keep coming up with solutions and we keep advancing.
 
Yeah, let's leave the engineering to the engineers. Scaling won't be a problem for long, it rarely is. Die/wafer yield is a common problem that happens with almost every new advancement in IC development. They keep coming up with solutions and we keep advancing.

OK. All is fine in the world!
 
I don't get it. Why is the performance taking a hit, any hit whatsoever? Either those RT cores are making all the other cores sit idle 2/3 of the time, and we need 3x more of them to make up for the lack of RT power, or they are only doing part of the job for RTRT and are offloading the rest to the CUDA cores. This is not good.
Turns out that DLSS at 4K is actually 1440p with upscaling, and they call that a 35% performance increase when it is actually a drop, because going from 4K to 1440p should yield a 100% performance increase. DLSS is not free, offloading all the work to the tensor cores; it is cheating. Anyway, I'm using a 2070 instead of a 1080 with RTX OFF and DLSS OFF for the time being, lol.
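To put numbers on the resolution claim above, here is a quick back-of-envelope calculation in Python. The 35% figure and the render resolutions are the poster's premises from the thread, not measurements of mine, and the "naive expected speedup" assumes work scales linearly with pixel count:

```python
# Native 4K vs the claimed DLSS internal render resolution of 1440p.
pixels_4k = 3840 * 2160      # 8,294,400 pixels
pixels_1440p = 2560 * 1440   # 3,686,400 pixels

# If rendering cost scaled linearly with pixels, dropping to 1440p
# should speed things up by this ratio, well above the claimed 35%.
ratio = pixels_4k / pixels_1440p
print(f"4K renders {ratio:.2f}x the pixels of 1440p")
print(f"naive expected speedup from dropping to 1440p: {(ratio - 1) * 100:.0f}%")
```

So on a pure pixel-count basis the expected gain would be roughly 125%, which is the gap the poster is pointing at.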

That would only be a solution if you had a static number of rays. The higher the resolution, the more rays. And these are only reflections, in only part of the screen/objects most of the time.

72 RT cores are scraping by at a bare-minimum 1 spp @ 1080p; for the same effects at 4K you need 4x as many (288) RT cores just to get by on BF5's reflections alone. If you want more detail in those reflections, that number climbs even faster: 2 spp means 576 RT cores just to improve reflection detail, and the improvement will hardly be noticeable. They'll need to increase the sample rate considerably to get noticeable detail.
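The RT-core arithmetic above can be sketched as a simple linear-scaling estimate, assuming the ray budget grows with pixel count times samples per pixel. The 72-core / 1 spp / 1080p baseline is the poster's premise, not a measured figure:

```python
# Baseline taken from the post above: 72 RT cores managing 1 sample
# per pixel (spp) at 1080p.
BASE_CORES = 72
BASE_PIXELS = 1920 * 1080
BASE_SPP = 1

def rt_cores_needed(width, height, spp):
    """Estimate RT cores needed for the same per-ray throughput,
    assuming the ray budget scales linearly with pixels * spp."""
    scale = (width * height * spp) / (BASE_PIXELS * BASE_SPP)
    return round(BASE_CORES * scale)

print(rt_cores_needed(3840, 2160, 1))  # 4K, 1 spp -> 288
print(rt_cores_needed(3840, 2160, 2))  # 4K, 2 spp -> 576
```

This is only a linear extrapolation; real scaling also depends on scene content, denoising, and how much of the frame is ray traced.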

We have to see Tomb Raider and Metro to see how those effects take a toll on performance, but if TechSpot is right these cards are way too underpowered.
 
Is it entirely naive of me to think this might indeed be an "early PhysX" type of situation, in that the only solution that makes sense is a secondary card? NVLink should allow this to be tacked on with minimal latency, and it might not even need its own VRAM; you could then separate the (obviously die-space- and power-consuming) RT cores onto a separate die, with its own power delivery and cooling. Seems like a good solution, and it ought to make the non-RT cards cheaper too.
 
Is it entirely naive of me to think this might indeed be an "early PhysX" type of situation, in that the only solution that makes sense is a secondary card?

which begs the question.. do people really want ray tracing or do they want cheaper without it..

trog
 
A card dedicated to RTX might be a good idea. I could see a lot of people buying a dedicated RTX card to add visuals, that is, if the whole die needs to be used to offer decent performance at 4K. Either way, I'm sure they have a plan moving forward. I'll be missing out on this year's RTX effects; I shall look forward to the next generation.
There's no point trying to pick fault with each other's logic. If you had the spare cash and love technology, not upgrading would have been very hard.

I'm still enjoying using the Titan V btw, and BF V gives a smooth experience at 4K, so I'm happy.
 
Color me old-fashioned, but I can't help but think RT in games isn't good for anything but pretty screenshots.

What happened to the gamers who screamed "gameplay over graphics"? Are we going the way of the Dodo?
Taking a huge hit in gameplay fluidity for some pretty reflections that may or may not be seen during action is silly, IMHO.

People with shiny stuff are gonna look down their noses at people without, though. But don't stop to admire the scenery...
..you just might get fragged by some guy without shadows turned on.
 
Under-performing ray tracing is Nvidia's version of AMD's ROP problem with Vega. The root cause, common to both issues, is scaling, as called out by Vayra86. All these cards are at the edge of allowable power consumption (and thus were heavily locked down). 7nm should alleviate that somewhat, if complexity in die size doesn't stop it first, but the shrink itself presents a new issue of heat density (an old one in terms of Vega needing water cooling).
 
Color me old-fashioned, but I can't help but think RT in games isn't good for anything but pretty screenshots.
Well, better graphics can improve immersion. Not likely in a fast-paced shooter, but there are plenty of games where more realistic graphics would directly improve the player's perception of gameplay through improved immersion. I'm betting Metro is going to make quite good use of this.
 
Under-performing ray tracing is Nvidia's version of AMD's ROP problem with Vega.

Well, there are whispers on "the internet" that power consumption is actually lower with RTX on, which would mean the CUDA shaders are underutilized on rasterization tasks while the GPU uses the RT cores. So could you @W1zzard check the RTX cards' power numbers with RTX on vs RTX off?
 
Well, there are whispers on "the internet" that power consumption is actually lower with RTX on.
Not only power, but also clock speeds. Given that GeForce GPU reviews very often focus on clock speed measurements due to GPU Boost, it really shouldn't be too hard to bring that to the table here too. Are the CUDA cores idle but running at a similar boost clock as without RTX? Clocked low, but still not fully utilized? Power throttling? A combination of all three over time?
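A minimal sketch of how one could gather those numbers with `nvidia-smi` while a benchmark runs. The query fields (`power.draw`, `clocks.gr`) and format flags are standard nvidia-smi options; the once-per-second sampling loop and the demo values are illustrative assumptions, not anything measured in this thread:

```python
import subprocess

# One sample of board power (W) and graphics clock (MHz), CSV output
# without header or units so it is trivial to parse.
QUERY = ["nvidia-smi",
         "--query-gpu=power.draw,clocks.gr",
         "--format=csv,noheader,nounits"]

def parse_samples(lines):
    """Parse 'power, clock' CSV lines into (avg_watts, avg_mhz)."""
    watts, mhz = [], []
    for line in lines:
        p, c = line.split(",")
        watts.append(float(p))
        mhz.append(float(c))
    n = len(watts)
    return sum(watts) / n, sum(mhz) / n

def sample_once():
    """Run nvidia-smi once; requires an NVIDIA GPU and driver."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    return out.stdout.strip().splitlines()

# Demo with canned output in the shape nvidia-smi prints with nounits:
demo = ["215.5, 1750", "198.5, 1710"]
print(parse_samples(demo))  # -> (207.0, 1730.0)
```

Running the same sampling during an RTX-on and an RTX-off pass of the same scene and comparing the averages would answer the power and clock questions raised above.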
 
This is an absolute nightmare!! :( :(
I'm hoping it's all a mistake and those extra RTX cores that I paid for (double what the card is worth) are somehow not turned on. I expected a performance gain; now it's come to light that these new features cause a performance loss.
COMPLETELY UNACCEPTABLE!
No wonder they didn't show real game performance at launch.
I can't believe Nvidia did this to us.
 
This is an absolute nightmare!! :( :(
Wait, what? It's been clear from long before cards were available that RTX would entail a performance loss. Some of the first commentary on Nvidia's promo material - across plenty of web sites - was "this is choppy, clearly below 60fps". Did you not look up any information whatsoever before buying?

I suppose there is a "performance gain" compared to trying real-time ray tracing without RT cores (which would likely get you <0.01FPS), but ... yeah, nobody's trying that.
 