# Battlefield V with GeForce RTX DirectX Raytracing



## W1zzard (Nov 14, 2018)

Today, Battlefield V enabled support for DirectX Raytracing with the latest patch. We tested this update using the RTX 2080 Ti, RTX 2080, and RTX 2070, with shocking results. Even though only reflections are raytraced, the performance hit for DXR is more than 50%.

*Show full review*


----------



## TheGuruStud (Nov 14, 2018)

Totally worth wasting all that die space! If only people were warned XD


----------



## Aldain (Nov 14, 2018)

As expected, utter shat..


----------



## unikin (Nov 14, 2018)

OMFG! This is even worse than I thought. RIP ray tracing, see you resurrected in a decade.


----------



## Vayra86 (Nov 14, 2018)

I must say, if you discount the performance hit, it does look nice, and I am positively surprised by the resolution/quality of these reflections. Unfortunately these reflections are completely useless in a fast-paced shooter; ain't nobody got time for that. Reflections have always been super costly in terms of performance, even the rasterized implementations. Not as costly as this though, but OK.

And as pointed out in the conclusion, this is only reflective surfaces, which isn't all that interesting. Dynamic lighting is where it's at for realism that will truly be different.

In any case, high production quality is still king and DX11 RTX OFF has that in spades. It's definitely too early for this.



But then, I get reminded of this, 1999 called and wants its glory back...

Yep. Now consider this ran at 300+ FPS on ancient systems, too.









TheGuruStud said:


> Totally worth wasting all that die space! If only people were warned XD



Hey, I tried my best...


----------



## Dante Uchiha (Nov 14, 2018)

CPU usage changed with RTX ON?


----------



## R0H1T (Nov 14, 2018)

You see, this is why RTX never launched alongside RT RT in games. Now, if we ignore the performance hit, which is really hard to ignore, there's still some way to go before RT RT becomes mainstream. Which is to say that an overhyped feature fell flat on its face; it did, however, alter the price of GPUs, quite possibly forever.


----------



## lexluthermiester (Nov 14, 2018)

Vayra86 said:


> I must say, if you discount the performance hit, it does look nice.


That's what I was thinking. I think it looks great. 


Vayra86 said:


> Its definitely too early for this.


Gonna have to disagree. I think this is just the right time. This game, in spite of its problems, looks beautiful with RTRT enabled.


One thing that should be noted is that W1zzard is known for cranking up the AA. Turn AA down or off and the FPS will go up. This almost tempts me to make an Origin account.. Almost..



R0H1T said:


> Which is to say that an overhyped feature fell flat on its face


Did you actually *read* the review?


----------



## Aldain (Nov 14, 2018)

lexluthermiester said:


> That's what I was thinking. I think it looks great.
> 
> Gonna have to disagree. I think this is just the right time. This game, in spite of it's problems, looks beautiful with RTRT enabled.
> 
> ...



stop F meat-shielding... give it a rest.. it is shat


----------



## Vayra86 (Nov 14, 2018)

lexluthermiester said:


> Gonna have to disagree. I think this is just the right time. This game, in spite of it's problems, looks beautiful with RTRT enabled.



I know, you're an optimist; I'm a realist. The game also looks beautiful without it.

AA or no AA, just look at the 1080p results: it needs AA to look OK. Or are you saying 'let's turn off AA at 1080p so I can look at a glass pane'?


----------



## unikin (Nov 14, 2018)

If I paid $1,300 for a GPU, I'd expect flawless RTX gaming at 4K, not a slideshow. Maybe the RTX 4080 Ti can deliver.


----------



## xkm1948 (Nov 14, 2018)

Haters gonna hate. Just like when people discounted hardware T&L, or pixel/vertex shading, etc. In a few years they will all shut up.

TBH, first-gen RTX was never meant to make RT RT playable at 4K 60 fps. The Turing design is hybrid in nature. Hardware needs to wait for software to catch up. Each following RTX generation will be more and more capable in RT RT until it becomes mainstream.


----------



## mouacyk (Nov 14, 2018)

No, you cannot have your $650 back.


----------



## Dammeron (Nov 14, 2018)

Check out the last two screenshots shown on page 2: how is it possible that you can see the reflection of a building that is behind the train car, perpendicular to its windows?

Old Unreal on Glide/D3D had better reflections, and that was 1998!


----------



## PerfectWave (Nov 14, 2018)

Why does Vega 64 stay ahead of the 2070 at Guru3D, while here it is behind?


----------



## unikin (Nov 14, 2018)

xkm1948 said:


> Haters gonna hate. Just like when people discounted hardware T&L, or pixel/vertex shading, etc. In a few years they will all shut up.
> 
> TBH, first-gen RTX was never meant to make RT RT playable at 4K 60 fps. The Turing design is hybrid in nature. Hardware needs to wait for software to catch up. Each following RTX generation will be more and more capable in RT RT until it becomes mainstream.



I have no quarrel with the RTX 2080 Ti's performance, but selling it for $500 more on the strength of the RTX gimmick is a BIG NO NO. If ray tracing worked, I'd say it's expensive but OK, you get a next-gen experience out of it. As it is, you pay for a Porsche and get a Ford. Thanks but no thanks, I'm not willing to be Nvidia's milk cow and get nothing.


----------



## INSTG8R (Nov 14, 2018)

xkm1948 said:


> Haters gonna hate. Just like when people discounted hardware T&L, or pixel/vertex shading, etc. In a few years they will all shut up.
> 
> TBH, first-gen RTX was never meant to make RT RT playable at 4K 60 fps. The Turing design is hybrid in nature. Hardware needs to wait for software to catch up. Each following RTX generation will be more and more capable in RT RT until it becomes mainstream.


Again, this is a new standard DX12 feature: this IS the software. It's the hardware that needs to catch up, just like with T&L and pixel/vertex shading etc. It's always been the hardware that needs to be improved/upgraded to use the software/features, never the other way around. Why everyone seems to think the opposite is perplexing.


----------



## R0H1T (Nov 14, 2018)

xkm1948 said:


> *Haters gonna hate*. Just like when people discounted hardware T&L, or pixel/vertex shading, etc. In a few years they will all shut up.
> 
> TBH, first-gen RTX was never meant to make RT RT playable at 4K 60 fps. The Turing design is hybrid in nature. Hardware needs to wait for software to catch up. Each following RTX generation will be more and more capable in RT RT *until it becomes mainstream*.


Yes, everyone's a hater.

We've got 2 or 3 full node shrinks after this, probably only 2 if you're AMD. How long do you think it'll take for RT RT to materialize into a less computationally intensive dream? If it's 5 years, with just a node shrink left, then I feel it's too late. RT RT also needs some other major (software) breakthrough for it to become a gaming reality.

By the time RT RT becomes mainstream it could well be that GPUs morph into something else, perhaps something more than what they offer today.


----------



## Robcostyle (Nov 14, 2018)

Well, enough's been said already.

I wonder what excuse Mr. Jensen will find for all those ridiculous prices. Especially for the 2070: the most useless and pathetic card for its price in history!
P.S. I suspect they'll hurry up some kind of patch, in a panic, just to make the 2070 sustain 60 FPS at least at 1080p, at least with DXR low, and it will look even worse than before. That's going to be funny to watch.


----------



## XiGMAKiD (Nov 14, 2018)

The Ultra DXR setting of this game is weird: on some card/resolution combos it's faster than High, and on others it's faster than Medium. Why not ditch Medium and High altogether if the difference is negligible (~1-3 fps)? Also, where's the screenshot with DXR off, so we can see how much visual fidelity we lose by turning it off?


----------



## SIGSEGV (Nov 14, 2018)

After reading this excellent review, I now get why AMD is in no hurry to catch up with this, sorry to say, rather problematic feature.
https://www.techpowerup.com/249465/...eventually-be-an-answer-to-directx-raytracing

great


----------



## qubit (Nov 14, 2018)

Well, the $1200 2080 Ti can at least clear 65fps at 1080p, so the game will still run nice and smooth for a 60Hz vsync lock for the most part.

Driver and game optimisations might improve on this somewhat, but I still think it's worth waiting for the next gen card if RTX is your primary feature. I don't think $1200 for this performance is quite good enough though, hence the wait recommendation.


----------



## Vayra86 (Nov 14, 2018)

R0H1T said:


> Yes, everyone's a hater.
> 
> We've got 2 or 3 full node shrinks after this, probably only 2 if you're AMD. How long do you think it'll take for RT RT to materialize into a less computationally intensive dream? If it's 5 years, with just a node shrink left, then I feel it's too late. RT RT also needs some other major (software) breakthrough for it to become a gaming reality.
> 
> By the time RT RT becomes mainstream it could well be that GPUs morph into something else, perhaps something more than what they offer today.



There is only one way out for dedicated RT subsystems, and that is chiplet-based designs, which are already coming out or planned (Zen, Navi), with all the latency concerns that brings. I don't see it going much further as dedicated on-die hardware alongside the GPU: the 2080 Ti is already massive and still falls short, and we won't accept absolute performance stopping right there either.


----------



## Deleted member 158293 (Nov 14, 2018)

This is more than just a swing and a miss. This is strike three. Hardware and software are nowhere near production-ready.

I will revisit RT in 2 or 3 product generations. Although... AMD's chiplet design will probably be a near-perfect fit for this, so we may have a useful RT solution sooner.


----------



## ZeroFM (Nov 14, 2018)

Are all the cards reference models? The NVIDIA 20-series cards are factory overclocked by NVIDIA (not a fair comparison vs. older-generation cards).


----------



## lexluthermiester (Nov 14, 2018)

Vayra86 said:


> Game also looks beautiful without it.


Not quite as beautiful.


Vayra86 said:


> Or are you saying 'let's turn off AA on 1080p so I can look at a glass pane'


No, I'm saying let's turn it off because it's not needed. But I also game at 1440p, which really doesn't need AA. Pixel density is just too high to justify the need for it.

I haven't used AA in games in almost a decade.


----------



## Gasaraki (Nov 14, 2018)

"DirectX 12, a rendering mode that experiences some stuttering in Battlefield 5, something that was present even in BF1, and that the developer has been unable to completely fix." 

Why bother DICE? If you can't do it properly don't do it at all.


----------



## Steevo (Nov 14, 2018)

Uhh oh, those reflections are wrong, very wrong. Perhaps the rays are calculated before the scene is fully rendered, to mask the latency.

Perhaps W1zz could record some video that we could look at, both RTX on and off, to see if other areas are affected.


Dammeron said:


> Check out the last two screenshots shown on page 2: how is it possible that you can see the reflection of a building that is behind the train car, perpendicular to its windows?
> 
> Old Unreal on Glide/D3D had better reflections, and that was 1998!


----------



## cellar door (Nov 14, 2018)

@W1zzard - what about power consumption? What happens to the clocks when you all of a sudden engage the RT cores on the card?


----------



## z1tu (Nov 14, 2018)

Dammeron said:


> Check out the last two screenshots shown on page 2: how is it possible that you can see the reflection of a building that is behind the train car, perpendicular to its windows?
> 
> Old Unreal on Glide/D3D had better reflections, and that was 1998!



I think they're trying to reflect the other building, but it still seems kind of wrong.


----------



## Robcostyle (Nov 14, 2018)

Wait. For real, those RT effects should be rendered by dedicated cores. So why the frame drop? It looks like the whole chip is at fault.
I mean, increased latency, frametime spikes, and freezes because the RT cores lag way behind the CUDA cores - yeah.
Seems like those RT cores are integrated into the whole structure even more than I thought.

IMHO, anyway, I've seen enough tests of Nvidia RT from other sources - the results are even worse - and I'm not buying it. The whole lineup of cards is just rubbish; even the 2080 Ti doesn't fully serve my needs. It is still a decent 60 fps at 1440p, but hey, I have a 144 Hz screen, and this card? For $1,500? ROFL


----------



## rtwjunkie (Nov 14, 2018)

lexluthermiester said:


> Not quite as beautiful.
> 
> No, I'm saying let's turn it off because it's not needed. But I also game at 1440p which really doesn't need AA. Pixel density is just to high to justify the need for it.
> 
> I haven't used AA in games in almost a decade.


Except you really seem to have missed the dismal FPS at 1440p with RTX on. This necessitates playing at 1080p, where AA is actually needed. So there you go, you would have to use it for the first time in years.


----------



## EarthDog (Nov 14, 2018)

@W1zzard 

Did I miss how exactly this was benchmarked (process/scene) in this performance review? I asked a couple of days ago in the other thread, but didn't receive an answer there either.

Any information would be helpful. You may also want to consider listing this on all performance reviews of games that do not have an integrated benchmark. Finding the settings and methods here is incredibly difficult (for me - anyone else?).


----------



## 荷兰大母猪 (Nov 14, 2018)

I currently have RTX 2080 Ti SLI, but I found that when I enable DX12, this game won't support SLI. Is there any chance that this game will support SLI under DX12 in the future?


----------



## unikin (Nov 14, 2018)

We need GPUs that can deliver much higher framerates at 4K if we want to see progress in VR. Eye tracking and foveated rendering will take a few more years to implement, so we need NVIDIA and AMD to fill the gap in between by increasing the rasterization performance of GPUs. Moving to 7 nm dies and doubling stream processors/CUDA cores would do the trick; we would get 50-60% more performance compared to today's GPUs. Combined with good reprojection, that is probably enough to drive 4K-per-eye resolution at comfortable refresh rates, maybe even higher. I'd gladly pay $1,300 for such a GPU.


----------



## INSTG8R (Nov 14, 2018)

荷兰大母猪 said:


> I currently have RTX 2080 Ti SLI, but I found that when I enable DX12, this game won't support SLI. Is there any chance that this game will support SLI under DX12 in the future?


It's apparently coming, but I only know that as rumour. Apparently you can use the BF1 profile.


----------



## Prince Valiant (Nov 14, 2018)

qubit said:


> Well, the $1200 2080 Ti can at least clear 65fps at 1080p, so the game will still run nice and smooth for a 60Hz vsync lock for the most part.
> 
> Driver and game optimisations might improve on this somewhat, but I still think it's worth waiting for the next gen card if RTX is your primary feature. I don't think $1200 for this performance is quite good enough though, hence the wait recommendation.


Doubtful it'll make enough of a difference to push it to acceptable performance. It looks alright on Ultra if you ignore the errors (noise?), but the performance hit is nuts.


----------



## MAXLD (Nov 14, 2018)

Ironic: over the years they spent countless hours making game graphics/details look "less perfect" / "more realistic" with the use (and obnoxious abuse) of things like:
- Blur
- Lens flare
- Chromatic aberration
- Depth of field
- all kinds of fog
etc...
Now, behold, RTX / RT: the cards/technology that make the typical C.S.I. TV show's reflection "enhancements" look like stone-age stuff. Everything strangely reflects perfectly like a mirror (even in the middle of a warzone).
"And it could be yours, for just 59.99% less performance, call now".

Tbh, all this is not just something to make these RTX 2000 cards look appealing... it's more something to make the 2000 series look like absolute crap when the 3000 arrives and kicks that shameful performance to the curb.
"- The brand new RTX 3000 series, now "up to" double the RT performance! (aka, actually only a 40-50% fps penalty instead of 50-60%)
- Oooooh, sht, pre-order!"

Yeah, sure. The tech is not ready to be viable on current hardware, and considering AMD can't put up a fight in the top tier for the time being, they made another "2 in 1" strategy move (kind of a "praise now to slam later"):
- 1st: try to sell the 2000 cards, marketing RTX as the revolutionary bandwagon you have to be on right now, telling you you're missing out if you don't buy one;
- 2nd: make you look like a tool when they come up with the new 3000 that inevitably improves RT performance to a certain degree... one that tries to make you feel like it's enough to justify the swap.
In the end, the performance hit is still nowhere near worth the actual visual benefits, but hey... new card... look at it, it's better... gotta buy that now.

I'm all for new tech advancements, obviously, and RT will be the standard one day, but there's still a long way to go. nVidia just wanted to brag about it / cash in (very) early. Which is not unexpected, though.


----------



## TheGuruStud (Nov 14, 2018)

Dammeron said:


> Check out the last two screenshots shown on page 2: how is it possible that you can see the reflection of a building that is behind the train car, perpendicular to its windows?
> 
> Old Unreal on Glide/D3D had better reflections, and that was 1998!



OpenGL, especially. But then microlosers killed it. DirectX looked like you smeared shit on a window and tried looking through it (check out the vids on YouTube; I remember it vividly from playing).


----------



## mouacyk (Nov 14, 2018)

Dammeron said:


> Check out the last two screenshots shown on page 2: how is it possible that you can see the reflection of a building that is behind the train car, perpendicular to its windows?
> 
> Old Unreal on Glide/D3D had better reflections, and that was 1998!


Because they weren't doing tricks in 1998 -- it was a completely different render pass, once the view matrix was transformed to align with the reflection.

And in 1996, Duke actually had a clone of himself and the entire bathroom behind the "mirror". The clone moved with a slight lag.
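For illustration, the 1998-style mirror described above boils down to a Householder reflection applied to the camera before a second full render pass. A minimal sketch of the math (the `reflection_matrix` and `apply` helpers are made up for this example, not any engine's actual API):

```python
def reflection_matrix(n, d):
    """4x4 row-major matrix reflecting points across the plane n.x + d = 0 (n unit length)."""
    nx, ny, nz = n
    return [
        [1 - 2*nx*nx,   -2*nx*ny,   -2*nx*nz, -2*d*nx],
        [  -2*ny*nx, 1 - 2*ny*ny,   -2*ny*nz, -2*d*ny],
        [  -2*nz*nx,   -2*nz*ny, 1 - 2*nz*nz, -2*d*nz],
        [         0,          0,          0,        1],
    ]

def apply(M, v):
    """Multiply a 4x4 matrix by a homogeneous point [x, y, z, 1]."""
    return [sum(M[r][c] * v[c] for c in range(4)) for r in range(4)]

# Mirror lying in the z = 0 plane: reflect the camera position and
# render the whole scene again from there -- the 1998 way.
M = reflection_matrix((0.0, 0.0, 1.0), 0.0)
print(apply(M, [1.0, 2.0, 5.0, 1.0]))  # [1.0, 2.0, -5.0, 1.0]
```

That second full scene pass per mirror is exactly why mirrors back then were rare, flat, and small.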


----------



## John Naylor (Nov 14, 2018)

It would seem the comments pretty much line up with whatever preconceived notions existed before it came out. One could easily have said the same about 4K, as even a 1080 Ti struggles to maintain fps that justify having a $2,500 144 Hz IPS screen .... just not there yet. OTOH, it's a very good time with respect to this.

I just picked Witcher 3 as an example; I'm sure some other games will do worse and some will do better. But Vega 64 is only 48% as fast as the 2080 Ti. So... in other words... the best nVidia has to offer ***with everything RTX turned on*** is going to be about as fast as the best that AMD can offer... so from that standpoint, it's genius. Sure, the devs have a long way to go in implementing their game code, and nVidia has a long way to go in tweaking their drivers, but as with any "new thang" this will occur over time. If there's no card to display it, then there's little incentive to code it into the game and no progress will be made.

Of course there's the price issue ... the 2080s are now at the same $700 price that top-tier cards have averaged since the year 2000. We know that the stock of 10xx cards is still significant, so that price premium over the typical xx80 will remain at least till that is gone and production lines for 20xx cards can begin to meet demand. We see this every new generation, albeit to a lesser extent, because of unique conditions now (zero competition in the top 4 tiers, an overabundance of last-gen cards in the pipeline, and the usual low initial yields). We also have the fact that the Ti usually doesn't arrive until 6 months or more after the xx80; all nVidia is doing here is asking, why don't we get some cash for these cards that pass while we are tweaking the line to improve yields? If you are too impatient to wait until prices stabilize, there's only one person to blame.

I won't speak to what makes sense today, for the reasons above ... but I do expect that prices will drop close to the "usual" level, adjusted for inflation, and yes, it does have extra capability. I will consider the 2080s a recommended buy when they hit $650 / $680-$690 for the AIBs. The Ti is going to have to get to $750 / $780-$799 before I can recommend it... which I don't expect will happen till May.

So given that the 2080 Ti is more than twice as fast as anything from AMD, other than price, what's there to whine about **if you turn on RTX and get the same performance as a Vega 64?** If you have a game where your system will be below 60 fps, leave it off; you're still twice as fast as a Vega 64. If you have a game at 140 fps, turn it ON and enjoy a better visual experience at the same speed as a Vega 64. Sure, the 2080 Ti is ridiculously priced ... why? .... because they can. People are buying them up faster than they can make them. The 2080 has an MSRP of $700; normally we see AIB cards for $20-$40 more, but as long as they sell faster than they can be built, vendors will of course take what they can get.

My B-I-L has a food truck he brings to county fairs and such. Stocking up, he fills all his cabinets and refrigeration with food, selling his sammies for $5 each ... he runs outta food by 1 pm. Next time he tries $6 and goes home at 3 pm. Next time he tries $7 and he runs out 15 minutes before closing time .... So he charges $7 and will continue to do so until he finds food left over at the end of the day. Don't blame nVidia for what they are required to do by law... maximize the return to stockholders; blame the people who are causing new stock to be sold out within 48 hours of arrival. When yields improve, and the "I gotta be 1st on my block to have one" / "money doesn't matter" crowd's demand is filled, prices will drop .... as they always have.

The whining here is the same as we saw with PhysX ... "not all games support it" ... so what? I don't turn the AC on in my car most of the time, but it's damn nice on those hot summer days when it's needed. Who walks into a car dealership and says "So, you're saying that the car I want, with AC, is the exact same price as the one without it.... is that right? Well that's stoopid, I only use the AC maybe 120 days a year, or 33% of the time, so why would I want a feature I can't use 100% of the time?"


----------



## Easy Rhino (Nov 14, 2018)

This is amazing tech. Haters gonna hate.


----------



## B-Real (Nov 14, 2018)

qubit said:


> Well, the $1200 2080 Ti can at least clear 65fps at 1080p, so the game will still run nice and smooth for a 60Hz vsync lock for the most part.
> 
> Driver and game optimisations might improve on this somewhat, but I still think it's worth waiting for the next gen card if RTX is your primary feature. I don't think $1200 for this performance is quite good enough though, hence the wait recommendation.


"At least" 60 fps at FHD for a $1,200 card in a game that is one of the most optimized ones on the market. What a joke this is.



John Naylor said:


> It would seem the comments pretty much line up with whatever preconceived notions existed before it came out. One could easily have said the same about 4K, as even a 1080 Ti struggles to maintain fps that justify having a $2,500 144 Hz IPS screen .... just not there yet. OTOH, it's a very good time with respect to this.
> 
> I just picked Witcher 3 as an example; I'm sure some other games will do worse and some will do better. But Vega 64 is only 48% as fast as the 2080 Ti. So... in other words... the best nVidia has to offer ***with everything RTX turned on*** is going to be about as fast as the best that AMD can offer... so from that standpoint, it's genius. Sure, the devs have a long way to go in implementing their game code, and nVidia has a long way to go in tweaking their drivers, but as with any "new thang" this will occur over time. If there's no card to display it, then there's little incentive to code it into the game and no progress will be made.
> 
> ...


Nope. This is not genius even from the performance standpoint.



Easy Rhino said:


> This is amazing tech. Haters gonna hate.



Yes, for sure.


----------



## R0H1T (Nov 14, 2018)

John Naylor said:


> It would seem the comments pretty much line up with whatever preconceived notions existed before it came out. One could easily have said the same about 4K, as even a 1080 Ti struggles to maintain fps that justify having a $2,500 144 Hz IPS screen .... just not there yet. OTOH, it's a very good time with respect to this.
> 
> I just picked Witcher 3 as an example; I'm sure some other games will do worse and some will do better. But Vega 64 is only 48% as fast as the 2080 Ti. So... in other words... the best nVidia has to offer ***with everything RTX turned on*** is going to be about as fast as the best that AMD can offer... so from that standpoint, it's genius. Sure, the devs have a long way to go in implementing their game code, and nVidia has a long way to go in tweaking their drivers, but as with any "new thang" this will occur over time. If there's no card to display it, then there's little incentive to code it into the game and no progress will be made.
> 
> ...


I don't think it'll happen, ever. This is why I'd also say the RTX cards may have changed the GPU landscape forever, at least price-wise.


----------



## Easy Rhino (Nov 14, 2018)

people really don't understand how computationally expensive ray tracing is...
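To put rough numbers on that (a back-of-envelope sketch with illustrative figures, not measured data), even a single reflection ray per pixel adds up quickly:

```python
# Back-of-envelope ray budget for 1080p at 60 fps, assuming just
# one reflection ray per pixel (no bounces, shadow rays, or GI).
width, height, fps = 1920, 1080, 60
rays_per_pixel = 1

rays_per_second = width * height * fps * rays_per_pixel
print(f"{rays_per_second / 1e9:.3f} Gigarays/s")  # 0.124 Gigarays/s
```

And every one of those rays still needs acceleration-structure traversal, shading, and denoising on top, which is why the cost dwarfs any single rasterized effect.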


----------



## Franzen4Real (Nov 14, 2018)

@W1zzard What's going on with the 2080 Ti medium/high/ultra fps? They are all within a couple of frames of each other at any resolution. If this were a conventionally rendered game, we would read this as a CPU bottleneck starting at Medium settings.


----------



## INSTG8R (Nov 14, 2018)

Easy Rhino said:


> people really don't understand how computationally expensive ray tracing is...


Of course, but Turing isn't up to it in any meaningful way. Hopefully Turing's successor will be able to double the performance, making it a usable feature while maintaining better overall performance. I'm curious, when they allow SLI, whether they'll allow DXR across both cards as well; then we might see something close to reasonable performance.


----------



## Easy Rhino (Nov 14, 2018)

INSTG8R said:


> Of course, but Turing isn't up to it in any meaningful way. Hopefully Turing's successor will be able to double the performance, making it a usable feature while maintaining better overall performance. I'm curious, when they allow SLI, whether they'll allow DXR across both cards as well; then we might see something close to reasonable performance.



Seems okay to me. If you think it is too expensive given the performance then the card clearly isn't for you.


----------



## INSTG8R (Nov 14, 2018)

Easy Rhino said:


> Seems okay to me. If you think it is too expensive given the performance then the card clearly isn't for you.


Not at all about price; it's about diminishing returns: 2080 Ti performance becomes 1060 performance just to achieve it. It's not for anyone, unless they like dumbing down their resolution and FPS on a flagship card that can easily do 4K.
Would you play at 1024x768 or 720p on your 970 just for this feature?


----------



## mouacyk (Nov 14, 2018)

Easy Rhino said:


> Seems okay to me. If you think it is too expensive given the performance then the card clearly isn't for you.


Right. So where's the card with +50% raster performance over the 1080 Ti that costs $650? 4K@120Hz displays are out and will need every ounce of raster performance they can muster. Call it a stretch, but NVidia delayed the BFGD in order to push RTX (out of nowhere).


----------



## Easy Rhino (Nov 14, 2018)

INSTG8R said:


> Not at all about price; it's about diminishing returns: 2080 Ti performance becomes 1060 performance just to achieve it. It's not for anyone, unless they like dumbing down their resolution and FPS on a flagship card that can easily do 4K.
> Would you play at 1024x768 or 720p on your 970 just for this feature?



What I would do really doesn't matter. Obviously people are willing to pay for it. The card isn't made for everyone. If you don't think it is a good value, then it isn't FOR you. It is pretty amazing, though, that it can play 4K with ray tracing at over 30 FPS.


----------



## stimpy88 (Nov 14, 2018)

Gotta love anyone fooled into buying anything less than the 2080 Ti. A totally unplayable 4K experience for anything lower in the range, and let's not forget, this is just a handful of reflections in windows and some other fairly minor stuff, not a fully or complexly ray-traced game.

Gotta be the biggest waste of money in the premium PC market in many years. And the muppets fell for it, and are up in forums across the tech web chest-beating on behalf of the mighty nGreedia, telling everyone how we don't understand how visionary nGreedia is, that this is just our expectations being too high, and that we are the ones with an understanding problem... yeah right, face it, you got ripped off and sold a lie; or more accurately, nGreedia's lack of concrete performance info, and the very convenient lack of RTX software, let the sheep go and buy those cards like kids in a candy shop, with a deep slurp of green Kool-Aid on the side.

RTX tech is important, and ray tracing is the future, but not for another 2 or 3 generations, or 5 to 10 years in real speak.

But now we know why there were no tech demos released to the public, so we couldn't get a peek under the hood... nGreedia couldn't take the risk of putting off all those thousands of gullible rich kids lacking in patience and intelligence; maybe throwing yet more money at an SLI setup will be their saving grace and let them boast even more in the forums... A win-win for nGreedia, and a loss for anyone sane lamenting the fall of the PC platform to a niche hobby for gullible rich kids.


----------



## kastriot (Nov 14, 2018)

1080Ti  2080Ti


----------



## Jism (Nov 14, 2018)

Nvidia is like Apple.


----------



## trog100 (Nov 14, 2018)

i personally never bought my 2080ti for its f-cking ray tracing abilities.. i never expected them to be of much use anyways.. i bought it because its the best there is and i could afford the price..

but its quite clear most of the negative comments in this thread are motivated simply by out and out hardware envy.. tis a shame its so prevalent..

trog


----------



## INSTG8R (Nov 14, 2018)

Easy Rhino said:


> What I would do really doesn't matter. Obviously people are willing to pay for it. The card isn't made for everyone. If you don't think it is a good value, then it isn't FOR you. It is pretty amazing, though, that it can play 4K with ray tracing at over 30 FPS.


So you'd be happy paying $1,200 for a card that runs as well as a $300 card, for a "feature"? I'm not sure if you're serious or just being obtuse. NOBODY wants to play an FPS at 30 FPS; now you're really reaching. The card is already in a new realm of pricing, and this "feature", now that it's exposed, adds no value at all. I'm sure sales will double with these new benchmarks. Please, you're having a laugh.


----------



## stimpy88 (Nov 14, 2018)

trog100 said:


> i personally never bought my 2080ti for its f-cking ray tracing abilities.. i never expected them to be of much use anyways.. i bought it because its the best there is and i could afford the price..
> 
> but its quite clear most of the negative comments in this thread are motivated simply by out and out hardware envy.. tis a shame its so prevalent..
> 
> trog


Nice boast you got in there bro...  shrug it off, and blame the plebs for not being able to afford it.  Make you look like da MAN!

I wish I could be you...  Ah hang on, then I’d look like a trigger-happy thick idiot with an e-peen prematurely blasting its load on anything shiny kind of guy...  Seriously dude, not a good look.


----------



## INSTG8R (Nov 14, 2018)

trog100 said:


> i personally never bought my 2080ti for its f-cking ray tracing abilities.. i never expected them to be of much use anyways.. i bought it because its the best there is and i could afford the price..
> 
> but its quite clear most of the negative comments in this thread are motivated simply by out and out hardware envy.. tis a shame its so prevalent..
> 
> trog


I’m pretty happy with my Vega. I’m gaming at 1440p/144Hz with Freesync, and HDR when available. Exactly where I want to be.


----------



## Prince Valiant (Nov 14, 2018)

I'm not understanding why people are trying to pass negative views off as jealousy or opinion. Performance is poor and the numbers back it up.


----------



## stimpy88 (Nov 14, 2018)

Prince Valiant said:


> I'm not understanding why people are trying to pass negative views off as jealousy or opinion. Performance is poor and the numbers back it up.


You do realise these people would rather take it out on the plebs than actually stop to think and realise they were taken from behind by nGreedia.  I actually think people like that kind of enjoy it...  and thinking is not these people’s forte.


----------



## Vayra86 (Nov 14, 2018)

Easy Rhino said:


> What I would do really doesn't matter. Obviously people are willing to pay for it. The card isn't made for everyone. If you don't think it is a good value then it isn't FOR you. It is pretty amazing though it can play 4K with ray tracing over 30 FPS.



And huge stuttering, and only on the lowest detail setting, but yes, if you set the bar low enough, it's awesome!

'Obviously people are willing to pay for it'

Do they have a choice if they want to upgrade from a 1080 or better?

Come on man. I remember a time when we had a frame-pacing shitstorm because frame times would suddenly exceed 50ms. Here we are looking at spikes five times that size, and you're praising its performance. Get real.
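To put numbers on that (purely illustrative figures, not data from the review): frame time is just the inverse of FPS, so a >50% FPS drop more than doubles the time every frame takes.

```python
# Toy frame-time arithmetic: what a >50% FPS hit means in milliseconds.
# The FPS numbers below are illustrative placeholders, not review data.

def frame_time_ms(fps: float) -> float:
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

base_fps = 100.0           # hypothetical DXR-off frame rate
dxr_fps = base_fps * 0.45  # what's left after a >50% performance hit

print(frame_time_ms(base_fps))  # 10.0 ms per frame
print(frame_time_ms(dxr_fps))   # ~22.2 ms per frame
```

And that's the average; the stutter spikes people are reporting sit well above those per-frame averages.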



trog100 said:


> i personally never bought my 2080ti for its f-cking ray tracing abilities.. i never expected them to be of much use anyways.. i bought it because its the best there is and i could afford the price..
> 
> but its quite clear most of the negative comments in this thread are motivated simply by out and out hardware envy.. tis a shame its so prevalent..
> 
> trog



Don't let that 2080ti get to your head. It's just a GPU that will be pretty much obsolete by next gen, don't worry. I'll spend that 1200 on nicer things, no problem at all.

'Hardware envy' lol... I literally have no desire whatsoever to buy RTX. I could go buy 2080ti SLI without issues any time of the month, but I'm not blinded by the idiocy of having to have the fastest gear at every point in time. These GPUs are nothing to get envious about either; they are inefficient updates to Pascal. There is no beauty here, just a blunt weapon swinging at reflections in a crude way on a questionable process.



R0H1T said:


> I don't think it'll happen, ever. This is why I'd also say that the RTX cards may have changed the GPU landscape forever, at least price wise.



I think that is accurate. If the metric remains perf/dollar based on FPS, then it will surely have changed. The die has to be bigger; there is no way around it.

The real question is how far they can optimize it, and how much the market will take. The pricing problems people ran into during the mining craze are telling: there isn't much left to be stretched in terms of price. People resorted to 1060s for the most part, while close to Pascal's launch everyone jumped on 1070s.


----------



## Voluman (Nov 14, 2018)

Thanks for reviewing these; now let's see how another title goes.
Well, it's not so bad for a start, but those expectations and that hype train... where can I join the NV marketing team?


----------



## mouacyk (Nov 14, 2018)

trog100 said:


> i personally never bought my 2080ti for its f-cking ray tracing abilities.. i never expected them to be of much use anyways.. i bought it because its the best there is and i could afford the price..
> 
> but its quite clear most of the negative comments in this thread are motivated simply by out and out hardware envy.. tis a shame its so prevalent..
> 
> trog


Nothing against you.  Even you could appreciate it if NVidia sold another SKU, or just plain left out the RTX stuff and actually improved raster performance by 50% instead of 25% at most.  Tough luck -- they happen to have a special market position right now.  But we can certainly express our discontent with their sleight of hand, right?


----------



## stimpy88 (Nov 14, 2018)

mouacyk said:


> Nothing against you.  Even you could appreciate it if NVidia sold another SKU, or just plain left out the RTX stuff and actually improved raster performance by 50% instead of 25% at most.  Tough luck -- they happen to have a special market position right now.  But we can certainly express our discontent with their sleight of hand, right?


Don’t expect an adult conversation with someone like that.  You will never win, and he has done a fantastic job of proving just how capable he is at thinking.


----------



## iO (Nov 14, 2018)

Wow, that's bad. Now imagine the performance if some dev tries to implement more RT stuff like shadows, AO, or global illumination, and not just shiny reflections...


----------



## Robcostyle (Nov 14, 2018)

Easy Rhino said:


> What I would do really doesn't matter. Obviously people are willing to pay for it. The card isn't made for everyone. If you don't think it is a good value then it isn't FOR you. It is pretty amazing though it can play 4K with ray tracing over 30 FPS.



That sounds like low-grade trolling. Not a good look coming from someone with a *Staff member* label.


----------



## unikin (Nov 14, 2018)

Did you see the Turing Quadro presentation? "Buy as many GPUs as you can afford to save more" came from Jensen's mouth numerous times during the presentation.  He even urged the public to repeat it. I'd even understand this behavior if he were addressing the gaming market, but he was addressing pros. Some people just walked out, and rightfully so. WTF is wrong with NVidia? Have they completely lost it? Can't wait for Intel and the Chinese to enter the GPU market. Such a ridiculous presentation should not be tolerated by consumers, be it big data centers, designers, or gamers.


----------



## Vayra86 (Nov 14, 2018)

unikin said:


> Did you see Turing Quadro presentation? "Buy as many GPUs you can afford to save more" coming from Jensen's mouth numerus times during presentation.  He even urged public to repeat it. I'd understand if he was addressing gaming market but he was addressing Pros. Some people just walked out and rightfully so. WTF is wrong with NVidia? Have they completely lost it. Can't wait for Intel and Chinese to enter GPU market. Such ridiculous should not be tolerated by consumers be it big data centers, designers or gamers.



Got a link? Or are you referring to GDC keynote?

Ah yes, the famous 10 gigarays speech. Saw that.


----------



## unikin (Nov 14, 2018)

Vayra86 said:


> Got a link? Or are you referring to GDC keynote?



SIGGRAPH 2018 - NVIDIA CEO Jensen Huang - Reinventing Computer Graphics (I can't find his other presentation… it was way, way worse. He didn't repeat it just twice, but 5 or more times).


----------



## Upgrayedd (Nov 14, 2018)

Hmm what about a dedicated tracing card without all the cuda and deep learning? A card strictly for RTX, like the old PhysX cards.


----------



## HTC (Nov 14, 2018)

Upgrayedd said:


> Hmm what about a dedicated tracing card without all the cuda and deep learning? A card strictly for RTX, like the old PhysX cards.



Interesting idea: it would probably make the RT chips way smaller because they wouldn't require the tensor part (if i'm not mistaken), thus the chips could be much cheaper.

It would make nVidia much less money though, because they wouldn't have the "RT excuse" to drive prices high, and most would buy the RT cards but not the "new RT addon card".


----------



## Vya Domus (Nov 14, 2018)

Prince Valiant said:


> I'm not understanding why people are trying to pass negative views off as jealousy or opinion.



I do: it's the easiest way to shut someone down without using much brain power. Of course that'll be the most prevalent argument. Remember kids, when you disagree with someone, just tell them they are haters and are being jealous. 14-year-old-level internet arguing techniques 101.



> but its quite clear most of the negative comments in this thread are motivated simply by out and out hardware envy.. tis a shame its so prevalent..



Textbook example.




Upgrayedd said:


> Hmm what about a dedicated tracing card without all the cuda and deep learning? A card strictly for RTX, like the old PhysX cards.



That would be so backwards. It would genuinely feel like going back three decades, when the only graphics workstations out there had entirely independent boards fulfilling separate roles: one for rasterization, one for texturing, etc.


----------



## trog100 (Nov 14, 2018)

stimpy88 said:


> Nice boast you got in there bro...  shrug it off, and blame the plebs for not being able to afford it.  Make you look like da MAN!
> 
> I wish I could be you...  Ah hang on, then I’d look like a trigger-happy thick idiot with an e-peen prematurely blasting its load on anything shiny kind of guy...  Seriously dude, not a good look.



i dont blame the plebs for not being able to afford it.. i do blame them for knocking it just because they cant afford it.. now if you think this aint happened on a grand scale not lot i can do about that.. 

trog


----------



## Easy Rhino (Nov 14, 2018)

INSTG8R said:


> So you’d be happy paying $1200 for a card that runs as well as a $300 card for a “feature” I’m not sure if your serious or just being obtuse. INOBODY wants to play an FPS at 30FPS, Now you’re really reaching. The card is already in a new realm of pricing this “feature” now that it’s exposed adds no value to it at all. I’m sure sales will double with these new benchmarks. Please you’re having a laugh



Again, what I think doesn't matter. This card clearly is not made for YOU or for others with budgets in mind. Why do you care so much how other people spend their money?


----------



## R0H1T (Nov 14, 2018)

HTC said:


> Interesting idea: would probably make the RT chips way smaller because they wouldn't require the tensor part (if i'm not mistaken), thus the chips could be much cheaper.
> 
> Would make nVidia much less money though because they wouldn't have "RT's excuse" to drive the prices high and most would buy the RT cards but not the "new RT addon card".


Something like that is coming, but still some ways away. Even AMD isn't going "chiplet" GPU anytime soon, though if I'd have to guess I'd say Nvidia, with their extra cash, will deploy something like that first.

AMD’s Navi will be a traditional monolithic GPU, not a multi-chip module

Just to be clear, I'm talking about a dedicated die for RT that would probably be "glued" together with a regular GPU in due course, should the need for such a solution ever arise.


----------



## HTC (Nov 14, 2018)

R0H1T said:


> That part is coming, but still some ways away. Even AMD isn't going "chiplet" GPU anytime soon, though if I'd have to guess I'd say Nvidia, with their extra cash, will deploy something like that first.
> 
> AMD’s Navi will be a traditional monolithic GPU, not a multi-chip module



I didn't mean it in an MCM type of way: i meant it as a full card. Think of it as a 2nd GPU, that has no display options and it's sole purpose is to enable RT with the primary GPU.

Ofc nVidia being nVidia would make sure such a card would only work with nVidia primary GPUs ...


----------



## R0H1T (Nov 14, 2018)

HTC said:


> I didn't mean it in an MCM type of way: i meant it as a full card. Think of it as a 2nd GPU, that has no display options and it's sole purpose is to enable RT with the primary GPU.
> 
> Ofc nVidia being nVidia would make sure such a card would only work with nVidia primary GPUs ...


Such accelerators, if you can call them that, will become more common, yes.

Nvidia's bread & butter is GPUs; they'll make every effort to ensure their solutions are less open than they could be.


----------



## rtwjunkie (Nov 14, 2018)

trog100 said:


> i dont blame the plebs for not being able to afford it.. i do blame them for knocking it just because they cant afford it.


Get off your epeen-ed high horse.  No one is knocking it because they can't afford it.  I bought my 1080Ti after these came out and I am plenty happy, just like @INSTG8R is happy with his purchase.  It's actually possible to have a negative view of something and it not be about envy.  The only ones who talk about envy are the ones that have more money than they know what to do with and like to brag.


----------



## sweet (Nov 14, 2018)

To be honest, if someone purchases a flagship like the 2080ti for gaming, they either play at 4K or on high-refresh-rate monitors. It is safe to say RT is utterly useless for them.

I myself bought some for the tensor cores, and might pull off a BF5 run @1080p on the machine-learning workstation in my lab. Maybe people like me are the target audience for RT.


----------



## caleb (Nov 14, 2018)

W1zzard, could we please get a comparison of the 6-core 9600K vs 8700 vs 2700X in multiplayer?


----------



## INSTG8R (Nov 14, 2018)

Easy Rhino said:


> Again, what I think doesn't matter. This card clearly is not made for YOU or for others with budgets in mind. Why do you care so much how other people spend their money?


Why do you even think this has anything to do with other people's choices, or mine? YOU made it about that. It has to do with NVIDIA hyping up a card priced higher than ever for its tier on the promise of RTRT and not delivering anything close to a worthwhile experience or performance. That's what I care about.

I'll say this one more time, slowly, for you. I'm very happy with my Vega and my 1440p 144Hz Freesync monitor with HDR. All my tech works as advertised, doesn't kill my performance, and was all priced reasonably.

I feel sorry for people like you who are apologists for poor products and try to twist it into being about budgets or my needs. Nvidia likes people like you and will continue to charge premium prices, as people like you will find some way to justify getting screwed.
@trog I'm glad you're happy with your purchase and that it wasn't based on RTRT capability. At least you're not like Rhino, trying to justify your purchase. But trust me, nobody's jealous.


----------



## Easy Rhino (Nov 14, 2018)

INSTG8R said:


> Why is it you even think this has anything to do with others peoples choices or mine ? YOU made it about that. It has to do with NVIDIA hyping up a card priced higher than ever for its tier with the promise RTRT and not delivering anything close to worthwhile experience or performance. That’s what I care about.
> 
> I’ll say this one more time slow for you. I’m very happy with my Vega my 1440 144hz Freesync monitor with HDR. All my tech works as advertised doesn’t kill my performance and was all  priced reasonably.
> 
> ...



I am sorry if you are taking this personally. I am doing my best to demonstrate just the opposite. You don't think it is a worthwhile experience therefore this card is not for YOU. Isn't that okay? Isn't it okay if someone enjoys spending that kind of money for ray tracing? Why are YOU the judge and jury on what a worthwhile experience is?


----------



## INSTG8R (Nov 14, 2018)

Easy Rhino said:


> I am sorry if you are taking this personally. I am doing my best to demonstrate just the opposite. You don't think it is a worthwhile experience therefore this card is not for YOU. Isn't that okay? Isn't it okay if someone enjoys spending that kind of money for ray tracing? Why are YOU the judge and jury on what a worthwhile experience is?


No, it’s not okay; you’re passive-aggressively trolling at this point. I’m not judge and jury, the benchmarks are, and that is what I’m railing against. You’re making it personal and you should really stop.
High price with poor performance on the feature they were trying to sell it on. If I were as accepting of such things as you seem to be, they would continue this trend, which benefits no one but Nvidia, not the user.
What I take personally is you making it about me and not the product.


----------



## unikin (Nov 14, 2018)

Easy Rhino said:


> I am sorry if you are taking this personally. I am doing my best to demonstrate just the opposite. You don't think it is a worthwhile experience therefore this card is not for YOU. Isn't that okay? Isn't it okay if someone enjoys spending that kind of money for ray tracing? Why are YOU the judge and jury on what a worthwhile experience is?



I would be OK if Nvidia had given us a choice by building one classical GPU with double the CUDA cores and no RTX alongside the RTX 2080TI at the same price, and let the market decide which one it likes better. As things stand right now, we're getting nowhere near enough performance to push VR to 4K resolution levels or 4K 90-144Hz gaming, nor usable frame rates at 4K with RTX on. That's why no one is happy. OK, maybe green fanboys who would buy any crap NVidia sells them and say it's great.


----------



## mouacyk (Nov 14, 2018)

This is complete and utter market-leadership shenanigans...


----------



## Vya Domus (Nov 14, 2018)

Easy Rhino said:


> You don't think it is a worthwhile experience therefore this card is not for YOU.



Believe it or not, there are things out there that are objectively not great, regardless of how any one individual feels about them.

It's safe to say that in this case, it's not a great experience, objectively.


----------



## londiste (Nov 14, 2018)

I wish for once we could have a thread where we could discuss technical sides of the thing and would not devolve into meaningless bullshit.



cellar door said:


> @W1zzard - what about the power consumption? What happens to the clocks when you all of a sudden engage the RTX cores on the card.


I can answer that - power consumption is exactly the same. Turing cards are power limited, and BFV runs the card into its power limit with or without DXR effects. Clock speeds are identical, as are temperatures.
It would be logical to assume the RT cores take some performance away from the rest of the GPU, but that is not visible in clocks, power consumption, or temperature.
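A toy way to picture that fixed-budget point (my own sketch, not anything Nvidia has documented, and all wattages made up): if the card always runs pinned at its power limit, any watts the RT units burn are watts the shader cores can no longer turn into frames, even though total draw, clocks, and temps in a monitoring tool look unchanged.

```python
# Toy model of a power-limited GPU: total board draw is pinned at the limit,
# so any work the RT units do comes out of the shader budget.
# All numbers are invented for illustration.

POWER_LIMIT_W = 260.0  # hypothetical board power limit

def shader_budget(rt_load_w: float) -> float:
    """Watts left for rasterization when the RT units draw rt_load_w."""
    return POWER_LIMIT_W - rt_load_w

print(shader_budget(0.0))   # 260.0 W for shaders with DXR off
print(shader_budget(80.0))  # 180.0 W with DXR on -- same total draw either way
```

Under a model like this you'd expect exactly what the measurements show: identical power and temperature, lower frame rates.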


iO said:


> Wow that's bad. Now imagine the performance if some dev tries to implement more RT stuff like shadows , AO or global illumination and not just shiny reflections...


Actually, shadows, AO, and global illumination should each be less of a performance hog than reflections.


----------



## birdie (Nov 14, 2018)

@W1zzard


> It requires you to not only have Windows 10, but also the latest "October Update".



I'm tired of hearing that. DXR is now official with Vulkan 1.1.91, which works on Windows 7.

It's not NVIDIA's fault that most game engines default to D3D - some companies actually have the guts to use Vulkan, like id Software.

Also, quite a lot of modern engines have Vulkan backends - so this *Windows 10 requirement for DXR is plain BS*. For some reason you also repeat this in all your RTX card reviews.


----------



## londiste (Nov 14, 2018)

DXR = *D*irect*X* *R*aytracing
This is as standardized as raytracing can get for now.
Vulkan does have the Nvidia RTX extensions out of beta by now, but I have seen considerable scoffing at that, as these are still Nvidia extensions and not a "full native" Vulkan thing.


----------



## INSTG8R (Nov 14, 2018)

londiste said:


> DXR = *D*irect*X* *R*aytracing
> This is as standardized as raytracing can get for now.
> Vulkan does have the Nvidia RTX extensions out of beta by now but I have seen some (considerable) scoffing towards that as these are still Nvidia extensions and not a "full native" Vulkan thing.


Exactly. This isn’t Nvidia “Special Sauce”; it’s now part of DX12. There's no reason AMD can’t leverage it if they choose to.


----------



## trog100 (Nov 14, 2018)

the thing is some people feel the need to upgrade every so often.. i do and if i dont i lose all interest in whatever i am into at the time..

the current problem is an upgrade costs too much.. for me to upgrade my system i would be talking a couple of grand from where i was.. i dont need to upgrade to play games.. but to maintain my interest as an "enthusiast" i do.. i have other hobbies and the same principle applies.. stop buying new stuff and enthusiast wains..

its just how i am some would call it a progression which once it stops the game stops.. i think the same principle allies to most folks.. and i can also see why over expensive high end stuff creates haters.. their game stops too soon..

trog


----------



## Xzibit (Nov 14, 2018)

Only one RTRT feature is being used in BFV, and it's causing a 50%+ performance hit.



			
Nvidia said:


> Included with this update will be the first release of DXR *ray-traced reflections*


----------



## unikin (Nov 14, 2018)

trog100 said:


> the thing is some people feel the need to upgrade every so often.. i do and if i dont i lose all interest in whatever i am into at the time..
> 
> the current problem is an upgrade costs too much.. for me to upgrade my system i would be talking a couple of grand from where i was.. i dont need to upgrade to play games.. but to maintain my interest as an "enthusiast" i do.. i have other hobbies and the same principle applies.. stop buying new stuff and enthusiast wains..
> 
> ...



It's not just the price, it's the price/performance ratio, which is probably the worst deal in 30 years of GPU history. If the RTX 2080TI could offer 90+ fps at 4-5K resolution, I'd buy it immediately for my VR headset. As it is, it's only marginally faster than the 1080TI (10%) when used with a high-res VR HMD, and costs 70% more. Here lies my disappointment: much more expensive for a very marginal improvement.
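Back-of-the-envelope on that price/performance point, using the rough figures from the post (~$700 launch 1080 Ti, ~$1200 2080 Ti, ~10% faster) rather than measured data:

```python
# Rough perf-per-dollar comparison: ~+10% performance for ~+70% price.
# Prices and the uplift figure are the approximate ones from the post,
# not benchmark results.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Performance (relative to a baseline of 1.0) per dollar spent."""
    return relative_perf / price

old = perf_per_dollar(1.00, 700.0)   # 1080 Ti as the baseline, ~$700
new = perf_per_dollar(1.10, 1200.0)  # 2080 Ti: ~10% faster, ~70% pricier

print(round(new / old, 2))  # 0.64 -> roughly a third worse perf/dollar
```

By this metric the new flagship delivers only about two-thirds of the value per dollar of the old one, which is the regression being complained about.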


----------



## stimpy88 (Nov 14, 2018)

londiste said:


> ...power consumption is exactly the same. Turing cards are power limited and BFV runs the card into power limit with or without DXR effects. Clock speeds are also identical as well as temperatures.
> It would be logical to assume RT cores take some performance away from rest of the GPU but that is not visible in clocks, power consumption or temperature.
> Actually, shadows, AO or global illumination should be less of a performance hogs than reflections.



That’s an interesting point you have there...  I would like to know why this RT core, which is half the die, takes no extra power when active and generates no extra heat, yet cripples the card’s performance while RT effects are being rendered. Odd indeed.

Is this RT core always active and not power-gated, or is it just marketing BS from nGreedia?  Could the real clock speed be severely reduced while the RT cores are doing their thing, to keep power and thermals under control?  You would think, given all of nGreedia’s marketing on this RT core regarding its complexity and sheer speed, that it would result in an AVX kind of situation with power and thermals in the GPU.



Xzibit said:


> Only 1 RT RT features are being used in BFV and its causing a 50%+ performance hit.


We have heard that the Tomb Raider game is affected even more, and that they are having massive issues raising performance enough.  They were apparently looking into more granularity in how they use RT effects to help with this.  Could just be a rumour, but this was in the press not long after the RTX launch.


----------



## trog100 (Nov 14, 2018)

stimpy88 said:


> That’s an interesting point you have there...  I would like to know why this RT core, which is half the die, takes no power when active, and generates no heat, yet cripples the cards performance while RT effects are being rendered?  Odd indeed.
> 
> Is this RT core always active, and not power gated, or is it just marketing BS from nGreedia.  Could the real clock speed be severely reduced while the RT cores are doing their thing to keep power and thermals under control?
> 
> ...



i think it must work that way.. boost power is diverted to the ray tracing chips.. 

trog


----------



## Robcostyle (Nov 14, 2018)

That's just damn terrifying... lurking through the forums at the moment, and so many fanboys are out there defending ngreedia, not listening to anyone. Are they protecting their (barren) purchase? Okay, let the 2080 Ti go with its ~60 FPS at 1080p, but what about the 2080? And the 2070 seems to be just a joke by any measure.


----------



## Mistral (Nov 14, 2018)

So, the 2070 essentially can't manage it at 1080p. Even with a 2080Ti, I don't see many people enabling it in a multiplayer game. nVidia must be more selective about where they push this feature for the moment - it would work much better in a slower-paced single-player game, like an RE for example.

I guess it'll be at least two more generations before this hits mainstream in any meaningful way. And by then, instead of the current quality of RTX, we'll hopefully get something better, like path tracing.


----------



## stimpy88 (Nov 14, 2018)

trog100 said:


> i think it must work that way.. boost power is diverted to the ray tracing chips..
> 
> trog


This is my take on what’s going on in the chip. I could be very wrong, but it has to be managed somehow; if nGreedia is to be believed as to the implementation, size, and computational performance of these RT cores, then it’s a miracle it comes with no power or thermal implications whatsoever.

Maybe this could be part of the performance issues here. If there is headroom in the chip, as well as in its thermal and power solutions, then it’s possible this load balancing could be altered to provide more performance during RT loads, depending on where the performance bottleneck is.

Maybe this load balancing would have to be done on a per-game basis, or through some kind of real-time governor implemented in the graphics driver.  Given the mess that nGreedia’s drivers have been with the RTX series so far, maybe they haven’t implemented it yet, or maybe it’s bugged?

It’s hard to believe that nGreedia would have dropped the ball this badly, given the time they have had to write drivers for these cards, and the fact that they released “game ready” drivers just for this first publicly available RTX title, but it’s possible...


----------



## Robcostyle (Nov 14, 2018)

Mistral said:


> So, the 2700 essentially can't manage it at 1080p. Even with a 2080Ti, I don't see many people enabling it in a multiplayer game. nVidia must be more selective whete they push this feature for the moment - it would work much better in a slower-paced single-player game, like a RE for example.
> 
> I guess it'll be at least two more generations before this hits mainstream in any meaningful way. And by then, hopefully instead of the current quality RTX, we will hopefully get something better, like Path Tracing.



Well, yeah, the problem is that BF's raytracing is only a *part* - try including shadows and overall ambient lighting, and those cards won't even make it at 1080p. Plus, take into account that BFV (and the BF series in general) is fairly well optimized - so an additional penalty is possible in other titles.


----------



## lexluthermiester (Nov 14, 2018)

rtwjunkie said:


> Except you really seem to have missed the dismal FPS on 1440 with RTX on.


I did not. Again, I referenced turning off AA, which would very likely offset the performance hit of jumping to 1440p. I'm hoping for an RTX/RTRT game release on GOG or Steam soon so I can run some benchmarks on my own system.


John Naylor said:


> Could easily have said the same about 4k as even a 1080 Ti struggles to maintain fps that justify having a $2500 144 hz IPS screen .... just not there yet. OTOH, it's a very good time with respect to this.


Exactly!


Easy Rhino said:


> people really don't understand how computationally expensive ray tracing is...


Or how amazing it is to do in real-time what took minutes and sometimes hours per frame just a few short years ago.


Prince Valiant said:


> Performance is poor and the numbers back it up.


Performance is acceptable. Above 30 FPS is the bare minimum for playable; at or above 60 FPS is good, and 1080p is still fine for most PC users. You need some cheese with your whine..


rtwjunkie said:


> Get off your epeen-ed high horse. No one is knocking it because they can't afford it. I bought my 1080Ti after these came out and I am plenty happy, just like @INSTG8R is happy with his purchase. It's actually possible to have a negative view of something and it not be about envy. The only ones who talk about envy are the ones that have more money than they know what to do with and like to brag.


While I think Trog has a good point, it can't apply to everyone bemoaning RTX. Personally, I like my 2080. Even without the RTRT aspect it's a great performer. I am excited to see what can be done with RTRT and what it can bring to the table, gaming wise.



INSTG8R said:


> No it’s not okay you’re passive aggressively trolling at this point.


Gonna have to agree with EasyRhino, you are being judgmental and condescending, effectively trolling everyone who wants or owns an RTX card. It's not cool. Just chill and let people enjoy what they want to enjoy.


----------



## unikin (Nov 14, 2018)

lexluthermiester said:


> Performance is acceptable. Above 30FPS is bare minimum playable. At or above 60FPS is good, which at 1080p is still good for most PC users. You need some cheese with your whine..



Are you serious? 30-40 fps at 1080p is acceptable performance on a $1,300 GPU? Do we live in the same universe?


----------



## lexluthermiester (Nov 14, 2018)

unikin said:


> Do we live in the same Universe?


Clearly you simply don't understand real-time ray-tracing. And yes, 30FPS is acceptable. There are a ton of really great games that have a 30FPS limit built into them. They're still playable and still fun. You are displaying an elitist attitude. Kick that negativity down a notch or two and open your mind to the possibilities.


----------



## TheGuruStud (Nov 14, 2018)

lexluthermiester said:


> Clearly you simply don't understand real-time ray-tracing. And yes, 30FPS is acceptable. There are a ton of really great games that have a 30FPS limit built into them. They're still playable and still fun. You are displaying an elitist attitude. Kick that negativity down a notch or two and open your mind to the possibilities.



Patently false for a large number of people. If I wanted a migraine and to be extremely frustrated, I'd play at 30 fps. Also, you'd need adaptive sync to even see anything with all the tearing and stuttering going on.


----------



## mouacyk (Nov 14, 2018)

stimpy88 said:


> Is this RT core always active, and not power gated,


That is great speculation, and may end up being just how first-gen RT works and why it hogs power.  At the announcement, Jensen said something about having to fuse/integrate these new cores into the existing raster pipeline in order to achieve the speedup marketed in his presentation.  Of course, the alternative would be to use the new cores in a separate (tracing) pass, but then you'd need large, fast buffers to hold the intermediate work.  That adds complexity and latency, much like having RT on an add-on board.

Just think about how NVidia improved Simultaneous Multi-Projection in Pascal to 16 viewports in one pass, over Maxwell's 6, in order to accelerate VR rendering.  Add enough registers to the geometry transform stage and throughput increases via SIMD-like instructions that transform the same scene geometry through multiple view matrices in the same pass.  It is basically pipeline injection, because the entire render pass hasn't completed yet.  While latency is reduced compared to the alternative, it is harder to gate power.


----------



## unikin (Nov 14, 2018)

lexluthermiester said:


> Clearly you simply don't understand real-time ray-tracing. And yes, 30FPS is acceptable. There are a ton of really great games that have a 30FPS limit built into them. They're still playable and still fun. You are displaying an elitist attitude. Kick that negativity down a notch or two and open your mind to the possibilities.



$1,300 is a premium price, so I have every right to expect elite performance from the product.


----------



## INSTG8R (Nov 14, 2018)

Give it up lex, I'm not gonna argue with you too. The numbers speak for themselves, THAT is what I am standing by. If you're happy with 1060 performance for $1,200 then I guess Nvidia has done its job, you're pretty easily satisfied, and you're the type of customer they were counting on. Yes, NOW I'm going after an owner, you. Wanna be a target, fine. RTRT has basically been proven to be not ready for mainstream, as I've said, the numbers prove it. That would be the equivalent of me using my Vega at 720p to get playable FPS, sorry, that's ridiculous. They marketed this card on this feature and it can't deliver without MAJOR compromises, and this is as basic as RT gets. Tomb Raider is having even more trouble with performance, so don't expect this to get any better. The hardware just isn't good enough. Turing 2 will hopefully correct that, or else it's going to remain in the professional space a few years longer.
Only trolling here was Rhino, I've stuck to the facts, not your fantasy of RT for all. We'll get there, but Turing is not going to be the one to make that happen. If we take RTRT out of the mix, Turing is still overpriced but is a very capable card.



lexluthermiester said:


> Clearly you simply don't understand real-time ray-tracing. And yes, 30FPS is acceptable. There are a ton of really great games that have a 30FPS limit built into them. They're still playable and still fun. You are displaying an elitist attitude. Kick that negativity down a notch or two and open your mind to the possibilities.


Oh give me a break lex!! 30FPS in an open-world MP FPS? I clearly do understand it, and it's too taxing for the available hardware. You're displaying a severe lack of sense. NOBODY but a bandwagoner like you would defend such poor performance, trying to peddle 30FPS as acceptable for a game like this. It's not acceptable on my Vega any more than it's acceptable on a 2080. Accuse me of trolling, yet you expect people to swallow that BS? Stockholm Syndrome already?


----------



## mouacyk (Nov 14, 2018)

Turing 2 better come in a 7nm flavor, so they can squeeze more RT cores on there, because the increase is badly needed.  Power requirements may not go up drastically due to the shrink, but heat density will be a major challenge and will almost require water cooling to sustain current clocks.  Taking power and thermal limitations into consideration, Turing 1 is quite compromised, just not sold as such.


----------



## Vya Domus (Nov 14, 2018)

INSTG8R said:


> Oh give me break lex!! 30FPS in an open world MP FPS



Here's the real kicker: people on Xbox One are getting better performance than whoever chooses to enable RTX, *XBOX ONE.*

The PC master race has transcended many boundaries it seems.


----------



## INSTG8R (Nov 14, 2018)

Vya Domus said:


> Here's the real kicker: people on Xbox One are getting better performance than whoever chooses to enable RTX, *XBOX ONE.*
> 
> The PC master race has transcended many boundaries it seems.


Wow now that’s actually sad..


----------



## lexluthermiester (Nov 14, 2018)

unikin said:


> $1.300 is a premium price so I have every right to expect elite performance from the product.


In non-RTRT gaming you get that. Going from the 1080 to the 2080 I got an instant 40-50% boost in performance in my existing library of games. Not everyone is buying the Ti model. I spent much less than $1,000 for my 2080 and offset that cost with the sale of my 1080. Keep things in the proper perspective and you'll see the big picture. 

RTRT is brand new and it will continue to advance and evolve. In the meantime, non-RTRT gaming is getting big boosts in performance.


INSTG8R said:


> Give it up lex I’m not gonna argue with you too.


Ok then, don't.


INSTG8R said:


> the numbers speak for themselves THAT is what I am standing by.


Yes they do..


INSTG8R said:


> If you're happy with 1060 performance for $1,200 then I guess Nvidia has done its job, you're pretty easily satisfied, and you're the type of customer they were counting on. Yes, NOW I'm going after an owner, you.


Thanks for the not-too-subtle insult. What I am happy with is what I mentioned just above: the big boost the 2080 gives to all existing games. I'm also happy to be an early adopter for this run of GPUs because I understand in reasonable detail how RTRT works and what it has to offer the future of gaming.


INSTG8R said:


> RTRT has basically been proven to be not ready for mainstream


Your opinion. Not everyone agrees. The benchmarks in this very review do not support that statement.


----------



## Xzibit (Nov 14, 2018)

PCWorld was doing a first-impressions stream on BF5 RTX, and not every reflective surface is reflecting objects or effects properly.


----------



## lexluthermiester (Nov 14, 2018)

Xzibit said:


> PCWorld was streaming first impressions on BF5 RTX and not every reflective surface is reflecting objects or effects


Was just watching that. And was going to post it. Noticed a few things not being fully rendered. 

What I'm seeing there is a fully playable game experience, with settings on Ultra at mostly above 80FPS.


----------



## Xzibit (Nov 14, 2018)

lexluthermiester said:


> Was just watching that. And was going to post it. Noticed a few things not being fully rendered.
> 
> *What I'm seeing there is a fully playable game experience, with settings on Ultra at mostly above 80FPS*.



He is also avoiding any interaction with players to see the reflections.  Unless you like playing a multiplayer game by yourself, looking at your reflection. Then it's the perfect experience.

The earlier video I posted had much more interaction and the FPS was much worse, and again this is a single RT RTX feature (just reflections).  Who knows how much of a performance hit will be added if any other effects are introduced.


----------



## atomicus (Nov 14, 2018)

I would like to think this will stop these cards from selling, but alas that is probably wishful thinking. Until AMD get their act together, Nvidia are just sat there, arms folded, saying "Well, if you want a top end fast GPU what else you gonna get, eh?! Now give us ya money you muppets!"


----------



## INSTG8R (Nov 14, 2018)

lexluthermiester said:


> Your opinion. Not everyone agrees. The benchmarks in this very review do not support that statement.


Oh lex, please show me where I'm wrong. I know you love this bone and can't put it down. Where are the acceptable performance numbers? The ones that show RTRT in a positive light? No, sorry, 30FPS is not positive. Being forced to play at 1080p to get acceptable FPS is not positive.  This game isn't a walking simulator. There is no way to justify a 58% loss in performance just by turning DXR on. That's the absolute definition of diminishing returns.

This is PhysX 2.0 big shiny bad performance.


----------



## unikin (Nov 14, 2018)

lexluthermiester said:


> In non-RTRT gaming you get that. Going from the 1080 to 2080 I got an instant 40-50% boost in performance in my existing library of games.



You shouldn't be comparing the 1080 vs. the 2080, as NVidia's MSRP for the 1080 was $549 and the 2080's is $799. So the real comparison is 1080 vs. 2070 (5-10% increase in perf) and 1080 Ti vs. 2080 (0-4% increase in perf). The RTX 2080 Ti can be compared with the Titan V (8% faster on average). And btw, 1080 Ti vs. 2080 Ti = +24% at 1440p and +30% at 4K, not a 40-50% boost. So you're paying $600-700 more for a ~27% increase in FPS.
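A quick back-of-the-envelope check makes the point concrete. This is just a sketch using the figures quoted above; the 1080 Ti street price and the ~27% average uplift are the post's numbers, not fresh benchmarks:

```python
# Dollars paid per unit of relative performance, using the thread's figures.
# The $700 street price and the 1.27x uplift are assumptions from the post,
# not independent measurements.
def price_per_perf(price_usd, relative_perf):
    """Cost per unit of relative performance (1.0 = 1080 Ti baseline)."""
    return price_usd / relative_perf

gtx_1080ti = price_per_perf(700, 1.00)   # rough 1080 Ti street price
rtx_2080ti = price_per_perf(1300, 1.27)  # $1,300 card, ~27% faster

print(f"1080 Ti: ${gtx_1080ti:.0f}/perf unit, 2080 Ti: ${rtx_2080ti:.0f}/perf unit")
print(f"Cost per frame up {rtx_2080ti / gtx_1080ti - 1:.0%}")
```

So even granting the thread's own numbers, cost per frame rises roughly 46%, which is the complaint in one line.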


----------



## lexluthermiester (Nov 14, 2018)

Xzibit said:


> He is also avoiding any interaction with players to see the reflections. Unless you like playing a Multi player game by yourself looking at your reflection. Then its the perfect experience.
> 
> The earlier video I posted had much more interaction and the FPS was much worse and again this is a single RT RTX feature (just reflections). Who know how much of performance hit will be added if any other effects are introduced.


He's demonstrating the effects on offer. I think it's cool looking.


----------



## Vya Domus (Nov 14, 2018)

INSTG8R said:


> There is no way to justify a 58% loss in performance just by turning DXR on.



Actually at 1080p, if you disable DXR you gain a whopping 2.5x performance improvement. And mind you, that's in a fairly CPU-bound scenario.
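The "58% loss" from the earlier post and the "2.5x gain" here are close to being the same measurement seen from opposite directions. A two-line sketch of the conversion, using the thread's own figures:

```python
# Converting between "X% FPS loss with DXR on" and "Yx speedup with DXR off".
# The 0.58 and 2.5 figures come from this thread, not new benchmarks.
def loss_to_speedup(loss):
    """A fractional FPS loss L with a feature ON equals a 1/(1-L) speedup OFF."""
    return 1.0 / (1.0 - loss)

def speedup_to_loss(speedup):
    """...and the inverse direction."""
    return 1.0 - 1.0 / speedup

print(f"{loss_to_speedup(0.58):.2f}x")  # a 58% hit -> ~2.38x faster with DXR off
print(f"{speedup_to_loss(2.5):.0%}")    # a 2.5x gain -> a 60% hit the other way
```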


----------



## lexluthermiester (Nov 14, 2018)

unikin said:


> You shouldn't be comparing 1080 vs 2080 as for 1080 NVidia's MSRP was $549 and 2080 is $799.


I'll compare what the heck I want thanks. I owned a 1080 and replaced it with a 2080 in the same machine. That's my comparison basis.


----------



## INSTG8R (Nov 14, 2018)

lexluthermiester said:


> I'll compare what the heck I want thanks. I owned a 1080 and replaced it with a 2080 in the same machine. That's my comparison basis.


I won't deny, A: RTRT is pretty neat, but we're at least a generation away from it being practical.
B: if we take RTRT out of the equation, it's a great card.
Just don't try to pretend this is acceptable in its current form; it's purely proof of concept.


----------



## EarthDog (Nov 14, 2018)

What I've found in testing this game is that the first two SP campaigns are pretty easy as far as RT goes. In fact, the GIGA 2070 I used ran the first two at over 60 FPS using an 8700K @ 4.7 GHz - where we run our benchmarks (67 and 64 FPS respectively). With RT off, it doubled to ~110-155 FPS. It is the last campaign, next to the locked one, that really cripples the cards. The water on the ground that is RT'd, and whatever else... kills it. In that test, this card averaged 29 FPS. With RT off it pulled 48... so there is something up outside of RT in that level that really puts a hurting on GPUs REGARDLESS of RT being enabled in-game.


Outside of that, not jumping in the ball pit with the kids... I'll get sick. It's like a friggin pre-school in this place, LOL!

Anyway, moving on to the 2080...... then the 2080Ti.


EDIT: I sure would love for W1z (or someone who knows) to share which scenes were benchmarked for this and the previous review.


----------



## RealNeil (Nov 14, 2018)

New tech = Bleeding edge. 

I just can't see spending so much cash on something that is still in the process of teething. I'll give it some time to mature a little. (a lot)

The problems with some of these 2080Ti cards piss me off because prices on 1080Ti cards stopped dropping. (not good)


----------



## Shatun_Bear (Nov 14, 2018)

Needs another two generations at least before RT will take off, if ever. So GTX 4080. This performance is sh**e.


----------



## INSTG8R (Nov 14, 2018)

Shatun_Bear said:


> Needs another two generations at least before RT will take off, if ever. So GTX 4080. This performance is sh**e.


Yeah, I'm quite interested to see what AMD plans to do with it. I'd think my Vega's got a lot of idle Compute units that would love to do the math, but I'm sure performance would be just as bad. Also curious whether, when SLI is enabled, it can leverage both cards. That would be the only real viable setup with a decent balance of performance.


----------



## rtwjunkie (Nov 15, 2018)

lexluthermiester said:


> I'll compare what the heck I want thanks. I owned a 1080 and replaced it with a 2080 in the same machine. That's my comparison basis.


Lex is actually right on this. Price doesn’t matter on comparison until you do price vs performance.  A direct comparison between 2080 and 1080 is completely justified and is what should be compared as they occupy the same place within each generation.


----------



## Robcostyle (Nov 15, 2018)

lexluthermiester said:


> effectively trolling everyone who wants or owns an RTX card. It's not cool. Just chill and let people enjoy what they want to enjoy.



Let me just describe why ppl are so pissed off, why you and your rtx friends are wrong, etc. etc...

I don't care how people spend their money. I don't care if they're dining in a fashionable restaurant, buying expensive stuff or whatever. And I'm really glad if they have the possibility to do so - UNTIL it starts touching ME. 
The stupid crowd keeps buying rtx cards at insane prices even more vigorously than ever before -> huan has a clear vision that he can twist the prices to the sky however he wants - we have a stagnant market of GPUs priced like a used vehicle. It's as simple as that. That drives me mad. Your "wish" to pay 1500$ for a gpu, and fostering this price rise, gives ME no choice. Because it's a monopoly. And I'm not even talking about fps/$, shitty PCB quality or dxr effects, no - pure economics.

We have no choice in the top segment, and "the customer" is the only one who could make a change. But they won't.
I still remember, back at the presentation, those idiots crying out when huan said "Hey! 2080 Ti from only 999$!"
Nobody even remembered that 1000$ used to be a Titan-only privilege. That his MSRP was a joke. Nobody even thought about monitoring the ngreedia website, just to see the 1200$ price tag.
Like, huan telling the audience about 999$ while at the same time having the 2080 Ti FE listed for 1200$? Lol, pathetic.


----------



## TheinsanegamerN (Nov 15, 2018)

HTC said:


> I didn't mean it in an MCM type of way: i meant it as a full card. Think of it as a 2nd GPU, that has no display options and it's sole purpose is to enable RT with the primary GPU.
> 
> 
> 
> Ofc nVidia being nVidia would make sure such a card would only work with nVidia primary GPUs ...


It might be a good use for NVLink, since SLI is completely dead at this point. You could also make the RT core significantly larger if it had its own GPU, potentially giving acceptable RTRT performance. 



lexluthermiester said:


> Clearly you simply don't understand real-time ray-tracing. And yes, 30FPS is acceptable. *There are a ton of really great games that have a 30FPS limit built into them*. They're still playable and still fun. You are displaying an elitist attitude. Kick that negativity down a notch or two and open your mind to the possibilities.


Go ahead, list them then. I'm expecting at least 20 games if there are "a ton" of them. It'd be a real shame if you could count the number of great 30FPS locked games on a single hand.

If I wanted stuttery 30 FPS gameplay, I'd buy a PS3. The whole point of gaming on PC is the flexibility to move past console limitations, not to pay more for a GPU that does 30FPS than many people pay for their first car.


----------



## lexluthermiester (Nov 15, 2018)

rtwjunkie said:


> Lex is actually right on this. Price doesn’t matter on comparison until you do price vs performance.  A direct comparison between 2080 and 1080 is completely justified and is what should be compared as they occupy the same place within each generation.


Exactly. The price for me wasn't a big problem as it was something I can afford. Honestly didn't want to afford the Ti model but could have rather easily. So the price wasn't a factor. After seeing the reviews and pondering the options the choice became simple.


TheinsanegamerN said:


> Go ahead, list them then.


Google is your friend.


----------



## mouacyk (Nov 15, 2018)

EarthDog said:


> Outside of that, not jumping in the ball pit with the kids... I'll get sick. Its like a friggin pre-school in this place, LOL!


Well -- you started it!  Great job on the testing and results. It's normally hard to bring the kids out to the playground, but this did it.


----------



## trog100 (Nov 15, 2018)

some folks dont seem to have much of a clue as to how business and life works.. a thing is worth what folks are prepared to pay for it.. a company is legally obliged to maximize profits for its share holders..

the value of money is directly related to how much of it a person has to chuck about.. and the bottom line is if you cant afford the ticket you dont get to ride on the boat..

i aint saying any of this is right but its how it is.. there are many thing in life i would like but cant afford but currently a high end graphics card aint one of em.. i aint rich but then again i aint as poor as some..

trog

ps.. thinking about it the 2080ti card i just bought was expensive but it cost less than the couple of 980TI cards i bought two or three years back..


----------



## mouacyk (Nov 15, 2018)

congratulations trog100

wasn't sure complacency made enthusiasts too


----------



## INSTG8R (Nov 15, 2018)

trog100 said:


> some folks dont seem to have a much of clue as to how business and life works.. a thing is worth what folks are prepared to pay for it.. a company is legally obliged to maximize profits for its share holders..
> 
> the value of money is directly related to how much of it a person has to chuck about.. and the bottom line is if you cant afford the ticket you dont get to ride on the boat..
> 
> ...


But bowing to obvious increases that aren’t justified just makes it worse. It’s not about how you spend your money it’s spending it wisely. You’ve just sent a message to Nvidia that this price is now ok. Crypto has already ruined pricing and this is just rewarding Nvidia for no good reason. They have no direct competition so no need for these high prices but if you take this attitude it will only get worse.


----------



## trog100 (Nov 15, 2018)

mouacyk said:


> congratulations trog100
> 
> wasn't sure complacency makes enthusiasts also



i aint complacent just aware of how things are.. 

trog


----------



## Mescalamba (Nov 15, 2018)

DICE does it again.

Fking up another game.


----------



## trog100 (Nov 15, 2018)

INSTG8R said:


> But bowing to obvious increases that aren’t justified just makes it worse. It’s not about how you spend your money it’s spending it wisely. You’ve just sent a message to Nvidia that this price is now ok. Crypto has already ruined pricing and this is just rewarding Nvidia for no good reason. They have no direct competition so no need for these high prices but if you take this attitude it will only get worse.



you have it back to front.. the so called high prices are all about shareholders and stock prices.. the company has no obligation to its customers only in the sense it needs them as cash cows.. if they get it wrong and lose the cash cows they have f-cked up.. time will tell on that one..

trog


----------



## INSTG8R (Nov 15, 2018)

trog100 said:


> you have it back to front.. the so called high prices are all about shareholders and stock prices.. the company has no obligation to its customers only in the sense it needs them as cash cows.. if they get it wrong and lose the cash cows they have f-cked up.. time will tell on that one..
> 
> trog


It's only worth what you're willing to pay for it. If people had waited to see RTRT in action, they might have said this card isn't worth the inflated price, and Nvidia would have had to adjust accordingly. If you're just willing to roll over, well, they'll just keep raising prices regardless of increases in value/performance, because people like you don't seem to care. I waited a long time to get my Vega and I still paid far too much for it. But I didn't jump on the inflated prices just to have the new shiny or reward an unreasonable price tag.


----------



## trog100 (Nov 15, 2018)

i didnt jump on at first i decided the prices was too high for me.. but i was tempted by a special £150 off piece of email spam.. 

dont get me wrong here i do see your point but some people wont buy at the current prices but enough will buy .. 

but again you have the option of no supply or a price high enough to put enough people off to make sure there is at least some availability.. the bottom line being these things are not intended for the masses.. they really are for an elite few.. that few being those that can or will pay the high price..

trog


----------



## INSTG8R (Nov 15, 2018)

trog100 said:


> i didnt jump on at first i decided the prices was too high for me.. but i was tempted by a special £150 off piece of email spam..
> 
> dont get me wrong here i do see your point but some people wont buy at the current prices but enough will buy ..
> 
> ...


Oh, I understand the temptation of sales. When I found my Vega I just jumped, because it was the only one in the country. But with this series Nvidia has now successfully moved the goal post almost into Titan money, and we know it will never go back.


----------



## trog100 (Nov 15, 2018)

INSTG8R said:


> Oh, I understand the temptation of sales. When I found my Vega I just jumped, because it was the only one in the country. But with this series Nvidia has now successfully moved the goal post almost into Titan money, and we know it will never go back.



they have moved the goal posts.. the interesting thing is people never complained about titan prices they just knew it wasnt for them never bought it and left it at that..

maybe thats how the 2080ti card should be viewed a flagship product that exists but isnt really meant to be purchased.. as for this ray tracing stuff.. it clearly isnt ready for prime time but then again i am sure a pair of  flagship cards in sli mode would make a reasonable job of it.. for an elite few that will pay well over two grand for a gpu set up.. he he

trog


----------



## Xzibit (Nov 15, 2018)

trog100 said:


> they have moved the goal posts.. the interesting thing is people never complained about titan prices they just knew it wasnt for them never bought it and left it at that..
> 
> maybe thats how the 2080ti card should be viewed a flagship product that exists but isnt really meant to be purchased.. as for this ray tracing stuff.. *it clearly isnt ready for prime time but then again i am sure a pair of  flagship cards in sli mode would make a reasonable job of it.*. for an elite few that will pay well over two grand for a gpu set up.. he he
> 
> trog



You might be able to run 2 RTX effects at 1080p


----------



## INSTG8R (Nov 15, 2018)

Xzibit said:


> You might be able to run 2 RTX effects at 1080p


Yeah, whether that's possible remains to be seen. They need to add SLI support first.


----------



## EarthDog (Nov 15, 2018)

Somewhere in this cesspool of back and forth, a user mentioned they cannot run RT and SLI? Then someone else chimed in.


mouacyk said:


> Well -- you started it!  Great job on the testing and results. It's normally hard to bring the kids out to the playground, but this did it.


I have nothing to do with any results...here.


----------



## Nkd (Nov 15, 2018)

xkm1948 said:


> Haters gonna hate. Just like when people discounted hardware T&L, or pixel/vertex shading and etc. in a few years they will all shut up.
> 
> TBH first gen RTX was never meant to make RT RT playable for 4k 60fps. The Turing design is hybrid in nature. Hardware needs to wait for software to catch-up.  The following RTX will be more and more capable in RT RT until it becomes main stream.


Heard this story before. Turing is what it is. First gen, early adopters got sold on those features. It will be the same way. RTX was always going to take a performance hit. That is why Nvidia sold it on hope, knowing people wouldn't find out for a few months, until they could use rtx, well outside their return policy. It is what it is, it will take a few generations.


----------



## R0H1T (Nov 15, 2018)

trog100 said:


> *some folks dont seem to have a much of clue as to how business and life works.. a thing is worth what folks are prepared to pay for it.. a company is legally obliged to maximize profits for its share holders..*
> 
> *the value of money is directly related to how much of it a person has to chuck about.. and the bottom line is if you cant afford the ticket you dont get to ride on the boat..*
> 
> ...


That's the argument some used for defending the top 1%, and while many of them deserve to be there, there are still tons of them who've just weaseled their way into a life of luxury with virtually zero oversight, i.e. tax dodgers.

Is that a life boat or a Yacht, ocean liner perhaps?

Well that's exactly what you're saying i.e. corporate greed is fine, so long as they can get away with it.

The checks & balances system works when people are willing to do something to stop excesses such as this. In a democracy the checks are performed by the electorate, in a capitalist society that *responsibility falls on the shoulders of consumers*.


----------



## INSTG8R (Nov 15, 2018)

EarthDog said:


> Somewhere is this cesspool of back and forth, a user mentioned they cannot run RT and SLI? Then someone else chimed
> I have nothing to do with any results...here.


Well, SLI isn't working without using NV Inspector and the BF1 profile, so whether or not the two can work together would probably be better tested with a legit profile. It could definitely offset the performance hit if it did work.


----------



## hat (Nov 15, 2018)

Wow! Why is there a war going on between RTX ON/RTX OFF people here? Either you like it or you don't; enjoy your RTX (or lack thereof) and move on, please. No need to be attacking each other over it.


----------



## GreiverBlade (Nov 15, 2018)

oh interesting .... a Vega 64 does more than well at 1440p


----------



## toyo (Nov 15, 2018)

Robcostyle said:


> Let me just describe, why ppl are so pissed of, why you and your rtx friends are wrong, and etc.etc...
> 
> I don’t care how people spend their money. I don’t care, if they’re dining in fashion restaurant, buy expensive stuff or whatever. And I’m really glad if they have possibility to do so - UNTIL it starts touching ME.
> Stupid crowd keeps buying rtx card at insane prices even more vigorously than ever before -> huan has a clear vision, that he can twist the prices to the sky however he wants - we have stagnant market of GPU’s priced like a used vehicle. It’s just simple as that. That drives me mad. Your “wish” to pay 1500$ for gpu and fostering this price rise! gives ME no choice. Because it’s a monopoly. And I’m not even talking about fps/$, shitty PCB quality or dxr effects, no - pure economics.
> ...


Pretty much.

I saved about 1500EU over the last year, planning to upgrade to a 2080+9900K. I didn't know their names yet, but I knew 8-core Coffee Lake and some better, faster Pascal/consumer Volta cards were coming. The CPUs turned out to be horrible crap, the i7 transformed into i9 and the i5 into i7, with HT cut - what even the flying EFF?!? Oh. And it's 600EU or something in Europe /faint - after the 8700K I got was CHEAPER than a 1700x. I'm not giving Intel money for this.

And then the GPUs. Oh, they have new (old) stuffz. It's called RTX. Great. I should be able to buy a 2080ti with 700EU - whaatttttttttttt? It's 1300EU? 

You have got to be kidding me. 

I'll just keep the 1070ti. AND the money. Should I mention that when I got the 1070ti it was half the price of the cheapest Vega56? 

What people decide to buy is affecting each and everyone of us. The more people reward bad practices from near-monopoly big business that should be regulated and broken up, basically, the more the average consumer suffers.

But hey - we should just live and let live, and not say anything. 

NO.


----------



## bibob94 (Nov 15, 2018)

Just noticed that the RX Vega 64 is faster than the 1080 this time


----------



## ZoneDymo (Nov 15, 2018)

Looks nice... sometimes, but like linked earlier, I feel we could do this sort of thing way earlier with way less of a performance hit.
Just simpler but smarter, more logical ways of going about it.

A what, $1,200? 2080ti doing about 60fps at 1080p... and that is in a game that just came out, what about 2 years from now?
That is just unacceptable imo.


----------



## Valantar (Nov 15, 2018)

@W1zzard I asked @btarunr this in the original news post thread, but it doesn't seem to have made it through the noise, so I'll try again:

Can you please do some power/clock speed monitoring while running RTX loads? Given that all RTX cards use 100% of their power targets doing rasterized rendering, I'm wondering just how much of this performance drop is due to power throttling of CUDA cores to allow the RT cores to work, and as a consequence, how much power the RT cores require for their current performance levels. This would be an invaluable insight, and far more interesting than just pure performance numbers.

You wouldn't even have to do this across cards - just do a simple a/b comparison with RTX on and off on a single card at a single detail level.
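For anyone who wants a rough version of that a/b comparison themselves: logging `nvidia-smi --query-gpu=power.draw,clocks.sm --format=csv,noheader -lms 500` during an RTX-on run and an RTX-off run, then averaging the samples, gives a first approximation. A minimal sketch; the log lines below are made-up illustrative values, not measurements from any card:

```python
# Average the power.draw column of nvidia-smi CSV samples.
# Sample lines are invented for illustration; real logs would come from e.g.
#   nvidia-smi --query-gpu=power.draw,clocks.sm --format=csv,noheader -lms 500
def mean_power(csv_lines):
    """Mean of the first (power.draw) column, in watts."""
    watts = [float(line.split(",")[0].replace("W", "").strip())
             for line in csv_lines]
    return sum(watts) / len(watts)

dxr_off_log = ["215.3 W, 1905 MHz", "218.1 W, 1890 MHz", "216.4 W, 1905 MHz"]
dxr_on_log = ["188.7 W, 1770 MHz", "191.2 W, 1755 MHz", "190.1 W, 1770 MHz"]

print(f"DXR off: {mean_power(dxr_off_log):.1f} W")
print(f"DXR on:  {mean_power(dxr_on_log):.1f} W")
```

If the DXR-on run really does average lower power at lower clocks, that would support the throttling question rather than answer it outright.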


----------



## ONEoo7 (Nov 15, 2018)

Just upgraded to 1809 and bought BF5 especially to test out this new feature (RTX). It's impressive, as it adds a lot to scene realism. I game at 1080p with an RTX 2080 so I was able to set RTX to Ultra. I didn't see any noticeable framerate decrease.

I like how some people agree and some disagree with this new tech; each have their arguments, but especially those who disagree put a lot of frustration into describing their disagreement.

I can't say it's worth the money since it's only some eye candy added on top and as someone else mentioned it's a fast paced shooter, you wouldn't be able to tell the difference.

All in all, I disagree with the price and how it was introduced, but I am also for the evolution of technology. Is it going in the right direction? Only time will tell, that and passionate engineers and scientists. As a personal opinion, I like it and I wish it will be used in more titles, where it could really make a difference.


----------



## sutyi (Nov 15, 2018)

ONEoo7 said:


> Just upgraded to 1809 and bought bf5 especially to test out this new feature(RTX). It's impressive as it adds a lot to scene realism. I game at 1080p with an rtx 2080 so I was able to set the RTX to ultra.* I didn't see any noticeable framerate decrease.*



Unless you are on a 30Hz panel with V-Sync on, I highly doubt that...


----------



## Valantar (Nov 15, 2018)

sutyi said:


> Unless you are on a 30Hz panel with V-Sync on, I highly doubt that...


Yeah, dropping from 138 to 58 fps ought to be noticeable no matter your display. Even on a 60Hz display you'd notice a clear drop in fluidity and smoothness. I should know; I play Rocket League on a 60Hz display, locked to 120fps it's _far_ smoother than at 60fps, even if half of those frames are discarded.


----------



## phanbuey (Nov 15, 2018)

New tech is always great.  Personally, I'm glad I held off this round, but I'll definitely enjoy it once they get the kinks out.

I get that it's a technological marvel and whatnot, but from a gaming point of view, and after watching a boatload of "on vs off" videos -  it's just another eye candy element  - and a rather subtle one at that.

If I owned one of these cards, and I was playing this game at 1440P i would leave it in the 'off' position.


----------



## Xzibit (Nov 15, 2018)

Valantar said:


> @W1zzard I asked @btarunr this in the original news post thread, but it doesn't seem to have made it through the noise, so I'll try again:
> 
> Can you please do some power/clock speed monitoring while running RTX loads? Given that all RTX cards use 100% of their power targets doing rasterized rendering, I'm wondering just how much of this performance drop is due to power throttling of CUDA cores to allow the RT cores to work, and as a consequence, how much power the RT cores require for their current performance levels. This would be an invaluable insight, and far more interesting than just pure performance numbers.
> 
> You wouldn't even have to do this across cards - just do a simple a/b comparison with RTX on and off on a single card at a single detail level.



I was interested in knowing that too. So far, this is the closest answer:



			
TechSpot said:

> It is interesting to note across these tests that *we are being RT core limited here*. The higher the resolution, the higher the performance hit using the Ultra DXR mode, to the point where playing at 4K is more than 4x faster with DXR off. This also plays out when we spot checked power consumption:* the cards were running at consistently lower power with DXR on*, *because the regular CUDA cores are being underutilized at such a low framerate*.


----------



## GreiverBlade (Nov 15, 2018)

phanbuey said:


> If I was playing this game at 1440p, I would leave it in the 'off' position.


So do I, but... not with an RTX 20XX; rather with a Vega 64, given how it's dropping toward what my 1070 cost (~$520 at the time), how a 2070 is also overpriced for no reason ($650-790 for me), and how little difference there is in performance (RTX off, of course) while the price difference is around $100-150...

Ah, a month and a half to wait... a harsh end of year for me.


----------



## phanbuey (Nov 15, 2018)

The sad thing is, given the amount of technology this takes... humans generally overlook shadows and lighting. We don't wander around the real world paying attention to light interactions and shadows; if anything, we actively ignore those things so we can focus on what we're doing.

When a game is below the FPS level that I want, shadow detail and occlusion effects are the first things that get turned down. I feel like for most people they just don't make a huge difference: as long as they're there and in semi-realistic detail, we pay about as much attention to them in game as we do in the real world, which is as little as humanly possible.


----------



## ppn (Nov 15, 2018)

I don't get it. Why is performance taking a hit at all? Either those RT cores leave all the other cores sitting idle 2/3 of the time, and we need 3x more of them to make up for the lack of RT power, or they are only doing part of the RTRT job and offloading the rest to the CUDA cores. This is not good.
Turns out that DLSS at 4K is actually 1440p with upscaling, and they call that a 35% performance increase when it is actually a drop, because going from 4K to 1440p should yield a 100% performance increase. It is not doing DLSS for free by offloading all the work to the tensor cores; it is cheating. Anyway, I'm using a 2070 instead of a 1080 with RTX OFF and DLSS OFF for the time being, lol.
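For reference, the pixel arithmetic behind that resolution comparison (a simple sketch; actual performance rarely scales linearly with pixel count, so the "100%" figure is, if anything, conservative):

```python
# Pixel counts for the two resolutions in question
uhd = 3840 * 2160   # "4K" UHD: 8,294,400 pixels
qhd = 2560 * 1440   # 1440p:    3,686,400 pixels

ratio = uhd / qhd   # 2.25x fewer pixels to shade at 1440p
print(ratio)        # 2.25
```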


----------



## trog100 (Nov 15, 2018)

i just bought an "upgrade" card.. it just happened to have ray tracing.. a single 2080ti.. coming from a pair of 1070 cards in sli i only had a couple of options.. a single 2080ti or a pair of 1080ti cards in sli.. 

no way on this planet did i buy anything because it had ray tracing abilities.. i dont know how i fit in the general scheme of things as regards ray tracing but for me any upgrade option would have cost a lot ray tracing or not.. 

my next gpu upgrade will also cost a lot.. it will be another 2080ti to match the one i already have.. 

trog


----------



## ppn (Nov 15, 2018)

No trog, you won't. When 7nm hits the market, you'll sell the 2080 Ti, because it will be much slower than a next-gen ~~70 card, just like the 1070 versus the 980 Ti, and the 970 beating the first Kepler TITAN.

The single RTX 3070 will make the RTX TITAN 4608-core 12GB edition look like a toy, and then you'll buy a single 3080 Ti.


----------



## bajs11 (Nov 15, 2018)

unikin said:


> If I'd pay $1,300 for a GPU, I'd expect flawless RTX gaming at 4K, not a slideshow. Maybe the RTX 4080 Ti can deliver.


By then you will have to pay $4,445 for the RTX 4080 Ti, assuming the MSRP increases by 85% each generation as it did when the RTX 2080 Ti was released...
GTX 1080 Ti MSRP: $699
RTX 2080 Ti MSRP: $1,299
RTX 3080 Ti MSRP: $2,405
RTX 4080 Ti MSRP: $4,445

I bet it will be sold out before it's even released.
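That tongue-in-cheek extrapolation is just compound growth; a minimal sketch, assuming the same ~85% jump repeats each generation (it almost certainly won't):

```python
growth = 1.85   # assumed ~85% MSRP jump per generation, per the post above
msrp = 1299.0   # RTX 2080 Ti launch MSRP in USD

projections = {}
for gen in ("RTX 3080 Ti", "RTX 4080 Ti"):
    msrp *= growth
    projections[gen] = round(msrp)

print(projections)  # {'RTX 3080 Ti': 2403, 'RTX 4080 Ti': 4446}
```

Rounding at each step gives slightly different figures than the post ($2,403/$4,446 vs. $2,405/$4,445), but the point stands either way.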


----------



## Shatun_Bear (Nov 15, 2018)

phanbuey said:


> The sad thing is, the amount of technology this takes... but humans generally overlook shadows & lighting - we don't wander around the real world paying attention to light interactions and shadows, if anything, we actively ignore those things so we can focus on what we're doing.
> 
> When a game is below the FPS level that I want, shadow detail and occlusion effects are the first things that get turned down.  I feel like to most people they just don't make a huge difference - as long as they're there and in semi-realistic detail we pay about as much attention to them in game as we do in the real world - as little as humanly possible.



That's it; this is why RT on these cards right now is, quite frankly, pathetic. It halves performance, all for the sum total of... better reflections. And here's the kicker: you won't notice them as you stomp around in multiplayer. Worse, you'll very likely turn it off!

Hence Nvidia trying to charge a premium for RT is pathetic too.


----------



## evilpaul (Nov 15, 2018)

What are the thermals and power consumption like with RTX active?


----------



## rtwjunkie (Nov 15, 2018)

ppn said:


> No trog you don't. When 7nm hits the market, you sell the 2080Ti for it is much slower than a ~~70 nextgen. just like 1070 and 980Ti. 970 beat the first TITAN Kepler.


Wow, still peddling your BS, huh? One, your example of the 1070 beating the 980 Ti was a one-off, a singular event.

Two, you don't ever advise someone to keep buying down from whatever tier they are at. What happens is that in 2026 they end up with an RTX 5030 entry-level card that will of course decimate any of today's games, but can only play that year's new games as a slideshow.


----------



## EarthDog (Nov 15, 2018)

Any word yet on the scene/method tpu tested? I feel more annoying than normal asking for a 3rd time... but how many days should one wait for a simple answer? Why does this feel like it is some kind of secret?


----------



## unikin (Nov 15, 2018)

DXR = ON -> RTX 2080TI = GTX 1050TI


----------



## rtwjunkie (Nov 15, 2018)

EarthDog said:


> Any word yet on the scene/method tpu tested? I feel more annoying than normal asking for a 3rd time... but how many days should one wait for a simple answer? Why does this feel like it is some kind of secret?


That’s actually a good request.  People can compare their own results, as well as account for differences or similarities with other review sites.


----------



## spooh (Nov 15, 2018)

_we weren't aware DICE added another setting called "DXR reflections quality," which by default was set to "Ultra" (other settings include Low, Medium, and High)._

Do I get it right that the lower-than-Ultra settings were not tested?
So maybe "DXR Reflections Quality: Medium" looks OK with a 60% smaller performance penalty?


----------



## ppn (Nov 15, 2018)

Lol, the regular CUDA cores are underutilized and power consumption falls with RT on, because there aren't enough RT cores in the chip. They need 4 RT cores per SM of 64 CUDA cores, not just 1 RT core per SM as it currently is. Well, that was a bad move, Nvidia.


----------



## EarthDog (Nov 15, 2018)

rtwjunkie said:


> That’s actually a good request.  People can compare their own results, as well as account for differences or similarities with other review sites.


Right. This isn't a canned benchmark, so nearly everyone is doing it differently, I would imagine.

I used the @wiz thing twice... no response, though he posted in threads afterward. I know he is extremely busy, which is why I don't want to ping him again... yet I, and I'm sure many others, would still like an answer.

Part of the reason I am asking is because in the SP mode there are 3 campaigns, and in my testing the first two (Under No Flag and Nordlys) aren't very hard on the card with RT enabled. In fact, with Ultra settings and RT enabled, I pulled over 60 FPS in them with an RTX 2070. The 3rd campaign (Tirailleur), in the forest with water on the ground, KILLS the card (around 30 FPS). So I am wondering how he got those numbers. It LOOKS like it is an average of the three??? I don't know if....

A. My testing is off...
B. How this testing is done in the first place... all 3 scenes and an average?


The other thing is, I can't even play the god damn game now. I swapped GPUs and it doesn't work. I double click the icon, the bfv.exe starts in task manager, gets to around 216MB and quits. Was on chat with EA for over an hour yesterday and they escalated the issue. I can't even friggin play the game now........ wek sos. I recall W1z mentioning something about swapping parts and limits, but, I don't have a message or anything and one would think ONE of the 3 EA reps I chatted with would have picked that out as I intentionally mentioned it in each chat so they were aware.

EDIT: Put the 2070 back in and it works... WTF?!!!



spooh said:


> _we weren't aware DICE added another setting called "DXR reflections quality," which by default was set to "Ultra" (other settings include Low, Medium, and High)._
> 
> Do I get it right, the lower than Ultra settings were not tested?
> So maybe "DXR Reflections Quality: Medium" looks OK with 60% less performance penalty?


If you look at the results, you will see low/med/high. Pretty sure that is the RT setting tested there.

I don't know; so much confuses me (maybe it's only me) about how this was actually tested here at TPU...


----------



## OC-Ghost (Nov 15, 2018)

Nice review! Cool tech; I won't pay extra for it though.
On the game side of things, immersion in those screenshots is instantly ruined by realistic reflections that make all the models look bad.


----------



## Vayra86 (Nov 15, 2018)

lexluthermiester said:


> In non-RTRT gaming you get that. Going from the 1080 to 2080 I got an instant 40-50% boost in performance in my existing library of games. Not everyone is buying the Ti model. I spent much less that $1000 for my 2080 and offset that cost with the sale of my 1080. Keep things in the proper perspective and you'll see the big picture.
> 
> RTRT is brand new and it will continue to advance and evolve. In the mean time, non-RTRT gaming is getting big boosts in performance.
> 
> ...



Your 2080's "big boost" was already on the shelves for two years; they called it a 1080 Ti, and it was cheaper. It seems your 'big picture' needs to get a bit bigger than it is.



trog100 said:


> you have it back to front.. the so called high prices are all about shareholders and stock prices.. the company has no obligation to its customers only in the sense it needs them as cash cows.. if they get it wrong and lose the cash cows they have f-cked up.. time will tell on that one..
> 
> trog



Spot on! That is why many cash cows are now up in arms against RTX. See, you do get it; it just takes a while.


----------



## lexluthermiester (Nov 15, 2018)

Vayra86 said:


> Your 2080 "big boost" was already on the shelves for two years, they called it a 1080ti and it was cheaper. It seems your 'big picture' needs to get a bit bigger than it is.


Maybe, but it didn't have RTRT. And that is something I'm excited about and looking forward to!


----------



## Vya Domus (Nov 15, 2018)

lexluthermiester said:


> And that is something I'm excited about and looking forward too!



I'm sure you are; you bought a 2080, after all. 

Your militant praise is staggering. After entire pages you're still here replying to everyone with the same standard response: that RTX is great and we aren't enlightened enough to realize this astonishing feat that Nvidia has brought us.

Whatever floats your boat, but you're wasting your time doing that; literally no one believes you. Not that your words surprise me; few would have the boldness required to admit that the rather expensive product they bought isn't as stellar as they initially thought. Your determination to defend your purchase to the bitter end is admirable, though.


----------



## Easy Rhino (Nov 15, 2018)

Well I have one final thing to say since I am being shouted down for having an opinion...

You ALL should be praising the people who are buying these expensive cards for the ray tracing tech. It is THOSE PEOPLE who spend their hard-earned money on what you call "a poor experience" or "not worth the value" that are the reason this new tech is developed AT ALL. The early adopters, as dumb as many of you seem to think they are, PAY FOR THE ridiculous up-front R&D costs of this tech. If you consider yourself a tech enthusiast, then you know ray tracing is the future. If you want to one day afford a ray tracing card that performs THE WAY "YOU" think it should, then you should be applauding all of the people paying the high price now. Because without them you WILL NEVER own a ray tracing card with your current attitude. Rhino out!


----------



## trog100 (Nov 15, 2018)

those that think people are buying the latest generation nvidia cards because they are dumb enough to fall for the ray tracing hype have it wrong.. most will be just like me and simply buying a simple upgrade to their gpu systems.. at this moment in time i dont give a f-ck about ray tracing.. i dont think i am on my own..

their only crime is they can afford to do it.. he he.

trog


----------



## toyo (Nov 15, 2018)

Easy Rhino said:


> Well I have one final thing to say since I am being shouted down for having an opinion...
> 
> You ALL should be praising the people who are buying these expensive cards for the ray tracing tech. It is THOSE PEOPLE who spend their hard-earned money on what you call "a poor experience" or "not worth the value" that are the reason this new tech is developed AT ALL. The early adopters, as dumb as many of you seem to think they are, PAY FOR THE ridiculous up-front R&D costs of this tech. If you consider yourself a tech enthusiast, then you know ray tracing is the future. If you want to one day afford a ray tracing card that performs THE WAY "YOU" think it should, then you should be applauding all of the people paying the high price now. Because without them you WILL NEVER own a ray tracing card with your current attitude. Rhino out!


Or you could pay shareholders and executives FAR LESS; that's always the better option. Also, accept far less profit. 
Oh, I know, of course I know this goes against the core tenets of the capitalism religion, but I really don't care, to be honest.


----------



## Vayra86 (Nov 15, 2018)

Easy Rhino said:


> Well I have one final thing to say since I am being shouted down for having an opinion...
> 
> You ALL should be praising the people who are buying these expensive cards for the ray tracing tech. It is THOSE PEOPLE who spend their hard-earned money on what you call "a poor experience" or "not worth the value" that are the reason this new tech is developed AT ALL. The early adopters, as dumb as many of you seem to think they are, PAY FOR THE ridiculous up-front R&D costs of this tech. If you consider yourself a tech enthusiast, then you know ray tracing is the future. If you want to one day afford a ray tracing card that performs THE WAY "YOU" think it should, then you should be applauding all of the people paying the high price now. Because without them you WILL NEVER own a ray tracing card with your current attitude. Rhino out!



That is a valid argument, IF you are convinced the path chosen to get RT going is the right one.

Right now, there is overwhelming evidence (data!) that proves it is a dead end. We are looking at massive dies and massive shortcomings. This can't scale as it is without severely handicapped performance.

So I don't have anyone to praise; quite the opposite. The message we send by buying at these prices is that, like gullible fools, we will swallow anything. Nvidia knows this and capitalizes on it. Praise them for playing you instead.


----------



## lexluthermiester (Nov 15, 2018)

Vya Domus said:


> I'm sure you are, you bought a 2080 after all.
> 
> Your militant praise is staggering. After entire pages you're still here replying to everyone with the same standard response: that RTX is great and we aren't enlightened enough to realize this astonishing feat that Nvidia has brought us.
> 
> Whatever floats your boat but you're wasting your time doing that, literally no one believes you. Not that your words surprise me, few would have the boldness required to admit that the rather expensive product that they bought isn't as stellar as they initially thought. Your determination to protect your purchase to the bitter end is admirable though.


Are you done with the personal insults and virtue signaling? Just FYI, I bought my 2080 *AFTER* all the reviews had been done and everyone's perspectives, assessments and opinions were declared. I made an informed choice. Wrap your head around that one.



Easy Rhino said:


> Well I have one final thing to say since I am being shouted down for having an opinion...


Not everyone is shouting you down..



toyo said:


> Or you could pay shareholders and executives FAR LESS; that's always the better option. Also, accept far less profit.
> Oh, I know, of course I know this goes against the core tenets of the capitalism religion, but I really don't care, to be honest.


Or we bought a new card because it is the new hotness, does something completely new and performs better than the previous gen cards.. Hmmm..



Vayra86 said:


> Right now, there is overwhelming evidence (data!) that proves it is a dead end.


Citation required.


----------



## Vayra86 (Nov 15, 2018)

lexluthermiester said:


> Citation required.



Is that your one-liner now? When you are asked for a source, you tell people 'Google is your friend'. The evidence is right in front of you on this website, dummy.


----------



## Vya Domus (Nov 15, 2018)

lexluthermiester said:


> Are you done with the personal insults and virtue signaling?





Easy Rhino said:


> If you want to one day afford a ray tracing card that performs THE WAY "YOU" think it should, then you should be applauding all of the people paying the high price now. Because without them you WILL NEVER own a ray tracing card with your current attitude. Rhino out!



This is actual virtue signaling.


----------



## lexluthermiester (Nov 15, 2018)

Vayra86 said:


> Is that your oneliner now? When you are asked for a source you tell ppl 'Google is your friend'. The evidence is right in front of you on this website


This very review shows RTRT is a viable thing. The PCWorld YouTube video showed it to be a viable thing. Regardless of opinions, everything shown so far about RTRT on RTX shows it is progressing well. Your comment said: "Right now, there is overwhelming evidence (data!) that proves it is a dead end." I'm just not seeing that. Prove it up. Let's see your logic, and make sure it doesn't contradict what is already known.


Vayra86 said:


> dummy.


Enough with the insults or someone might give you some time-out.


Vya Domus said:


> This is actual virtue signaling.


You might want to refresh your understanding of what the phrase means.


----------



## Vayra86 (Nov 15, 2018)

lexluthermiester said:


> This very review shows RTRT is a viable thing. The PCWorld Youtube video showed it to be a viable thing. Regardless of opinions, everything shown so far about RTRT on RTX shows it is progressing well. Your comment said; "Right now, there is overwhelming evidence (data!) that proves it is a dead end." Just not seeing that. Prove up. Let's see your logic and make sure it doesn't contradict what is already known.
> 
> Enough with the insults or someone might give you some time-out.
> 
> You might want to refresh your understanding of what the phrase means.



I'll leave it to you to connect the dots, since you've got that bigger picture and all.

Dot one: https://www.techpowerup.com/247005/nvidia-geforce-rtx-2080-ti-tu102-die-size-revealed

Dot two: 




Dot three: https://www.techpowerup.com/reviews/Performance_Analysis/Battlefield_V_RTX_DXR_Raytracing/4.html

Dot four: https://semiengineering.com/more-nodes-new-problems/

I'm sure you won't manage, and will resort to repeating everything you've said up till now. But try it anyway.

Key word and hint: SCALING.


----------



## lexluthermiester (Nov 15, 2018)

Vayra86 said:


> I'll leave it to you to connect the dots, since you've got that bigger picture and all
> 
> Dot one: https://www.techpowerup.com/247005/nvidia-geforce-rtx-2080-ti-tu102-die-size-revealed
> 
> ...


And how do those points equate to "dead end"? Please spell it out for this particular dullard, because your logic isn't exactly flowing like a river..


----------



## Vayra86 (Nov 15, 2018)

lexluthermiester said:


> And how do those points equate to "dead end"? Please spell it out for this particular dullard, because your logic isn't exactly flowing like a river..



No, no I won't. Ignorance is bliss, and you're convinced it's awesome.

I've added a fourth dot and a hint for you.


----------



## lexluthermiester (Nov 15, 2018)

Vayra86 said:


> No, no I won't. Ignorance is bliss, and you're convinced it's awesome.
> 
> I've added a fourth dot and a hint for you.


Yeah, let's leave the engineering to the engineers. Scaling won't be a problem for long, it rarely is. Die/wafer yield is a common problem that happens with almost every new advancement in IC development. They keep coming up with solutions and we keep advancing.


----------



## Vayra86 (Nov 15, 2018)

lexluthermiester said:


> Yeah, let's leave the engineering to the engineers. Scaling won't be a problem for long, it rarely is. Die/wafer yield is a common problem that happens with almost every new advancement in IC development. They keep coming up with solutions and we keep advancing.



OK. All is fine in the world!


----------



## 95Viper (Nov 15, 2018)

Stay on topic.
Stop the insults.
Have a civil discussion.

Thank You


----------



## Xzibit (Nov 15, 2018)

ppn said:


> I don't get it. Why is performance taking a hit at all? *Either those RT cores leave all the other cores sitting idle 2/3 of the time, and we need 3x more of them to make up for the lack of RT power, or they are only doing part of the RTRT job and offloading the rest to the CUDA cores. This is not good.*
> Turns out that DLSS at 4K is actually 1440p with upscaling, and they call that a 35% performance increase when it is actually a drop, because going from 4K to 1440p should yield a 100% performance increase. It is not doing DLSS for free by offloading all the work to the tensor cores; it is cheating. Anyway, I'm using a 2070 instead of a 1080 with RTX OFF and DLSS OFF for the time being, lol.



That would only be a solution if you had a static number of rays. The higher the resolution, the more rays. And these are only reflections, on only part of the screen/objects most of the time.

72 RT cores are scraping by doing a bare-minimum 1 spp at 1080p; for the same effects at 4K you need 4x the RT cores (288) just to get by on BFV's reflections alone. If you want more detail in those reflections, the number climbs even faster: 2 spp means 576 RT cores just to improve reflection detail, which will hardly be noticeable. They'll need to raise the sample rate considerably to get noticeable detail.

We'll have to see Tomb Raider and Metro to see how those effects take a toll on performance, but if TechSpot is right, these cards are way too underpowered.
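The back-of-the-envelope behind those core counts is just pixels times samples per pixel; a minimal, illustrative sketch of the primary-ray budget (ignoring secondary/bounce rays entirely):

```python
def rays_per_frame(width: int, height: int, spp: int) -> int:
    """Primary rays per frame: one ray per pixel per sample."""
    return width * height * spp

base = rays_per_frame(1920, 1080, 1)  # ~2.07M rays/frame at 1080p, 1 spp
print(rays_per_frame(3840, 2160, 1) // base)  # 4  (4K, same 1 spp)
print(rays_per_frame(3840, 2160, 2) // base)  # 8  (4K at 2 spp)
```

Those 4x and 8x multipliers are exactly what drive the 288- and 576-core figures above, starting from 72 RT cores at 1080p.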


----------



## Valantar (Nov 15, 2018)

Is it entirely naive of me to think this might indeed be an "early PhysX" type of situation, in that the only solution that makes sense is a secondary card? NVLink should allow this to be tacked on with minimal latency, and it might not even need its own VRAM; you could then separate the (obviously die-space- and power-consuming) RT cores onto a separate die, with its own power delivery and cooling. Seems like a good solution, and it ought to make the non-RT cards cheaper too.


----------



## trog100 (Nov 15, 2018)

Valantar said:


> Is it entirely naive of me to think this might indeed be an "early PhysX" type of situation, as in the only solution that makes sense is a secondary card? Nvlink should allow this to be tacked on with minimal latency, and it might not even need its own VRAM, but you could then separate the (obviously die space and power consuming) RT cores to a separate die, with its own power delivery and cooling. Seems like a good solution, and it ought to make the non-RT cards cheaper too.



which begs the question.. do people really want ray tracing or do they want cheaper without it.. 

trog


----------



## purecain (Nov 16, 2018)

A card dedicated to RTX might be a good idea. I could see a lot of people buying a dedicated RTX card to add visuals, that's if the whole die needs to be used to offer decent performance at 4K... Either way, I'm sure they have a plan moving forward... I'll be missing out on this year's RTX effects. I shall look forward to the next generation. 
There's no point trying to pick fault with each other's logic... if you had the spare cash and love technology, not upgrading would have been very hard.

I'm still enjoying the Titan V btw, and BFV gives a smooth experience at 4K... so I'm happy.


----------



## zenlaserman (Nov 16, 2018)

Color me old-fashioned, but I can't help thinking RT in games is good for nothing but pretty screenshots.

What happened to the gamers who screamed "gameplay over graphics"? Are we going the way of the dodo?
Taking a huge hit in gameplay fluidity for some pretty reflections that may or may not be seen during the action is silly, IMHO.

People with shiny stuff are gonna look down their noses at people without, though. But don't stop to admire the scenery...
..you just might get fragged by some guy without shadows turned on.


----------



## mouacyk (Nov 16, 2018)

Under-performing ray tracing is Nvidia's version of AMD's ROPs problem with Vega. The root cause, common to both issues, is scaling, as called out by Vayra86. All these cards are at the edge of allowable power consumption (and thus were heavily locked down). 7nm should alleviate that somewhat, if complexity in die size doesn't stop it first, but the shrink itself presents a new issue of heat density (an old one, in terms of Vega needing water cooling).


----------



## Valantar (Nov 16, 2018)

zenlaserman said:


> Color me old-fashioned, but I can't help thinking RT in games is good for nothing but pretty screenshots.
> 
> What happened to the gamers who screamed "gameplay over graphics"? Are we going the way of the dodo?
> Taking a huge hit in gameplay fluidity for some pretty reflections that may or may not be seen during the action is silly, IMHO.
> ...


Well, better graphics _can_ improve immersion. Not likely in a fast-paced shooter, but there are plenty of games where more realistic graphics would directly improve the player's perception of gameplay through improved immersion. I'm betting Metro is going to make quite good use of this.


----------



## jabbadap (Nov 16, 2018)

mouacyk said:


> Under-performing ray-tracing is NVidia's version of AMD's ROPS problem with Vega.  The root cause, as is common in both issues, is scaling as called out by Vayra86. All these cards are at the edge of allowable power consumption (and thus were heavily locked down.)  7nm should alleviate that somewhat if complexity in die-size doesn't stop it first, but the shrink itself presents a new issue of heat density (old in terms of Vega needing water cooling.)



Well, there are whispers on "the internet" that power consumption is actually lower with RTX on, which would mean the CUDA shaders are underutilized on rasterizing tasks while the GPU uses the RT cores. So could you, @W1zzard, check the RTX cards' power numbers with RTX on vs. RTX off?


----------



## Valantar (Nov 16, 2018)

jabbadap said:


> Well there is whispers on "the internet" that the power consumption is actually lowered with RTX on. Which means that cuda shaders are underutilized on rastering task, while gpu uses RT cores. So could you @W1zzard check the RTX card power numbers with RTX on vs RTX off?


Not only power, but also clock speeds. Given that GeForce GPU reviews very often focus on clock speed measurements due to GPU Boost, it really shouldn't be too hard to bring that to the table here too. Are the CUDA cores idle but running at a similar boost as with RTX off? Clocked low, but still not fully utilized? Power throttling? A combination of all three over time?
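For anyone who wants to collect those numbers themselves, here is a minimal sketch of such an A/B log, assuming a single GPU and the stock `nvidia-smi` tool on the PATH; run it once with DXR off and once with DXR on, and compare the averages:

```python
import subprocess

def parse_smi_line(line: str):
    """Parse one nvidia-smi CSV line like '210.5 W, 1875 MHz, 98 %'."""
    power, clock, util = (field.split()[0] for field in line.split(","))
    return float(power), float(clock), float(util)

def sample_gpu(n: int = 10):
    """Poll power draw, SM clock, and GPU utilization n times (single GPU)."""
    cmd = ["nvidia-smi",
           "--query-gpu=power.draw,clocks.sm,utilization.gpu",
           "--format=csv,noheader"]
    return [parse_smi_line(subprocess.check_output(cmd, text=True).strip())
            for _ in range(n)]
```

Averaging the samples from a DXR-off run against a DXR-on run would show directly whether shader clocks or power draw drop while the RT cores are busy.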


----------



## ksdomino (Nov 16, 2018)

This is an absolute nightmare!!  
I'm hoping it's all a mistake and those extra RTX cores that I paid (double what the card is worth) for are somehow not turned on. I expected a performance gain; now it's come to light that these new features cause a performance loss.
COMPLETELY UNACCEPTABLE!
No wonder they didn't show real game performance at launch.
I can't believe Nvidia did this to us.


----------



## Valantar (Nov 16, 2018)

ksdomino said:


> This is an absolute nightmare!!
> I'm hoping it's all a mistake and those extra RTX cores that I paid (double what the card is worth) for are somehow not turned on. I expected a performance gain; now it's come to light that these new features cause a performance loss.
> COMPLETELY UNACCEPTABLE!
> No wonder they didn't show real game performance at launch.
> I can't believe Nvidia did this to us.


Wait, what? It's been clear from long before cards were available that RTX would entail a performance loss. Some of the first commentary on Nvidia's promo material - across plenty of web sites - was "this is choppy, clearly below 60fps". Did you not look up any information whatsoever before buying?

I suppose there is a "performance gain" compared to trying real-time ray tracing _without RT cores _(which would likely get you <0.01FPS), but ... yeah, nobody's trying that.


----------



## ksdomino (Nov 16, 2018)

Valantar said:


> Wait, what? It's been clear from long before cards were available that RTX would entail a performance loss. Some of the first commentary on Nvidia's promo material - across plenty of web sites - was "this is choppy, clearly below 60fps". Did you not look up any information whatsoever before buying?



*"REAL-TIME"*
Everything that I read said "realtime"..Even the Nvidia website today still has multiple instances of "Realtime raytracing" performed by dedicated cores.
Nobody expected real-time to mean lower-res (1080p is not acceptable at even half of this card's price point) or below 30 FPS (which is also unacceptable for people who buy higher-end cards).
Choppy is not equal to real-time.

*"DEDICATED"*
There's nothing dedicated about those cores, unless by "dedicated" they mean they are dedicated to harming performance!

1080p is a step backwards.
30FPS is a step backwards.
A 50% performance hit (making a card that costs over $1,200 effectively perform like a card that costs $300) is unacceptable.
Charging people more than twice as much as the previous generation is capitalism at its worst. If they keep driving prices up like this, we'll all be screwed (even those who can currently afford it).
I've never been a fanboy of anything, as I like to buy the best regardless of who made it, but this is my last time buying anything Nvidia.
Just because no one has competed with them doesn't mean they should be allowed to deceive, overcharge, and exploit their customers and their position this much! I feel really ripped off. Sorry for the rant.


----------



## Valantar (Nov 16, 2018)

ksdomino said:


> *"REAL-TIME"*
> Everything that I read said "realtime"..Even the Nvidia website today still has multiple instances of "Realtime raytracing" performed by dedicated cores.
> Nobody expected real-time to mean lower-res (1080p is not acceptable at even half of this card's price point) or below 30 FPS (which is also unacceptable for people who buy higher-end cards).
> Choppy is not equal to real-time.
> ...


I entirely agree with parts of your rant (these prices are beyond absurd, and are frankly an insult to customers - as was 10XX series pricing above the 1070), but you seem desperately in need of a "read before you buy" lesson.

-There was plenty of press coverage going into the choppiness of Nvidia's demos and how this didn't bode well for actual performance of a first-generation product like this - particularly given how real-time ray tracing (regardless of resolution) hasn't been possible outside of massive render farms up until now. There were analyses of resolution and video frame rate demonstrating clearly that the video demos were 1080p at below 60fps. You seem to have made your purchase decision based solely on Nvidia's advertising. A hint: advertisers aren't neutral arbiters of truth. They're (only) interested in selling you stuff. Period. So, if you want to avoid being screwed over, wait for third-party reviews, and stick to trustworthy sources for said reviews, not "influencers" and other paid shills. And if you keep reading breathless articles praising this new and revolutionary technology that's going to enhance your games 9999999x, revive the dog you had when you were a kid and make you a millionaire just by existing, stop reading, go somewhere else, and scratch the source in question off the "trustworthy sources" list. Trustworthy journalists don't write like that.

-"Real time (rendering/ray-tracing)" means nothing more than frames being displayed immediately as they're rendered, as opposed to pre-rendering and displaying in sequence later. Of course, in practice, any form of effective real-time rendering requires a rendering output rate quick enough to maintain visual fluidity - for which there are many standards: 15fps for old cartoons, 24fps for film, 30fps for console games (usually) and 60fps (or more) for anything fast-paced. Nvidia are not wrong when saying that they're doing real-time ray-traced rendering - it just doesn't live up to the standards of current gamers in terms of frame rate or resolution. This means that the tech isn't ready for prime time, but not that it isn't real-time. And what do sensible people do with tech that isn't ready for prime time? Either let it mature before buying, or buy it knowing that it's going to suck, but suck in a new and innovative way, and one that you can afford to waste money on.
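
Those frame-rate standards map directly onto per-frame time budgets, and a fixed-percentage hit scales frame times rather than subtracting a constant number of frames. A minimal sketch (the 80 fps baseline is a made-up illustration, not a benchmark):

```python
# Per-frame time budgets for common "real-time" standards, plus the
# effect of a ~50% performance hit on a hypothetical baseline.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at a given rate."""
    return 1000.0 / fps

for fps in (15, 24, 30, 60):
    print(f"{fps:>2} fps -> {frame_budget_ms(fps):5.2f} ms per frame")

# A 50% hit doubles each frame's cost, halving the frame rate:
baseline_fps = 80.0           # hypothetical DXR-off figure
with_dxr = baseline_fps * 0.5
print(f"{baseline_fps:.0f} fps baseline -> {with_dxr:.0f} fps with a 50% hit")
```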

-"Dedicated hardware" means the hardware is meant for a single (or single group of) job(s). Are you suggesting the RT cores either don't exist, or are used for other things than processing ray tracing? 'Cause there's nothing to indicate either of the two. Having dedicated hardware doesn't automatically imply that said hardware is _good enough_. You seem to think "dedicated hardware" means something entirely different than what it actually means.

Were you ripped off? Price wise? Sure. Absolutely. Nvidia is price gouging, this is beyond any doubt. In terms of Nvidia overpromising? Possibly. This is first-gen tech, with one single implementation in real life. It may well improve, but it's pretty much a given that it'll suck to begin with. The thing is, nobody has _ever_ mentioned this improving performance, though, nor was the necessary drop in resolution a surprise to anyone paying attention.

Don't be gullible. Wait for reviews. Spend your money wisely, and get _proven_, not promised, value for money. Don't pay people for making grandiose promises. That's how all the failed Kickstarted games, game consoles, and other gaming-related vaporware were funded, after all.


----------



## ksdomino (Nov 16, 2018)

I agree, the world is now a "buyer beware" marketplace with companies like Nvidia and these shark practices.
Your post is entirely correct; also, these guys knew:


TheGuruStud said:


> Totally worth wasting all that die space! If only people were warned XD





Aldain said:


> As expected , utter shat..





unikin said:


> OMFG! This is even worse, than I thought. RIP Ray tracing, see you resurrected in a decade.





unikin said:


> If I'd paid $1,300 for a GPU, I'd expect flawless RTX gaming at 4K, not a slideshow. Maybe an RTX 4080 Ti can deliver.


I'll wait next time and read more before buying.


----------



## purecain (Nov 16, 2018)

It's just an early-adopter thing... I wouldn't feel too sore. @ksdomino, you still have a blazing-fast GPU without RTX


----------



## GamersGen (Nov 17, 2018)

Can you somehow enable DXR on a 1080 Ti, or is it only available on RTX hardware?


----------



## Xzibit (Nov 17, 2018)

GamersGen said:


> Can you somehow enable DXR on a 1080 Ti, or is it only available on RTX hardware?



DXR has a Fallback Layer for developer purposes.

Not sure you'd want to even if it could be done. A 1080 Ti would be around 3-4x slower


----------



## Valantar (Nov 17, 2018)

Xzibit said:


> DXR has a Fallback Layer for developer purposes.
> 
> Not sure you'd want to even if it could be done. A 1080 Ti would be around 3-4x slower


3-4x? Really? If that's all the RT cores can do, that's not much. Usually the difference between dedicated hardware and general-purpose hardware is far higher than that, but of course I haven't seen anyone actually try this. Still, I'd expect an absolute slideshow, way below 1 fps. After all, if it were just 3-4x, they could see about the same gains just by filling the die area of TU104 with CUDA cores instead of RT cores.
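
For what it's worth, the claim is easy to sanity-check with back-of-the-envelope numbers (the 60 fps starting point below is hypothetical, chosen only to show the shape of the argument):

```python
# If a shader-based fallback were "3-4x slower" than hardware RT cores,
# an illustrative 60 fps DXR result would land in slideshow territory,
# but nowhere near "below 1 fps". Placeholder numbers, not measurements.

rtx_dxr_fps = 60.0
slowdown_low, slowdown_high = 3.0, 4.0

fallback_worst = rtx_dxr_fps / slowdown_high  # 4x slower case
fallback_best = rtx_dxr_fps / slowdown_low    # 3x slower case
print(f"Fallback estimate: {fallback_worst:.0f}-{fallback_best:.0f} fps")
```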


----------



## enxo218 (Nov 17, 2018)

Jensen Huang is an excellent salesman... the end


----------



## HTC (Nov 17, 2018)

This round of cards (in this case, RTX cards) simply doesn't have enough "horse power" to fully utilize RT: perhaps they can achieve this with the next gen of cards? Dunno, really.

What nVidia should have done was make a *stronger card for non-RT workloads* and leave RT to a dedicated add-on card, which could also feature much stronger RT capabilities. There would be several pros / cons to this approach:

Pros

- stronger non RT performance
- much smaller chips without the RT area
- possibly cheaper than current RTX cards
- stronger RT performance (????) due to a dedicated card with a bigger RT core (than that of current RTX cards)
- possibility of using RT @ greater than 1080p with 60+ FPS (minimums)

Cons

- necessity of a dedicated RT add-on card
- won't sell as much as a card that also has RT, like RTX cards do
- add-on card *likely* won't work with AMD GPU(s) present in the system
- much more expensive than current RTX cards due to being essentially two cards instead of one

There are perhaps other pros / cons that don't occur to me atm.


----------



## Valantar (Nov 17, 2018)

HTC said:


> This round of cards (in this case, RTX cards) simply doesn't have enough "horse power" to fully utilize RT: perhaps they can achieve this with the next gen of cards? Dunno, really.
> 
> What nVidia should have done was make a *stronger card for non-RT workloads* and leave RT to a dedicated add-on card, which could also feature much stronger RT capabilities. There would be several pros / cons to this approach:
> 
> ...


I agree that that's how it should have gone; the problem is that nobody would have bought an expensive add-on card that didn't do anything at all in current games, and no developer would waste resources on features whose required hardware has a zero-person install base. One solution would have been bundling the two together, though that would increase the cost and exclude ITX builds, and you'd risk people simply not installing the RT card.


----------



## HTC (Nov 17, 2018)

Valantar said:


> I agree that that's how it should have gone, *the problem is that nobody would have bought an expensive add-on card that didn't do anything at all in current games*, and no developers would waste resources developing features with a 0-person install base for required hardware. One solution would have been bundling the two together, though that would increase the cost and exclude ITX builds, and you'd risk that people simply didn't install the RT card.



Agreed, to some extent: these cards would sell a heck of a lot less than nVidia would like, but they'd still sell:

- enable RT @ far less detail with existing cards, including 1080-generation cards (think 1/3 or less of current low RT, so it's achievable even without RT cores): this would give developers the "excuse" to have the technology available despite the lack of dedicated hardware (if you can call current RTX cards that)
- wait for games that have RT enabled and then launch the add-on card
- showcase the superior performance / quality with the add-on card

Played right, I think it would sell quite a bit.

It could sell a lot more if nVidia allowed it to be used with an AMD GPU, but I *seriously doubt* they'd do that.


----------



## Valantar (Nov 17, 2018)

HTC said:


> Agreed, to some extent: these cards would sell a heck of a lot less than nVidia would like, but they'd still sell:
> 
> - enable RT @ far less detail with existing cards, including 1080-generation cards (think 1/3 or less of current low RT, so it's achievable even without RT cores): this would give developers the "excuse" to have the technology available despite the lack of dedicated hardware (if you can call current RTX cards that)
> - wait for games that have RT enabled and then launch the add-on card
> ...


Such an add-on card would likely benefit quite a lot from NVlink (syncing up RT rendering with what the GPU is doing, possibly using the GPU's VRAM), so I doubt it would work with anything not new or anything from AMD. Of course, they could make two SKUs, one with NVlink and one with old-style SLI or even one working over PCIe - that could at least serve as an argument for people to upgrade to newer GPUs in time.


----------



## HTC (Nov 17, 2018)

Valantar said:


> *Such an add-on card would likely benefit quite a lot from NVlink* (syncing up RT rendering with what the GPU is doing, possibly using the GPU's VRAM), *so I doubt it would work with anything not new or anything from AMD*. Of course, they could make two SKUs, one with NVlink and one with old-style SLI or even one working over PCIe - *that could at least serve as an argument for people to upgrade to newer GPUs in time.*



If that were the case, then they'd have a legitimate excuse for not allowing an AMD GPU in the system, but that would also prevent 1080-generation cards from using it, so dunno if that is a good plan, unless it's not doable otherwise due to lack of bandwidth or something like that.

Imagine if this could be used even with a 1080: how many users already have that card, or even the 1080 Ti? All of a sudden, the user base could grow substantially, so long as the card is reasonably priced.

For it to work, the add-on card must NOT help with anything other than RT capabilities, meaning it wouldn't boost non-RT frames, for example, or it would eat into 2080-generation card adoption.


----------



## FrodoLoggins (Nov 17, 2018)

It's not "Ultra Settings" when texture filtering is set to low...


----------



## W1zzard (Nov 17, 2018)

FrodoLoggins said:


> It's not "Ultra Settings" when texture filtering is set to low...


That's just a mistake in the settings screenshot; I clicked around too quickly. Congrats for being the first to notice


----------



## FrodoLoggins (Nov 18, 2018)

W1zzard said:


> That's just a mistake in the settings screenshot; I clicked around too quickly. Congrats for being the first to notice



If only my aim was as good as my eye  

Updating the image would be much appreciated.


----------



## EarthDog (Nov 18, 2018)

W1zzard said:


> That's just a mistake in the settings screenshot; I clicked around too quickly. Congrats for being the first to notice


Can you kindly take the time to share how this was tested? Like what scene? Apologies for chasing you around in threads, but I've asked a couple of times over the past several days without a response. Apologies if I missed one. 

@W1zzard


----------



## hat (Nov 18, 2018)

ksdomino said:


> This is an absolute nightmare!!
> I'm hoping it's all a mistake and those extra RTX cores that I paid double what the card is worth for are somehow not turned on. I expected a performance gain; now it's come to light that these new features cause a performance loss.
> COMPLETELY UNACCEPTABLE!
> No wonder they didn't show real game performance at launch.
> I can't believe Nvidia did this to us.


I mean, I'm not here to defend RTX or anything... but literally everyone has been telling everyone it was gonna run like shit since day 1. Even nVidia's own marketing couldn't pretty it up very much.


----------



## ksdomino (Nov 19, 2018)

HTC said:


> Agreed, to some extent: these cards would sell a heck-of-a-lot-less than nVidia would like but they'd still sell:


Not for $1,300+ they wouldn't.
Nvidia needed a way to raise prices (to over double those of the previous generation). Top-end laptop prices have doubled over the last two years, top-end phone prices have doubled too, and even CPUs have followed (the Intel 7700K cost $300 on release; the new 9900K costs $600). Nvidia felt they were missing out, so they used "real-time raytracing" to double their prices. Consumers be boned.


----------



## EarthDog (Nov 19, 2018)

The 7700K, with 4c/8t, is half the cost of an 8c/16t CPU... makes sense to me in that light. Also, MSRP to MSRP is $305 and $488* (where $488 is the 1K tray price; I'd expect a bit over $500 as MSRP).

While pricing has gone up with 'flagship' pieces, we shouldn't ignore the differences between the two parts either, as if having double the cores/threads isn't worth anything (to those who can use more than 4c/8t - which is more people than one would think).


----------



## ksdomino (Nov 19, 2018)

EarthDog said:


> The 7700K, with 4c/8t, is half the cost of an 8c/16t CPU... makes sense to me in that light. Also, MSRP to MSRP is $305 and $549.
> 
> While pricing has gone up with 'flagship' pieces, we shouldn't ignore the differences between the two parts either, as if having double the cores/threads isn't worth anything (to those who can use more than 4c/8t - which is more people than one would think).



Couldn't disagree more, for two reasons:
1. MSRP is a myth. Items never sell for MSRP.
2. Inflation.
Inflation means that, on average, you would have to spend 2.70% more money in 2018 than in 2017 for the same item. In other words, $100 in 2017 is equivalent in purchasing power to $102.70 in 2018.

Technological advances (however limited or hindered by greedy companies) have occurred since the beginning of time. "Cores", "hyperthreading", "real-time raytracing", [insert next big marketing phrase here] = at best a real-world increase of 10-15% in performance. There is no way I'll accept that as justification for a doubling of prices. Personally, I can't believe you're defending these practices, but to each their own; you're as entitled to an opinion as everyone else.
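
The inflation arithmetic in point 2 works out as follows (a minimal sketch using the 2.70% rate quoted above; the prices are illustrative):

```python
# Adjust a 2017 price for one year of inflation at 2.70%, then compare
# against a price that doubles in a single generation.

rate = 0.027
price_2017 = 100.0
price_2018 = price_2017 * (1 + rate)        # ~102.70
doubled = 2 * price_2017                    # generation-on-generation doubling

premium = (doubled / price_2018 - 1) * 100  # percent above inflation
print(f"${price_2017:.2f} in 2017 ~= ${price_2018:.2f} in 2018")
print(f"A doubled price sits {premium:.0f}% above the inflation-adjusted one")
```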


----------



## Valantar (Nov 19, 2018)

EarthDog said:


> The 7700K, with 4c/8t, is half the cost of an 8c/16t CPU... makes sense to me in that light. Also, MSRP to MSRP is $305 and $488* (where $488 is the 1K tray price; I'd expect a bit over $500 as MSRP).
> 
> While pricing has gone up with 'flagship' pieces, we shouldn't ignore the differences between the two parts either, as if having double the cores/threads isn't worth anything (to those who can use more than 4c/8t - which is more people than one would think).


It used to work like this: technology made progress, and things got better at the same price or cheaper for the same performance. This is mostly logical, as normally R&D costs are large but stable-ish (they tend to rise, but not by 2x in one generation), fab costs decrease drastically per transistor with node changes (and more as the node matures), and other costs are negligible. This generation has some anomalies: no new process, so similar cost per transistor as 7th Gen. Twice the cores, so twice the transistors for that. No new arch, so no more IPC, but also no real R&D cost compared to a node shrink or new arch. The size increase and lack of R&D ought to even out in terms of total cost. Yet instead, sales prices have increased dramatically. Something is off here, and it sure looks like Intel is padding its margins.


----------



## EarthDog (Nov 19, 2018)

One can't compare one CPU's MSRP versus another's current pricing (7700K and 9900K) either. The 7700K was also found for $350-$400 after initial release, mind you... a similar markup by % (that new-product, not-in-stock price), in fact. These prices will come back down _closer_ to MSRP as stock improves and the shiny new-product luster wears off.

That said, everyone can justify, or not, the addition of new technology towards higher pricing.  You don't have to accept it as justification, but you should at least consider the fact that it has double+ the processing power when all cores/threads are used. And like it or not, going with more cores/threads is the way the market is going (thank AMD for that one).

Now, I'm not happy about it either, don't misunderstand me, but the difference in performance between those two processors when using all the threads IS double.



Valantar said:


> It used to work like this: technology made progress, and things got better at the same price or cheaper for the same performance. This is mostly logical, as normally R&D costs are large but stable-ish (they tend to rise, but not by 2x in one generation), fab costs decrease drastically per transistor with node changes (and more as the node matures), and other costs are negligible. This generation has some anomalies: no new process, so similar cost per transistor as 7th Gen. Twice the cores, so twice the transistors for that. No new arch, so no more IPC, but also no real R&D cost compared to a node shrink or new arch. The size increase and lack of R&D ought to even out in terms of total cost. Yet instead, sales prices have increased dramatically. Something is off here, and it sure looks like Intel is padding its margins.


It's a twist on 14nm++. There were some changes here. There are R&D costs. These new cores aren't 'glued' onto the processor, and they work. So while it clearly isn't the same as a whole new arch, there are several factors involved.


----------



## Valantar (Nov 19, 2018)

EarthDog said:


> One can't compare one CPU's MSRP versus another's current pricing (7700K and 9900K). The 7700K was also found for $350+ after initial release, mind you... similar markup by %, in fact.
> 
> That said, everyone can justify, or not, the addition of new technology towards higher pricing.  You don't have to accept it as justification, but you should at least consider the fact that it has double+ the processing power when all cores/threads are used. And like it or not, going with more cores/threads is the way the market is going (thank AMD for that one).
> 
> Now, I'm not happy about it either, don't misunderstand me, but the difference in performance between those two processors when using all the threads IS double.


So, going by that logic and working backwards, the fastest single-core CPUs before dual cores arrived were around $63? 'Cause that's where you end up if you think the price should follow the number of cores, and divide the 9900K's ~$500 by 8 (disregarding IPC entirely, of course).

In other words: this is not how things work, this has never been how things work, and shouldn't be how things work.


----------



## ksdomino (Nov 19, 2018)

EarthDog said:


> Now, I'm not happy about it either, don't misunderstand me, but the difference in performance between those two processors when using all the threads IS double.


I'm glad you brought that up, since it allows me to point out that we seem to have gone a bit off the topic of RTX, and to state that the doubling of Nvidia's pricing with the RTX 2080 Ti is completely unjustified. I thought Intel was greedy; Nvidia is the worst. The government says my wages have to go up by a minimum of 2.7% (the inflation rate for 2018), but these companies are charging me double (because there is no competition in the market). That should not be legal. It is in fact illegal to "price fix" ( https://www.classlawgroup.com/antitrust/unlawful-practices/price-fixing/ ), but they get away with it somehow.


----------



## EarthDog (Nov 19, 2018)

Valantar said:


> So, going by that logic and working backwards, the fastest single-core CPUs before dual cores arrived were around $63? 'Cause that's where you end up if you think the price should follow the number of cores, and divide the 9900K's ~$500 by 8 (disregarding IPC entirely, of course).
> 
> In other words: this is not how things work, this has never been how things work, and shouldn't be how things work.


LOL, no.. just no.

That isn't how I think it works; I am just throwing in a different perspective. The fact IS that there are 2x the cores and threads on this CPU versus the other one. There are MANY factors that go into pricing it. But disparaging the pricing while ignoring any actual differences isn't looking at the big picture.



ksdomino said:


> I'm glad you brought that up, since it allows me to point out that we seem to have gone a bit off the topic of RTX, and to state that the doubling of Nvidia's pricing with the RTX 2080 Ti is completely unjustified. I thought Intel was greedy; Nvidia is the worst. The government says my wages have to go up by a minimum of 2.7% (the inflation rate for 2018), but these companies are charging me double (because there is no competition in the market). That should not be legal (it is in fact illegal to "run a monopoly", but they get away with it somehow).


It is legal... NVIDIA isn't a monopoly. Not with RTG popping out frighteningly mediocre GPUs. Here's hoping Intel's discrete entry can shake things up.

I digress... RTX thread. My apologies.


----------



## Valantar (Nov 19, 2018)

EarthDog said:


> LOL, no.. just no.
> 
> That isn't how I think it works, I am just throwing in a different perception. The fact IS that there are 2x the cores and threads on this CPU versus the other one. There are MANY factors which go into pricing it. But to disparage the pricing and ignore any actual differences, isn't looking at the big picture.


Let's go big picture then: Intel gave us four cores (with HT most of the time) with minimal IPC improvements for a decade, and prices _never_ dropped. They essentially flat out refused to increase core counts in the MSDT market, despite customers asking for it, instead taking every opportunity to sell ever-cheaper silicon (minimal increases in transistor count, per-transistor prices constantly dropping, relatively flat R&D costs) at the same price point. Then they're suddenly faced with competition, double the number of cores in a year, change nothing else, and they raise prices by 50%. Your justification is short-sighted and cuts Intel far too much slack. They've made it obvious over the last decade that they're only interested in padding margins.

And Nvidia is exactly the same. As are most corporations, really, but the ones with near-monopolistic market positions make it far more obvious.


----------



## EarthDog (Nov 19, 2018)

Who asked for it? The general PC-using population didn't ask for, want, or need it (more cores and threads)... quads have been out for nearly a decade, and only now are people saying a quad with HT (for a gamer) would be the low end.

But this is what happens when there is little competition. AMD lulled Intel to sleep for the better part of a decade with their sub-par architectures until Ryzen. This caused Intel to put out only incremental updates in IPC and clock speed. Since AMD went wide because they can't compete on clock speeds or overclocking headroom, this forced big blue to react and throw more cores/threads onto their incremental updates.

My justification and POV include more than just that talking point (but again, this isn't the time and place for a deep dive). What is myopic is seemingly ignoring the fact that it doubled the number of c/t with faster boost speeds and overclocking headroom. AMD shines in situations where it can use more threads for the same price, but falls to second place (of two) in situations outside of that. Both processors (and GPUs, LOL) have a place in the market. Just be sure the measuring stick is the same size for everything that is measured.

Cheers.


----------



## Valantar (Nov 19, 2018)

EarthDog said:


> Who asked for it? The general PC-using population didn't ask for, want, or need it (more cores and threads)... quads have been out for nearly a decade, and only now are people saying a quad with HT (for a gamer) would be the low end.
> 
> But this is what happens when there is little competition. AMD lulled Intel to sleep for the better part of a decade with their sub-par architectures until Ryzen. This caused Intel to put out only incremental updates in IPC and clock speed. Since AMD went wide because they can't compete on clock speeds or overclocking headroom, this forced big blue to react and throw more cores/threads onto their incremental updates.
> 
> ...


This is getting too OT even for me, but I'll give one last reply: I can't answer for anyone else here, but I'm certainly not ignoring the doubling of cores/threads. My view on this is simple: it's about d**n time. Intel deserves zero credit for this, given that they've been dragging their feet on increasing this for a full decade. As for who has been asking for it, I'd say most of the enthusiast community for the past 3-4 years? People have been begging Intel to increase core/thread counts outside of HEDT for ages, as the increases in IPC/per-thread perf have been minimal, giving people no reason to upgrade and forcing game developers to halt any CPU-demanding new features as there wouldn't be an install base capable of running it. Heck, we still have people running overclocked Sandy Bridge chips and doing fine, as the de-facto standard of 4c8t being the high end and 4c4t being common has caused utter stagnation in game CPU loads.

Where the core count increase does matter is in the doubling of silicon area required by the cores, which of course costs money - but CFL-R is still smaller than SB or IVB. Due to per-area cost increasing on denser nodes, the total die cost is likely higher than these, but nowhere near enough to justify a 50% price increase. Why? Because prices up until then had remained static, despite production becoming cheaper. In other words, Intel came into this with already padded margins, and decided to maintain these rather than do the sensible thing and bring price and production cost back into relation. That is quite explicitly screwing over end-users. Personally, I don't like that.

Also, what _I _find myopic is how you seemingly treat corporate greed as a law of nature rather than what it is: corporate greed. There is no necessity whatsoever in Intel ceasing innovation and padding margins when competition disappeared. Heck, you go one step further and blame Intel's greed on AMD, which is quite absurd. Here's a shocker: Intel could very well have kept innovating (i.e. maintained the status quo) or dropped prices as production got cheaper, no matter whether they had competition or not. That would sure have increased sales, at least.  Instead, they chose to be greedy, and you're acting like that's not a choice. It is. And please don't come dragging the "fiduciary duty" crap, as there's nothing in that saying that you have to put your customers' wallets through a blender to fulfill that.

Nobody here is judging different products by different standards; quite the opposite, we're taking the whole context into account. AMD's recent rebound is more impressive due to how far behind they are - but in pure numbers, they're still slightly behind. However, they crush Intel on price/perf across loads, and roughly match them for price/perf in gaming. That's not bad. I judge Intel more harshly, as they're in a massively advantageous position, and yet have managed to completely screw this up. They're barely clinging to their lead despite having nearly a decade to cement it. That's disappointing, to say the least, but not surprising when they've prioritized squeezing money out of their customers rather than innovation. And again, that's their choice, and they'll have to live with it.

And I suppose that's how this all ties back into the RTX debacle - hiking up prices for barely-tangible performance increases and the (so far quite empty) promise of future gains, seemingly mostly to pad out corporate margins as much as possible.


----------



## EarthDog (Nov 19, 2018)

LOL, there is so much to reply to... but I said I was leaving it alone (in this thread), and will.

One thing I do appreciate is a mature conversation without personal barbs. That is hard to find on TPU these days.


----------



## Xzibit (Nov 19, 2018)

Gamers Nexus did a deep dive into RTX in BF5


----------



## HTC (Nov 20, 2018)

ksdomino said:


> *Not for $1,300+ they wouldn't.*
> Nvidia needed a way to raise prices (to over double those of the previous generation). Top-end laptop prices have doubled over the last two years, top-end phone prices have doubled too, and even CPUs have followed (the Intel 7700K cost $300 on release; the new 9900K costs $600). Nvidia felt they were missing out, so they used "real-time raytracing" to double their prices. Consumers be boned.



You may have missed a big part of my post, dude: I was referring to a potential add-on card with RT capabilities, and I said nothing about pricing.

nVidia could have made 2000-series cards with *quite a bit more "horse power" if they did NOT have the RT capabilities built in*, and they could quite possibly sell them for current 2000-series prices: they could actually "get away" with it because the performance uplift vs the 1000 series would justify it.

Instead, they "gave us" RT capabilities that are evidently insufficient for today's high-end gaming, unless you find acceptable the requirement of going from 4K @ 50-80 FPS to 1080p @ 50-70 FPS in order to have partial RT (just reflections, for now).


----------



## ArbitraryAffection (Nov 20, 2018)

Oh dear. That doesn't even look _that good_. The RTX 20 series is a joke. The hardware isn't ready for real-time RT yet (and it shows), and you're footing the bill for the huge die sizes because they're built on a node that will be outdated in six months. Worth it, though, right?

Right?


----------



## MirageFL (Nov 24, 2018)

So that's it... the revolution... back to the 90s! Unreal Engine already did this with DX6 or DX7.


----------



## Aquinus (Nov 24, 2018)

All I have to say is that the moment DXR is turned on with a 2080 Ti, even on low, performance is less than a Vega 56's at 4K. For a 2080, performance at low is practically the same as a 580's. At 4K, that makes the game practically unplayable. If you want to crank it up, you'd better have a 2080 Ti and be running at 1080p. To me, that's unacceptable for a graphics card that costs over $1,000 USD. Even more so if you consider this little quote from the review:


> It is very important to realize that DXR will not take over rendering of the whole scene. Everything in Battlefield V, with DXR enabled, is rendered exactly the same way, looking exactly the same as with the regular DirectX 12 renderer. The only exception are surfaces that are marked as "reflective" by the developer during model/level design. When one of these surfaces is rendered, it will be fed with raytraced scene data to visually present accurate reflections, that do look amazing and more detailed than anything we've ever seen before.



So remember, you're losing 50-60% of your performance so *some* elements of the world look better. That's no good, no good at all.


----------



## Valantar (Nov 24, 2018)

One thing we need to remember here is that hybrid RT is in its very infancy, and there is no doubt it will improve dramatically over time. There are undoubtedly undiscovered or untried tricks and workarounds to make this perform better, like rendering the RT scene at a lower resolution and upscaling with DLSS, or hitherto unknown ways of compensating for reduced ray counts. However, it seems unlikely that we'll see >100% performance improvements with current hardware, at least in the next couple of years, which is more or less what is needed to bring this to acceptable levels of price/performance.

The previous generation brought us to solid 4K at 60 Hz or better, and now suddenly we're back down to half that, with worse performance than even two generations back, even at lower resolutions. Add in the dramatic price increases, and we're looking at something that simply isn't acceptable. Making RTRT feasible at all is great, but what Turing really shows us is that it isn't feasible yet, and given the necessary die area and the bleak outlook for future node shrinks on silicon, it might not be for a long, long time.
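
On the lower-resolution-plus-upscaling idea: primary ray count scales with pixel count, so tracing the RT scene at a reduced internal resolution cuts RT work roughly proportionally. A quick sketch (the resolutions are chosen purely for illustration):

```python
# Primary rays per frame scale with pixel count, so tracing the RT
# scene at 1080p and upscaling to 1440p saves a sizeable fraction
# of the ray-tracing work. Illustrative numbers only.

def pixels(width: int, height: int) -> int:
    return width * height

full = pixels(2560, 1440)     # display resolution
reduced = pixels(1920, 1080)  # internal RT resolution
savings = 1 - reduced / full
print(f"Tracing at 1080p instead of 1440p: {savings:.0%} fewer primary rays")
```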


----------



## mouacyk (Dec 6, 2018)

Using [image] to sell [image]. Well done.


----------



## purecain (Dec 9, 2018)

lol, I'm sticking with my non-RTX card for now...


----------

