
NVIDIA GeForce RTX 2060 Shows Up in Final Fantasy XV Benchmarks

T4C Fantasy

CPU & GPU DB Maintainer
Staff member
Joined
May 7, 2012
Messages
2,566 (0.56/day)
Location
Rhode Island
System Name Whaaaat Kiiiiiiid!
Processor Intel Core i9-12900K @ Default
Motherboard Gigabyte Z690 AORUS Elite AX
Cooling Corsair H150i AIO Cooler
Memory Corsair Dominator Platinum 32GB DDR4-3200
Video Card(s) EVGA GeForce RTX 3080 FTW3 ULTRA @ Default
Storage Samsung 970 PRO 512GB + Crucial MX500 2TB x3 + Crucial MX500 4TB + Samsung 980 PRO 1TB
Display(s) 27" LG 27MU67-B 4K, + 27" Acer Predator XB271HU 1440P
Case Thermaltake Core X9 Snow
Audio Device(s) Logitech G935 Headset
Power Supply SeaSonic Platinum 1050W Snow Silent
Mouse Logitech G903 Lightspeed
Keyboard Logitech G915
Software Windows 11 Pro
Benchmark Scores FFXV: 19329
The thing is, we don't get much of ray tracing anyway. It's just some reflective surfaces being implemented. To get a true feel for ray tracing, we need ten times the GPU power we have now, which is why I say this isn't ready for adoption. I completely agree with AMD saying that the low and mid range also need to be capable of it before they actually adopt it.
This isn't a bad move by Nvidia though; you need to start somewhere or it never gets adopted in the first place. The pricing is insane, but the tech is what we needed to begin with.
 
Joined
Mar 10, 2015
Messages
3,984 (1.11/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
IMO, Nvidia has painted themselves into a corner here. The RTX cards have no real performance advantage over their similarly priced predecessors, just a superficial step down in naming tier.

I don't think they painted themselves into a corner. They have to be Jedis to pull off a masterful release like this, considering a lot of the cards are sold out... They even managed to sell their three-year-old cards at MSRP with crypto in the toilet? Seriously, they must be Jedis... or Siths? Hmmm.

Ray tracing in video games is the step toward photorealism we have been expecting for 15 years.

First, I don't have an RTX series card and I have only seen pictures, BUT... what they are doing right now is not photorealistic at all and looks like absolute trash. Worse than not using RTRT. It looks like mirrors... everywhere... Like that, Master Mirror would. Yeesssssss.
 
Joined
Aug 6, 2009
Messages
1,162 (0.21/day)
Location
Chicago, Illinois
You know what the problem with Nvidia's "great idea" is? It's that they DID NOT make ray tracing for gamers. Ray tracing in games is merely a side project to them. The whole thing with ray tracing and AI cores was for professional GPUs, especially for the movie-making industry (where ray tracing is BIG and which everyone used CPUs for until now).
Then they came up with rebranding those GPU chips and making it a gaming feature, despite the fact this tech isn't really ready for wide adoption. As a result you lose 50% of performance in exchange for making mirrors out of puddles of water and every other even slightly reflective surface.

Who said Nvidia MADE ray tracing for gamers? Did someone say Nvidia made ray tracing? Nvidia has STARTED IMPLEMENTING ray tracing, and the execution definitely isn't great, but the idea behind it is; I believe the intentions were good. Not to mention they may actually make it perform decently with upcoming drivers.


If it has RTX then I suppose you will be more than happy to game at 720p 30 Hz for "only" $400. I, however, won't.

But what has AMD got to do with this? I don't remember bringing them up...

I don't know anyone personally who wants to game at 720p 30 Hz, period. AMD always has something to do with Nvidia since they both make GPUs; we need AMD and Nvidia in order to create a competitive market for us consumers, it's just not really competitive right now.

IMO, Nvidia has painted themselves into a corner here. The RTX cards have no real performance advantage over their similarly priced predecessors, just a superficial step down in naming tier. Efficiency is also identical, though at idle they're worse thanks to the massive die. And those massive dies make the cards very expensive to produce. We also know RTRT isn't viable even on the 2070, so including it on the 2060 would be a waste of silicon. So, where do they go from here? Launch a 2060 without RT that matches the 1070, costs the same, uses the same power, but has a new name? If so, what justifies the price for a 60-tier card? Why not just keep the 1070 in production? Or do they ditch RT but keep the tensor cores for DLSS? That would do something, I suppose, but you'd still expect a hefty price drop from losing the RT cores. The thing that makes the most sense logically is to launch a card without RT that matches the 2070 at a cheaper price (losing both RT and normal gaming perf ought to mean double savings, right?), but then nobody would buy the 2070, as RTRT won't be viable for years.

Knowing Nvidia and the realities of Turing, they'll launch an overpriced mid-range card that performs decently but has terrible value.

I'll wait for Navi, thanks.

Ha, you can say Nvidia has painted themselves into a corner, but the corner they painted themselves into happens to be a corner that's 75% of the whole area.

I think tensor cores are already being adopted in protein folding, because my 2080 Ti folds 2 to 3x faster than the fastest 1080 Ti, which makes the pricing justified in folding rigs, MSRP vs. MSRP.

My 1080 Ti folds for 850k with no CPU slot;
the 2080 Ti does 2.6M, no CPU.

Awesome, that's a pro that doesn't get mentioned at all. This is the first time I'm hearing about how powerful the 2080 Ti can be.
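For rough context, here's a back-of-the-envelope points-per-dollar sketch using the folding figures quoted above. It's only an illustration: the launch prices are assumptions (roughly the Founders Edition MSRPs), and the quoted numbers are treated as points per day.

Code:
# Back-of-the-envelope folding value comparison. The PPD figures come from the
# post above; the launch prices are assumed Founders Edition MSRPs, not
# numbers from this thread.
cards = {
    "GTX 1080 Ti": {"ppd": 850_000,   "price_usd": 699},   # "850k, no CPU slot"
    "RTX 2080 Ti": {"ppd": 2_600_000, "price_usd": 1199},  # "2.6M, no CPU"
}

for name, c in cards.items():
    print(f"{name}: {c['ppd'] / c['price_usd']:,.0f} PPD per dollar")

speedup = cards["RTX 2080 Ti"]["ppd"] / cards["GTX 1080 Ti"]["ppd"]
print(f"Raw folding speedup: {speedup:.1f}x")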

Maybe I am mistaken, but isn't this like the nth time the "2060 showed up" in the same Final Fantasy benchmark?



First of all, the "idea" was always there; nothing revolutionary on that front. And the problem was the execution? Then what portion of it was alright? Might as well say it all failed in its entirety.



It certainly is moving, slightly backwards. PC gaming is starting to look like the fashion industry: 1080p under 60 fps is the new black.



I honestly wouldn't blame him; who wouldn't try to justify purchasing said product? But yeah, the shirt must have been worth it.

How many prototypes do you have lying around that work right off the bat? How many of your prototypes are perfect right out of the build house? That's the point, and others have made it as well; you and others have to get it through your heads: we have to start somewhere.


The thing is, we don't get much of ray tracing anyway. It's just some reflective surfaces being implemented. To get a true feel for ray tracing, we need ten times the GPU power we have now, which is why I say this isn't ready for adoption. I completely agree with AMD saying that the low and mid range also need to be capable of it before they actually start implementing the first features.
Attempting to actually run ray tracing right now is just silly; there's a lot of work that needs to be done. Had Nvidia mentioned it and advertised it better, it would have been interesting to see how it all panned out, BUT they tried to sell people on the 'feature'. That's where they went wrong, and the people who bought into it are super salty, I guess.


This isn't a bad move by Nvidia though; you need to start somewhere or it never gets adopted in the first place. The pricing is insane, but the tech is what we needed to begin with.

"you need to start from somewhere"- Well that's just ridiculous! We all deserve the best right now and we don't expect any further advancements in this 'new' technology and 'feature'. /sarcasm

Unfortunately a lot of people just don't understand how things are made. You would think people would be a lot more understanding with all the revisions hardware goes through, especially on a tech forum, but... nope!

I don't think they painted themselves into a corner. They have to be Jedis to pull off a masterful release like this, considering a lot of the cards are sold out... They even managed to sell their three-year-old cards at MSRP with crypto in the toilet? Seriously, they must be Jedis... or Siths? Hmmm.



First, I don't have an RTX series card and I have only seen pictures, BUT... what they are doing right now is not photorealistic at all and looks like absolute trash. Worse than not using RTRT. It looks like mirrors... everywhere... Like that, Master Mirror would. Yeesssssss.

People like to believe that Nvidia is both the devil they hate and a company doing poorly with its sales. Sure, they're not the most consumer-friendly company and they care more about the dollars they make, BUT that hasn't slowed down sales of these "failed" cards and technology.

The majority of consumers they're selling these cards (the 2080 Ti) to probably don't give a damn about what we say here on TechPowerUp or any other forum. They probably have lots of disposable income and buy a couple of cards. They might not even know how to enable ray tracing in the one game or application that supports it. :D
 

T4C Fantasy

CPU & GPU DB Maintainer
Staff member
Joined
May 7, 2012
Messages
2,566 (0.56/day)
Location
Rhode Island
System Name Whaaaat Kiiiiiiid!
Processor Intel Core i9-12900K @ Default
Motherboard Gigabyte Z690 AORUS Elite AX
Cooling Corsair H150i AIO Cooler
Memory Corsair Dominator Platinum 32GB DDR4-3200
Video Card(s) EVGA GeForce RTX 3080 FTW3 ULTRA @ Default
Storage Samsung 970 PRO 512GB + Crucial MX500 2TB x3 + Crucial MX500 4TB + Samsung 980 PRO 1TB
Display(s) 27" LG 27MU67-B 4K, + 27" Acer Predator XB271HU 1440P
Case Thermaltake Core X9 Snow
Audio Device(s) Logitech G935 Headset
Power Supply SeaSonic Platinum 1050W Snow Silent
Mouse Logitech G903 Lightspeed
Keyboard Logitech G915
Software Windows 11 Pro
Benchmark Scores FFXV: 19329
FB_IMG_1542950658956.jpg

Untitled.jpg


The first one was my 1080 Ti with a combined CPU score.
 
Joined
Feb 18, 2013
Messages
2,186 (0.51/day)
Location
Deez Nutz, bozo!
System Name Rainbow Puke Machine :D
Processor Intel Core i5-11400 (MCE enabled, PL removed)
Motherboard ASUS STRIX B560-G GAMING WIFI mATX
Cooling Corsair H60i RGB PRO XT AIO + HD120 RGB (x3) + SP120 RGB PRO (x3) + Commander PRO
Memory Corsair Vengeance RGB RT 2 x 8GB 3200MHz DDR4 C16
Video Card(s) Zotac RTX2060 Twin Fan 6GB GDDR6 (Stock)
Storage Corsair MP600 PRO 1TB M.2 PCIe Gen4 x4 SSD
Display(s) LG 29WK600-W Ultrawide 1080p IPS Monitor (primary display)
Case Corsair iCUE 220T RGB Airflow (White) w/Lighting Node CORE + Lighting Node PRO RGB LED Strips (x4).
Audio Device(s) ASUS ROG Supreme FX S1220A w/ Savitech SV3H712 AMP + Sonic Studio 3 suite
Power Supply Corsair RM750x 80 Plus Gold Fully Modular
Mouse Corsair M65 RGB FPS Gaming (White)
Keyboard Corsair K60 PRO RGB Mechanical w/ Cherry VIOLA Switches
Software Windows 11 Professional x64 (Update 23H2)
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I don't think they painted themselves into a corner. They have to be Jedis to pull off a masterful release like this, considering a lot of the cards are sold out... They even managed to sell their three-year-old cards at MSRP with crypto in the toilet? Seriously, they must be Jedis... or Siths? Hmmm.
None of what you said here applies to what I was saying at all, but sure, Nvidia is very good at selling overpriced gear. Still doesn't mean their current portfolio can extend downwards in any reasonable way.

Ha, you can say Nvidia has painted themselves into a corner, but the corner they painted themselves into happens to be a corner that's 75% of the whole area.
Sure, it's not a small corner, but as I pointed out, there still isn't a sensible route out of it - no way of giving an actual performance upgrade to the millions who own 1060s (or 980s) for an acceptable price, so these people will just keep their current hardware.

When I bought my $700 Fury X in 2015, I didn't expect I'd have to pay significantly more to get a tangible performance upgrade nearly four years later. That doesn't make sense no matter how you look at it.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,988 (2.96/day)
Location
Jyväskylä, Finland
System Name 4K-gaming / media-PC
Processor AMD Ryzen 7 5800X / Intel Core i7-6700K
Motherboard Asus ROG Crosshair VII Hero / Asus Z170-K
Cooling Alphacool Eisbaer 360 / Alphacool Eisbaer 240
Memory 32GB DDR4-3466 / 16GB DDR4-3000
Video Card(s) Asus RTX 3080 TUF OC / Powercolor RX 6700 XT
Storage 3.3TB of SSDs / several small SSDs
Display(s) Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS
Case Corsair 4000D AF White / DeepCool CC560 WH
Audio Device(s) Sony WH-CN720N
Power Supply EVGA G2 750W / Fractal ION Gold 550W
Mouse Logitech MX518 / Logitech G400s
Keyboard Roccat Vulcan 121 AIMO / NOS C450 Mini Pro
VR HMD Oculus Rift CV1
Software Windows 11 Pro / Windows 11 Pro
Benchmark Scores They run Crysis
I'm pretty sure this is going to be a cut-down RTX 2070 with 6 GB of 192-bit memory, since the 2070 doesn't share the same die as the 2080.

And the x06 chip has been the -60 cards' chip for years; this time, though, it went into the 2070 instead of the familiar x04 used for the -80 card.
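If it really is a 192-bit bus, the bandwidth is easy to sanity-check. A minimal sketch, assuming 14 Gbps GDDR6 like the other Turing cards (the actual memory speed of this part is not confirmed anywhere in this thread):

Code:
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    # Peak memory bandwidth in GB/s: bus width in bits -> bytes, times per-pin data rate in Gbps.
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gb_s(192, 14.0))  # 336.0 GB/s for the assumed 192-bit config
print(peak_bandwidth_gb_s(256, 14.0))  # 448.0 GB/s, the RTX 2070's 256-bit bus, for comparison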
 
Joined
Jan 8, 2017
Messages
9,508 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
How many prototypes do you have lying around that work right off the bat? How many of your prototypes are perfect right out of the build house?

You're funny. Don't try to pass these things off as "prototypes"; this ain't no prototype. It's a product out there, on the market. The even funnier thing is that not even Nvidia would agree with you: they claim Turing has been in development for 10 years, and that's plenty of time to work out the feasibility of the features they wanted to include. A prototype, after a decade in development? Wow, I am amazed by all the things you people come up with to put your favorite brand in a better light.

we have to start somewhere.

Of course, but get this: some starting points are better than others.
 
Last edited:
Joined
Mar 10, 2015
Messages
3,984 (1.11/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
Still doesn't mean their current portfolio can extend downwards in any reasonable way.

Ah, but it does. And it can. Watch these lower cards, because if they had asses that is all you will see... headed out the door without your phone number. Every single one will sell, regardless of the strengths/weaknesses.
 
Joined
Aug 6, 2009
Messages
1,162 (0.21/day)
Location
Chicago, Illinois
View attachment 111145
View attachment 111146

The first one was my 1080 Ti with a combined CPU score.

"IMPRESSIVE!"

Thank you, sir. I just try to be logical and rational when thinking about and discussing things. Even in politics I usually don't have any bias; others like to scream and raise pitchforks!

Sure, it's not a small corner, but as I pointed out, there still isn't a sensible route out of it - no way of giving an actual performance upgrade to the millions who own 1060s (or 980s) for an acceptable price, so these people will just keep their current hardware.

Okay, so just because you don't agree with the pricing there is no way of getting an actual performance upgrade? What? Should people be mad at Rolls-Royce because they can't upgrade from their Honda Civic or Ford Focus? Your post sounds just as silly, albeit not as extreme. If you want things you have to pay for them; the upgrades we want are rarely out of necessity, we just want better things.

You're funny. Don't try to pass these things off as "prototypes"; this ain't no prototype. It's a product out there, on the market. The even funnier thing is that not even Nvidia would agree with you: they claim Turing has been in development for 10 years, and that's plenty of time to work out the feasibility of the features they wanted to include. A prototype, after a decade in development? Wow, I am amazed by all the things you people come up with to put your favorite brand in a better light.



Of course, but get this: some starting points are better than others.

I used the term prototype hoping you might actually grasp what I was saying; it clearly didn't work. No, they are not prototypes, and yes, these cards are out on the market, BUT they're still in the early stages; the cards AND the tech in these cards are still in the early stages. Ten years is quite a long time; I'm sure they had BF V and ray tracing for all of those 10 years too, they just didn't even try, I guess. The classic go-to for some of you guys is to make someone seem like, or call them, a fanboy. I have an AMD-powered laptop, a desktop with an R9 290 in it, and have had a few Phenom processors. I am not the kind of person who has a favorite brand; I buy what I believe is best. Sorry to burst your bubble, but I have no brand loyalty... well, unless you're talking about car audio, in which case I purchase only Digital Designs amplifiers and subwoofers.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Ah, but it does. And it can. Watch these lower cards, because if they had asses that is all you will see... headed out the door without your phone number. Every single one will sell, regardless of the strengths/weaknesses.
Anything Nvidia makes sells. That's not being questioned. Uninformed people with too much money are not a scarce resource. What is being questioned is whether it'll be anything but a terrible value - which the current series at least doesn't bode well for. Matching the 1070's performance for the same price, but with a bunch of die area spent on RT cores that can't be used for anything at a playable resolution? I'd rather buy a 1070, thanks. Matching the 1070's performance at the same price while ditching RT entirely? Why bother? Why spend the money on designing a new chip rather than keep making the 1070? And what legitimizes the price not dropping more from the 2070 if it lacks RT entirely? The point is that the only sensible solution is to cut RT out entirely for this tier and price it at a proper upper-midrange level, around $300. Which is never, ever going to happen. Not only 'cause it leaves a whopping $300 gap in Nvidia's lineup, but because they've shown time and time again that they don't give a damn about providing any kind of value, and for this generation they seem laser-focused on keeping price-performance parity (or even an increase) with the previous one.
 
Joined
Jan 8, 2017
Messages
9,508 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I used the term prototype hoping you might actually grasp what I was saying; it clearly didn't work.

I know what you are saying: that these things are great under the hood, and all their faults are to be overlooked because they are just first-generation technologies with great prospects. And yes, it didn't work, because why would it? Am I mandated to agree with that?

If Nvidia had waited just one more year, or perhaps even less than that, they could have come out with a product better equipped to fulfill the role they wanted to boast about. After all, everyone is convinced AMD won't have anything on the table for millennia to come, and even if it had, it's not like it would have marked the end of Nvidia's massive market share. Yet they didn't; they ventured on with features that clearly provide a poor experience.

So what was the hurry? Once you start looking at it in a slightly more than superficial way, you realize this was indeed a bad starting point and, no, it won't have great prospects because of that.

I have an AMD-powered laptop, a desktop with an R9 290 in it, and have had a few Phenom processors. I am not the kind of person who has a favorite brand; I buy what I believe is best.

Nah, that argument never works. I have a 1060 and I've been called an AMD fanboy more than a couple of times.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Okay, so just because you don't agree with the pricing there is no way of getting an actual performance upgrade? What? Should people be mad at Rolls-Royce because they can't upgrade from their Honda Civic or Ford Focus? Your post sounds just as silly, albeit not as extreme. If you want things you have to pay for them; the upgrades we want are rarely out of necessity, we just want better things.
Frankly, I have no idea what you're trying to say here. Technological development has always worked more or less like this: a top-of-the-line product exists with performance level X, at price level A. The required technologies improve, and a new product is launched with added performance (perhaps X+y%), but it stays at price level A. Slower products slot in below A, with ones matching the previous product's performance now being cheaper. This makes sense in terms of production costs, economies of scale, maintaining sales, and driving customer interest, and you build and maintain an expectation that for spending A each generation you get a y% performance increase, or if you skip two generations you get double that. Instead, this generation you get a side-grade, with zero added value.

What Nvidia is doing now is instead keeping subsequent generations at price-performance parity, so that for both generations, you get performance X at price A, with added price tiers above A for added performance. This kind of change only makes sense if the product in question is so revolutionary as to legitimize this price. RTX isn't, and first-gen RTX will never be - it isn't powerful enough, as evidenced by BFV. And the price levels they're operating at are high enough that they're never going to sell as many 2070s as they sold 1070s. No way, no how. Sales match price tiers very closely, so it's likely a 2060 priced to match a 1070 will also roughly match its sales. The 2080 Ti likely sells more than the Titan Xp did, but only because it's the only tangible performance upgrade from the previous generation, and at that point you're in the "wealthy enthusiast" segment anyhow, where price matters far less than it does below $600 or so. This is not smart. Period.
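To put the parity argument in concrete terms, here's a toy perf-per-dollar calculation. The performance indices and launch prices below are rough placeholder numbers chosen only to illustrate the pattern being described, not measured results.

Code:
# Toy illustration of price-performance parity across generations.
# Performance indices and prices are rough placeholders, not benchmark data.
old_gen = {"GTX 1070": (100, 379), "GTX 1080": (125, 599)}   # (relative perf, price $)
new_gen = {"RTX 2070": (125, 599), "RTX 2080": (155, 799)}   # hypothetical round numbers

def perf_per_dollar(lineup):
    return {name: round(perf / price, 3) for name, (perf, price) in lineup.items()}

print(perf_per_dollar(old_gen))
print(perf_per_dollar(new_gen))
# With numbers like these, the new card at a given price delivers roughly the
# same perf/$ as the old card at that price - more performance only comes from
# spending more, not from a new generation at the same price point.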
 
Joined
Mar 10, 2015
Messages
3,984 (1.11/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
What Nvidia is doing now is instead keeping subsequent generations at price-performance parity, so that for both generations, you get performance X at price A, with added price tiers above A for added performance.

That is the whole point. They fully understand that they can sell a turd for the price of gold. And they do. And people love it. They have made it to Apple status.

Customers: It's too expensive!
Nvidia: Well, that is because the fan shroud is made from unicorn tears. They are sad because we strip-mined their natural habitat so that we could put extra stardust into these RT cores.
Customers: Take my money! It's awesome!

They have hit the point where what their product does or costs doesn't matter anymore. Only that it is new and shiny! And might have a new feature... that you might be able to use. Someday. But you will likely have to buy the next-gen new and shiny thing.

I guess I should really say silver instead of turd, as there is a decent non-RTRT increase (2080 Ti only).

Edited for spelling.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
That is the whole point. They fully understand that they can sell a turd for the price of gold. And they do. And people love it. They have made it to Apple status.

Customers: It's too expensive!
Nvidia: Well, that is because the fan shroud is made from unicorn tears. They are sad because we strip-mined their natural habitat so that we could put extra stardust into these RT cores.
Customers: Take my money! It's awesome!

They have hit the point where what their product does or costs doesn't matter anymore. Only that it is new and shiny! And might have a new feature... that you might be able to use. Someday. But you will likely have to buy the next-gen new and shiny thing.

I guess I should really say silver instead of turd, as there is a decent non-RTRT increase (2080 Ti only).

Edited for spelling.
You're probably not wrong. I guess they looked back at the 700 series (which also had no new process node or arch compared to the previous gen) and thought, "We didn't make enough money with that. How about we tack on some stuff that barely works and charge double for it?"
 
Joined
Sep 15, 2007
Messages
3,946 (0.63/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
There's a reason this bench was chosen if anyone can figure it out...it's b/c the game is garbage.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
There's a reason this bench was chosen if anyone can figure it out...it's b/c the game is garbage.
I don't know if the game is, but the benchmark has been proven to be so a long time ago, yeah. Entirely unreliable.
 

T4C Fantasy

CPU & GPU DB Maintainer
Staff member
Joined
May 7, 2012
Messages
2,566 (0.56/day)
Location
Rhode Island
System Name Whaaaat Kiiiiiiid!
Processor Intel Core i9-12900K @ Default
Motherboard Gigabyte Z690 AORUS Elite AX
Cooling Corsair H150i AIO Cooler
Memory Corsair Dominator Platinum 32GB DDR4-3200
Video Card(s) EVGA GeForce RTX 3080 FTW3 ULTRA @ Default
Storage Samsung 970 PRO 512GB + Crucial MX500 2TB x3 + Crucial MX500 4TB + Samsung 980 PRO 1TB
Display(s) 27" LG 27MU67-B 4K, + 27" Acer Predator XB271HU 1440P
Case Thermaltake Core X9 Snow
Audio Device(s) Logitech G935 Headset
Power Supply SeaSonic Platinum 1050W Snow Silent
Mouse Logitech G903 Lightspeed
Keyboard Logitech G915
Software Windows 11 Pro
Benchmark Scores FFXV: 19329
I don't know if the game is, but the benchmark has been proven to be so a long time ago, yeah. Entirely unreliable.
The benchmark isn't bad at all; the scores can vary because of CPU and memory, just like 3DMark. From experience, it's a reliable benchmark.
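Repeatability is easy enough to check for yourself; a minimal sketch (the run scores below are invented, plug in your own results from repeated runs at the same preset):

Code:
import statistics

# Invented example scores from repeated runs at one preset; substitute your own.
runs = [19_329, 19_410, 19_255, 19_388, 19_301]

mean = statistics.mean(runs)
stdev = statistics.stdev(runs)
print(f"mean={mean:.0f}  stdev={stdev:.0f}  run-to-run spread={stdev / mean:.2%}")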
 
Last edited:
Joined
Dec 31, 2009
Messages
19,372 (3.54/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Last edited:

T4C Fantasy

CPU & GPU DB Maintainer
Staff member
Joined
May 7, 2012
Messages
2,566 (0.56/day)
Location
Rhode Island
System Name Whaaaat Kiiiiiiid!
Processor Intel Core i9-12900K @ Default
Motherboard Gigabyte Z690 AORUS Elite AX
Cooling Corsair H150i AIO Cooler
Memory Corsair Dominator Platinum 32GB DDR4-3200
Video Card(s) EVGA GeForce RTX 3080 FTW3 ULTRA @ Default
Storage Samsung 970 PRO 512GB + Crucial MX500 2TB x3 + Crucial MX500 4TB + Samsung 980 PRO 1TB
Display(s) 27" LG 27MU67-B 4K, + 27" Acer Predator XB271HU 1440P
Case Thermaltake Core X9 Snow
Audio Device(s) Logitech G935 Headset
Power Supply SeaSonic Platinum 1050W Snow Silent
Mouse Logitech G903 Lightspeed
Keyboard Logitech G915
Software Windows 11 Pro
Benchmark Scores FFXV: 19329
3DMark is pretty consistent/repeatable. It also responds to other system changes like CPU speed, cores/threads, and memory speed/timings. For those, the CPU score (as you guys know) plays a role in the overall result.


RE: FF XV - https://www.gamersnexus.net/game-be...ous-misleading-benchmark-tool-gameworks-tests

For comparing it to the game, it's a poor bench. I don't think it was fixed(?), but a DLSS version came out! :p
It's consistent if you run the standard presets, like 3DMark, and there really is no difference except that the FFXV benchmark looks better.

Gamers Nexus used custom presets and ruined the benchmark's reputation. At least in my thread it's an even playing field, where your score is only allowed under the standard 1080p preset:
https://www.techpowerup.com/forums/threads/post-your-final-fantasy-xv-benchmark-results.242200/
 
Joined
Sep 15, 2011
Messages
6,762 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
The benchmark isn't bad at all; the scores can vary because of CPU and memory, just like 3DMark. From experience, it's a reliable benchmark.
Exactly correct.
A benchmark that doesn't do proper object culling and renders offscreen objects at all times is bad. Period. Poorly developed, poorly implemented, poorly optimized.
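For anyone unfamiliar with the term, object culling just means skipping work for geometry the camera can't see. A minimal sketch of the idea, a simplified bounding-sphere-vs-frustum test that has nothing to do with the actual FFXV engine:

Code:
# Simplified view-frustum culling: skip draw calls for objects whose bounding
# sphere lies entirely outside any frustum plane. Each plane is (normal, d),
# with the normal pointing into the visible volume, so dot(n, p) + d >= 0
# means point p is on the visible side of that plane.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def is_visible(center, radius, frustum_planes):
    return all(dot(n, center) + d >= -radius for n, d in frustum_planes)

# Toy frustum reduced to a near plane (z >= 1) and a far plane (z <= 100).
planes = [((0.0, 0.0, 1.0), -1.0), ((0.0, 0.0, -1.0), 100.0)]

objects = {
    "behemoth":       ((0.0, 0.0, 50.0), 2.0),   # in front of the camera
    "offscreen rock": ((0.0, 0.0, 500.0), 2.0),  # far beyond the far plane
}
draw_list = [name for name, (c, r) in objects.items() if is_visible(c, r, planes)]
print(draw_list)  # only 'behemoth' gets a draw call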
 
Joined
Jun 10, 2014
Messages
2,995 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Is there any confirmation of whether it's a GTX 2060 or an RTX 2060?

The difference would be that RTX has RT capabilities while GTX doesn't. If this card doesn't have them, will Nvidia still stick to the RTX naming scheme anyway?
No, and I wouldn't read too much into the naming, even if this benchmark entry itself is legit. In nearly all cases, unreleased cards will show up with just a device ID in pre-release drivers. Even if someone has gotten their hands on an engineering sample, or even a qualification sample identical to the upcoming product, it's not going to display the final product name. This is something people should keep in mind when reading rumors about any GPU, from both AMD and Nvidia.
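As an aside, this is roughly how those "device ID only" sightings happen: the driver's string table maps PCI device IDs to marketing names, and unannounced parts tend to sit there under a generic placeholder. The snippet below is purely illustrative; the layout approximates how NVIDIA driver INF strings look, but the specific entries are made up.

Code:
import re

# Made-up INF-style strings section; the 'NVIDIA_DEV.<id> = <name>' layout is
# approximately what driver INFs look like, but these entries are invented.
inf_strings = '''
NVIDIA_DEV.1E87 = "NVIDIA GeForce RTX 2080"
NVIDIA_DEV.1FAB = "NVIDIA Graphics Device"
NVIDIA_DEV.1FCD = "NVIDIA Graphics Device"
'''

pattern = re.compile(r'NVIDIA_DEV\.([0-9A-F]{4}) = "([^"]+)"')
for dev_id, name in pattern.findall(inf_strings):
    tag = "unannounced placeholder?" if name == "NVIDIA Graphics Device" else "named product"
    print(f"{dev_id}: {name} ({tag})")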
 