
NVIDIA RTX 2080 / 2080 Ti Results Appear For Final Fantasy XV

Joined
Mar 29, 2018
Messages
590 (0.24/day)
As demanding as Final Fantasy XV is, I was worried these cards would struggle with it. Now I'd feel I got my money's worth out of a new 20-series card [that's if I were wanting or willing to buy one to start with].
 

crazyeyesreaper

Not a Moderator
Staff member
Joined
Mar 25, 2009
Messages
9,816 (1.72/day)
Location
04578
System Name Old reliable
Processor Intel 8700K @ 4.8 GHz
Motherboard MSI Z370 Gaming Pro Carbon AC
Cooling Custom Water
Memory 32 GB Crucial Ballistix 3666 MHz
Video Card(s) MSI RTX 3080 10GB Suprim X
Storage 3x SSDs 2x HDDs
Display(s) ASUS VG27AQL1A x2 2560x1440 8bit IPS
Case Thermaltake Core P3 TG
Audio Device(s) Samson Meteor Mic / Generic 2.1 / KRK KNS 6400 headset
Power Supply Zalman EBT-1000
Mouse Mionix NAOS 7000
Keyboard Mionix
As demanding as Final Fantasy XV is, I was worried these cards would struggle with it. Now I'd feel I got my money's worth out of a new 20-series card [that's if I were wanting or willing to buy one to start with].
Keep in mind FF XV is getting DLSS, so with the tensor cores the supersampling is essentially free. In theory that means fewer jaggies and a sharper, cleaner image without a performance loss, so the performance gap in an apples-to-apples comparison will likely grow far larger. That said, I myself won't be buying an RTX-series card; I'm sticking with my 1080 Ti for the time being.
 
Joined
Sep 8, 2018
Messages
51 (0.02/day)
It would be nice to see some modern benchmarks with the latest drivers in modern AAA games with SLI and CrossFire, @W1zzard.

It's been a while since anyone has shown any, honestly.

That's because SLI is basically dead.

https://uk.hardware.info/reviews/8113/crossfire-a-sli-anno-2018-is-it-still-worth-it

Game support keeps shrinking, DX12 doesn't help, and the performance gains are poor.

You basically get 5-15% more total FPS from doubling the GPUs (2x 1080 Ti) in most games, with a few outliers. 5-15% more FPS for 100% more money isn't worth it, in my view. To quote them:
"Furthermore, micro-stuttering remains an unresolved problem. It varies from game to game, but in some cases 60 fps with CrossFire or SLI feels a lot less smooth than the same frame rate on a single GPU. What's more, you're more likely to encounter artefacts and small visual bugs when you're using multiple GPUs as sometimes the distribution of the graphic workload doesn't seem to be entirely flawless."

So you get 5-15%, but a frame rate that feels less smooth. Better to spend the extra cash on the best single GPU in the lineup to get the best performance.
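The cost-effectiveness argument above can be made concrete with a quick sketch. All the numbers here (baseline FPS, card price) are illustrative assumptions, not figures from the linked review; only the 5-15% scaling range comes from the post.

```python
# Hypothetical figures for illustration: one 1080 Ti versus an SLI pair,
# using the best case (+15%) of the scaling range quoted above.
def fps_per_dollar(fps: float, price: float) -> float:
    """Frames per second obtained per dollar spent."""
    return fps / price

single_fps, card_price = 100.0, 700.0        # assumed baseline figures
sli_fps = single_fps * 1.15                  # best-case +15% from the quote

print(fps_per_dollar(single_fps, card_price))   # ~0.143 fps per dollar
print(fps_per_dollar(sli_fps, 2 * card_price))  # ~0.082 fps per dollar
```

Even at the top of the quoted range, doubling the spend roughly halves the frames you get per dollar.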
 
Joined
Sep 15, 2007
Messages
3,946 (0.63/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
Well, I can understand you're used to the huge strides going from GCN 1.0 to 1.1 to 1.2 and so on, but Nvidia can't possibly keep up with that. Though rumor has it these chips were actually meant for 10 nm, but there's no 10 nm capacity available at the moment.

I'm also curious how you derive performance from specs without knowing the frequency these chips run at, since that alone can swing things by ±25%.

The chips are maxed out on boost, which I'd wager is about the same as current cards (~1,900 MHz). Clocks don't add anything when they're already this high.
 
Joined
May 9, 2013
Messages
29 (0.01/day)
For more reference, here's a reasonably overclocked rig (1700X @ 4 GHz / 2800 MHz CL14 & 1080 Ti @ 2113 / 12200 MHz):

[four benchmark screenshots attached]
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
The RTX 2080 offers a similar performance improvement over the GTX 1080 at 2560x1440, where it delivers a performance improvement of 28% and 33%
Too bad it costs as much as a 1080 Ti.
 
Joined
Jan 11, 2005
Messages
1,491 (0.21/day)
Location
66 feet from the ground
System Name 2nd AMD puppy
Processor FX-8350 vishera
Motherboard Gigabyte GA-970A-UD3
Cooling Cooler Master Hyper TX2
Memory 16 Gb DDR3:8GB Kingston HyperX Beast + 8Gb G.Skill Sniper(by courtesy of tabascosauz &TPU)
Video Card(s) Sapphire RX 580 Nitro+;1450/2000 Mhz
Storage SSD :840 pro 128 Gb;Iridium pro 240Gb ; HDD 2xWD-1Tb
Display(s) Benq XL2730Z 144 Hz freesync
Case NZXT 820 PHANTOM
Audio Device(s) Audigy SE with Logitech Z-5500
Power Supply Riotoro Enigma G2 850W
Mouse Razer copperhead / Gamdias zeus (by courtesy of sneekypeet & TPU)
Keyboard MS Sidewinder x4
Software win10 64bit ltsc
Benchmark Scores irrelevant for me
I'm sure these new cards will be the best on the market, holding the crown, but prices are too high due to the lack of competition.

I hope Intel develops its GPU faster and AMD throws something onto the market, so previous-gen card prices drop; then I'll upgrade, but not sooner.

Even now, the cheapest used 1080 I can find is ~$400 and an RX 580 ~$300 (the ones used in mining rigs are slightly cheaper, but maybe worn out).
 
Joined
Aug 6, 2009
Messages
1,162 (0.21/day)
Location
Chicago, Illinois
Most salty comments are coming from 1080Ti owners. We get it. You need to justify your purchase.

I'm trying to make sense of that but it's impossible. Can you explain what the heck you were saying?
 
Joined
May 29, 2013
Messages
65 (0.02/day)
That's because SLI is basically dead.

https://uk.hardware.info/reviews/8113/crossfire-a-sli-anno-2018-is-it-still-worth-it

Game support keeps shrinking, DX12 doesn't help, and the performance gains are poor.

You basically get 5-15% more total FPS from doubling the GPUs (2x 1080 Ti) in most games, with a few outliers. 5-15% more FPS for 100% more money isn't worth it, in my view. To quote them:
"Furthermore, micro-stuttering remains an unresolved problem. It varies from game to game, but in some cases 60 fps with CrossFire or SLI feels a lot less smooth than the same frame rate on a single GPU. What's more, you're more likely to encounter artefacts and small visual bugs when you're using multiple GPUs as sometimes the distribution of the graphic workload doesn't seem to be entirely flawless."

So you get 5-15%, but a frame rate that feels less smooth. Better to spend the extra cash on the best single GPU in the lineup to get the best performance.

I'm pretty sure that for multi-GPU, Nvidia is moving to NVLink, which, if done properly, will enable better multi-GPU scaling that is transparent to games. NVLink is similar to multi-CPU motherboards with a NUMA memory arrangement: it will add the GPUs' memories together, and the processors will work in parallel on a single frame.
 
D

Deleted member 177333

Guest
I'm not impressed at all. I'll wait for the 2019 cards, especially since the 1080 Ti overclocks like a beast, as does the regular 1080, and they come very close to the stock scores of the 2080 and 2080 Ti.

Right there with ya. Absolutely terrible performance when considering the price bump per model. I had been looking forward to the upgrade, but at these price points / performance increases...absolutely not. I'd like to think they'll sell terribly so the pricing will adjust itself into a more reasonable range given the performance increase, but no way of knowing how they'll do. If they sell well, the pricing will obviously stay up in the ludicrous range and I'll just wait until the 3000 series.
 
Joined
Oct 15, 2010
Messages
951 (0.18/day)
System Name Little Boy / New Guy
Processor AMD Ryzen 9 5900X / Intel Core I5 10400F
Motherboard Asrock X470 Taichi Ultimate / Asus H410M Prime
Cooling ARCTIC Liquid Freezer II 280 A-RGB / ARCTIC Freezer 34 eSports DUO
Memory TeamGroup Zeus 2x16GB 3200Mhz CL16 / Teamgroup 1x16GB 3000Mhz CL18
Video Card(s) Asrock Phantom RX 6800 XT 16GB / Asus RTX 3060 Ti 8GB DUAL Mini V2
Storage Patriot Viper VPN100 Nvme 1TB / OCZ Vertex 4 256GB Sata / Ultrastar 2TB / IronWolf 4TB / WD Red 8TB
Display(s) Compumax MF32C 144Hz QHD / ViewSonic OMNI 27 144Hz QHD
Case Phanteks Eclipse P400A / Montech X3 Mesh
Power Supply Aresgame 850W 80+ Gold / Aerocool 850W Plus bronze
Mouse Gigabyte Force M7 Thor
Keyboard Gigabyte Aivia K8100
Software Windows 10 Pro 64 Bits
Lol. Yes... we are salty because we need to justify our purchase. You definitely got it.

https://www.forbes.com/sites/jasone...-results-reveal-pricing-problem/#7732da865124

"The GTX 1080 boasts an average 50% better performance than the GTX 980 at 1440p across 7 synthetic and in-game benchmarks. And it rises to the 4K challenge, kicking out an average 65% higher framerates across those same tests compared to the GTX 980. And its entry price is only 9% higher than its 900 series predecessor."

This
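The quoted Pascal numbers can be put into performance-per-dollar terms and set against the Turing figures discussed in this thread. The +40% Turing price step used below is an assumption for illustration, not a quoted figure; the 33% performance gain is the 1440p number cited earlier in the thread.

```python
# Generational value change: (1 + perf gain) / (1 + price gain) - 1.
def value_gain(perf_gain: float, price_gain: float) -> float:
    """Change in performance-per-dollar relative to the previous generation."""
    return (1 + perf_gain) / (1 + price_gain) - 1

print(value_gain(0.50, 0.09))  # Pascal per the quote: ~+38% perf per dollar
print(value_gain(0.33, 0.40))  # Turing, assumed +40% price: ~-5% perf per dollar
```

On these assumptions, Pascal improved value per dollar substantially while Turing slightly regresses it, which is the point the Forbes quote is making.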
 
Joined
Jan 2, 2008
Messages
3,296 (0.53/day)
System Name Thakk
Processor i7 6700k @ 4.5Ghz
Motherboard Gigabyte G1 Z170N ITX
Cooling H55 AIO
Memory 32GB DDR4 3100 c16
Video Card(s) Zotac RTX3080 Trinity
Storage Corsair Force GT 120GB SSD / Intel 250GB SSD / Samsung Pro 512 SSD / 3TB Seagate SV32
Display(s) Acer Predator X34 100hz IPS Gsync / HTC Vive
Case QBX
Audio Device(s) Realtek ALC1150 > Creative Gigaworks T40 > AKG Q701
Power Supply Corsair SF600
Mouse Logitech G900
Keyboard Ducky Shine TKL MX Blue + Vortex PBT Doubleshots
Software Windows 10 64bit
Benchmark Scores http://www.3dmark.com/fs/12108888
Neither. I'll sit on my 980Ti. Still gives me great performance for my gaming needs. 3 years out of it so far and it looks like I'll easily get another 2-3 years out of my card (as long as it doesn't die on me). If I sorely feel the need to upgrade, I'll just steal back the second 980Ti I have in my HTPC/Plex/Gaming computer and run SLI again.....curious how well that would be a few years from now....
If you're running at 1080p, that is. Some people have already moved past 1080p, just like we moved past 720p in 2008.
 
Joined
Apr 30, 2011
Messages
2,702 (0.55/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Well, I can understand you're used to the huge strides going from GCN 1.0 to 1.1 to 1.2 and so on, but Nvidia can't possibly keep up with that. Though rumor has it these chips were actually meant for 10 nm, but there's no 10 nm capacity available at the moment.

I'm also curious how you derive performance from specs without knowing the frequency these chips run at, since that alone can swing things by ±25%.

And here we are with the review results that came out a few hours ago. Comparing custom 1080s and custom 1080 Tis against the factory-overclocked FE 2080s and 2080 Tis, we see less than a 25% difference. I predicted that sort of difference simply because the bigger dies can't clock enough higher than Pascal to gain much performance from clocks alone, and GDDR6 was obligatory so that bandwidth wouldn't keep the higher core count from delivering the intended performance. So, with those core counts, that increase was almost certain; the efficiency and IPC gains are close to 0%, after all.

So, a much higher price for much worse value for money can't become a market success if customers are rational, eh?
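The core-count reasoning above can be written as a back-of-envelope model. This is a naive sketch under the flat-IPC, similar-clock assumptions stated in the post, not a real performance estimate; the assumed boost clock is illustrative.

```python
# Naive scaling model: performance ~ core_count * clock when IPC is flat.
def naive_perf_ratio(cores_new: int, clock_new: float,
                     cores_old: int, clock_old: float) -> float:
    """Predicted performance ratio under linear core/clock scaling."""
    return (cores_new * clock_new) / (cores_old * clock_old)

# 2080 Ti (4352 CUDA cores) vs 1080 Ti (3584 CUDA cores), with similar
# assumed boost clocks, lands at ~1.21x on this model -- under 25%.
print(naive_perf_ratio(4352, 1.6, 3584, 1.6))
```

The naive model ignores memory bandwidth and architectural changes, but it shows why a sub-25% average gain was a plausible prediction from the specs alone.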
 

bug

Joined
May 22, 2015
Messages
13,749 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
And here we are with the review results that came out a few hours ago. Comparing custom 1080s and custom 1080 Tis against the factory-overclocked FE 2080s and 2080 Tis, we see less than a 25% difference. I predicted that sort of difference simply because the bigger dies can't clock enough higher than Pascal to gain much performance from clocks alone, and GDDR6 was obligatory so that bandwidth wouldn't keep the higher core count from delivering the intended performance. So, with those core counts, that increase was almost certain; the efficiency and IPC gains are close to 0%, after all.

So, a much higher price for much worse value for money can't become a market success if customers are rational, eh?
You are still so wrong...

For a chip to be 25% faster than another on average, it must be well over 25% faster when fully loaded, simply because it can't be any faster while neither chip is fully loaded. I'm just stating the obvious here; if you were truly interested in Turing (as opposed to the "I know without reading" attitude), you would have read AnandTech's in-depth analysis.

And if the above seems overly complicated to you, think about cars: can you shorten the travel time between towns A and B by 25% using a car that's only 25% faster than the reference?
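The "can't be faster when neither chip is loaded" point is essentially Amdahl's law applied to frame time. A minimal sketch, where the 80% GPU-bound fraction and the ~1.33x peak speedup are assumed numbers for illustration, not measurements:

```python
# Amdahl-style average speedup: only the GPU-bound fraction p of frame time
# benefits from the faster chip's peak speedup; the rest is unchanged.
def average_speedup(peak: float, p: float) -> float:
    """Overall speedup when a fraction p of the work speeds up by `peak`."""
    return 1 / ((1 - p) + p / peak)

# If 80% of frame time is GPU-bound, a ~1.33x peak gives ~1.25x on average:
print(average_speedup(1.333, 0.8))
```

The same arithmetic answers the car question: travel time scales as distance over speed, so a car that is 25% faster cuts the trip by only 1 - 1/1.25 = 20%, not 25%.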
 
Joined
Apr 30, 2011
Messages
2,702 (0.55/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
You are still so wrong...

For a chip to be 25% faster than another on average, it must be well over 25% faster when fully loaded, simply because it can't be any faster while neither chip is fully loaded. I'm just stating the obvious here; if you were truly interested in Turing (as opposed to the "I know without reading" attitude), you would have read AnandTech's in-depth analysis.

And if the above seems overly complicated to you, think about cars: can you shorten the travel time between towns A and B by 25% using a car that's only 25% faster than the reference?
Only results matter for performance when making a buy-or-not decision, and that's the main way to compare hardware. Only someone who uses just a program or two that gain more than the average needs to check those specifically. And "on average" means some games see smaller than 25% gains (mainly DX11 titles) and others bigger ones (mainly DX12 or Vulkan titles, now that hardware async compute is there). Conclusion: the value for money of Turing GPUs at today's prices is awful, and your arguments only make it worse for Nvidia. It's simply a new generation on almost the same manufacturing process, which can't give more without a bigger die. Simple as that. Nobody needs to spend that much now, apart from those indifferent to cost who just want the fastest PC hardware possible at any time. They will lose a lot of money, though, as the Turing GPUs will, IMHO, see big price cuts in a few (2-3) months, and even then RTX features might still be missing.
 