
Battlefield V with RTX Initial Tests: Performance Halved

Joined
Apr 19, 2018
Messages
1,227 (0.50/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
The negativity surrounding RTX is unbelievable. Technologically, this is probably the biggest step up we’ve had in the history of the GeForce cards.

When the Voodoo was released was there widespread support for Glide? No. When the GeForce 256 was released was there widespread support for hardware T&L? No. When the GeForce 3 came out was there widespread support for programmable shaders? No. Why do you expect any different from this?

So far we do not have any games using an engine built from the ground up for ray tracing support. Ray tracing has been patched onto rasterization engines which were never designed for ray tracing.

Software needs to catch up to the hardware. Look at any launch title on console and compare the graphics to something a few years down the line. The hardware didn’t change, but the software caught up.

A friend of mine gave me a brilliant analogy last night. When a baby takes its first steps, do the parents say “oh that’s crap, he’s so slow and unstable”, or are they blown away that their little one is making such good progress?

The same will be true of ray tracing. For the first time we are looking at in-game image generation differently, and progress can only go one way. Wait until we have engines written with ray tracing in mind from the get-go.

If you wanted all-wheel ABS in 1978, Mercedes offered it for around $32,000 BACK THEN. That was your only option. Now it’s found on almost every entry-level car regardless of price. Give ray tracing a bit of time to mature. There will never be a “right time” for the initial release, as without the initial release there will be no further progress.

Appreciate the technology for what it is and the revolutionary (as opposed to evolutionary) change it can bring.

So what if the RTX 2080 Ti gets 400 FPS instead of 180 FPS using traditional render methods? Both are beyond the level of perception of the human eye, so it becomes an arbitrary figure. What we need is a way to drastically increase image quality without the performance hit we would have had prior to the RTX cards, which is now a reality. Don’t judge the increase in quality by a badly patched rasterization engine.

We now have the computational power that until not long ago required a render farm, packaged into a single GPU with a price tag that high-end enthusiasts can afford. Show me one other graphics card that offers that.
Wow, you really like to write essays, don’t you! Or is it a copy-and-paste from the nGreedia NPC software update?

Maybe you should work on trying to comprehend for yourself why people don’t like this situation before you write another marketing rant?
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,291 (7.53/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Could the RT cores be memory limited? They seem to be, judging from the 1080p teaser image.

Don't know about that yet. The teaser graph contains data from RTX 2080 Ti, which already has the highest bandwidth in the market. After all this is over, we'll see how memory OC affects this.
 
Joined
Mar 10, 2015
Messages
3,984 (1.12/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
Forget the performance, I don't even like how it looks in many areas. Anyone with half a brain saw this coming, because it was reported that it killed FPS, so no surprise. RTX is still some time away, but at least that $1300 bought some 30% performance increase without the RTX that will have to be disabled, so it wasn't a complete waste of people's money.
 
Joined
Jul 18, 2016
Messages
519 (0.17/day)
System Name Gaming PC / I7 XEON
Processor I7 4790K @stock / XEON W3680 @ stock
Motherboard Asus Z97 MAXIMUS VII FORMULA / GIGABYTE X58 UD7
Cooling X61 Kraken / X61 Kraken
Memory 32GB Vengeance 2133 MHz / 24GB Corsair XMS3 1600 MHz
Video Card(s) Gainward GLH 1080 / MSI Gaming X Radeon RX480 8 GB
Storage Samsung EVO 850 500gb ,3 tb seagate, 2 samsung 1tb in raid 0 / Kingdian 240 gb, megaraid SAS 9341-8
Display(s) 2 BENQ 27" GL2706PQ / Dell UP2716D LCD Monitor 27 "
Case Corsair Graphite Series 780T / Corsair Obsidian 750 D
Audio Device(s) ON BOARD / ON BOARD
Power Supply Sapphire Pure 950w / Corsair RMI 750w
Mouse Steelseries Sesnsei / Steelseries Sensei raw
Keyboard Razer BlackWidow Chroma / Razer BlackWidow Chroma
Software Windows 10 64bit PRO / Windows 10 64bit PRO
"The more you buy, the more you save "......
 
Joined
Nov 29, 2016
Messages
671 (0.23/day)
System Name Unimatrix
Processor Intel i9-9900K @ 5.0GHz
Motherboard ASRock x390 Taichi Ultimate
Cooling Custom Loop
Memory 32GB GSkill TridentZ RGB DDR4 @ 3400MHz 14-14-14-32
Video Card(s) EVGA 2080 with Heatkiller Water Block
Storage 2x Samsung 960 Pro 512GB M.2 SSD in RAID 0, 1x WD Blue 1TB M.2 SSD
Display(s) Alienware 34" Ultrawide 3440x1440
Case CoolerMaster P500M Mesh
Power Supply Seasonic Prime Titanium 850W
Keyboard Corsair K75
Benchmark Scores Really Really High
I hate these "patches" that add technology to a game. If the game was not designed with this built in, the performance is not going to be good. Just look at all the games that have DX12 as a "patch" vs. games that were developed with DX12 natively.
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,049 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
First of all, to make this conclusion you have to compare the graphical quality of 1080p with RTX enabled and 4K without RTX enabled. As far as I know we don't have such a comparison, so you can't make such a conclusion. Don't get me wrong, maybe in some early games the difference won't be worth the hassle, maybe in others it totally will, but as time goes on, real-time ray tracing will for sure show superior results to rasterized rendering, since the technology is simply superior.

You say that it is not ready for prime time, but how do you make it ready for prime time if nobody owns such a product? Do you think game developers will develop technology for a phantom product?
The answer is obvious!
Well, “regular” BF5 performance numbers are readily available, so I’m not sure why you’re trying to split hairs. The 2080 Ti gets 83 FPS at 4K Ultra vs. the initial numbers of 65 FPS at 1080p Ultra, the only graphical difference being the addition of RT. How can I not form a conclusion from that? Do you really think the DICE dev team aren’t all using RTX products for dev work? This is as good as it can be in its current form with the hardware available.
Let’s be clear: I’m not hating on RT. I’m saying it’s not ready for the consumer space with current hardware. Like all new tech, it takes at least a generation before it becomes readily accessible.
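As a rough sanity check on those two figures (illustrative arithmetic only, not a benchmark, and assuming throughput scales with pixel count):

```python
# Compare the pixel throughput implied by the two reported numbers:
# 83 FPS at 4K Ultra (RT off) vs 65 FPS at 1080p Ultra (RT on).
raster_px_per_s = 3840 * 2160 * 83  # 4K Ultra, RT off
rtx_px_per_s = 1920 * 1080 * 65     # 1080p Ultra, RT on

ratio = rtx_px_per_s / raster_px_per_s
print(f"RT path sustains {ratio:.0%} of the rasterized pixel throughput")
# i.e. roughly a fifth, under this simple pixel-scaling assumption
```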
 
Maybe you should read my full post before replying, such as:
You too are judging an entire technology based on one game which was not designed for ray tracing.

Calling RTX anything less than revolutionary is an insult to every engineer who worked on it. Just because you don't understand it or we don't have the software available to take full advantage of it doesn't take anything away from what it is.
So we wait 3 years, or maybe 5, for a completely new game engine to be built from the ground up, then for the game itself to be created using this engine? Yeah, the nGreedia is strong with this one.

Let’s face it, nGreedia knew this was going to happen, and made sure that there was no RTX content for just long enough to get a few tens of thousands of very expensive cards out there to people with more money than sense. They must have been rolling around on their office floors pissing themselves with laughter while watching the sales figures count up minute by minute.

And to help your comprehension, nobody has ever said that RTX technology is a bad idea. It’s the way nGreedia have gone about introducing it that has caused all this hatred. You see, people are not all totally stupid, some of us know when we are being manipulated and lied to.
 

btarunr

Editor & Senior Moderator
Staff member
RTX 2080 and 2070 data is a bloodbath. They post even bigger % losses than the 2080 Ti. ETA on our review is ~1 hr.
 
Joined
Jan 23, 2012
Messages
374 (0.08/day)
Location
South Africa
Processor Pentium II 400 @ 516MHz
Motherboard AOpen AX6BC EZ
Cooling Stock
Memory 192MB PC-133
Video Card(s) 2x Voodoo 12MB in SLI, S3 Trio64V+
Storage Maxtor 40GB
Display(s) ViewSonic E90
Audio Device(s) Sound Blaster 16
Software Windows 98 SE
So we wait 3 years, or maybe 5, for a completely new game engine to be built from the ground up, then for the game itself to be created using this engine? Yeah, the nGreedia is strong with this one.

Let’s face it, nGreedia knew this was going to happen, and made sure that there was no RTX content for just long enough to get a few tens of thousands of very expensive cards out there to people with more money than sense.

Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.

And to help your comprehension, nobody has ever said that RTX technology is a bad idea. It’s the way nGreedia have gone about introducing it that has caused all this hatred. You see, people are not all totally stupid, some of us know when we are being manipulated and lied to.

So how exactly did NVIDIA introduce this that was so wrong in your opinion?
 
Joined
Feb 23, 2017
Messages
157 (0.05/day)
Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.



So how exactly did NVIDIA introduce this that was so wrong in your opinion?

"It just works": a blatant, HUANG-level marketing LIE.
 
Joined
May 6, 2012
Messages
184 (0.04/day)
Location
Estonia
System Name Steamy
Processor Ryzen 7 2700X
Motherboard Asrock AB350M-Pro4
Cooling Wraith Prism
Memory 2x8GB HX429C15PB3AK2/16
Video Card(s) R9 290X WC
Storage 960Evo 500GB nvme
Case Fractal Design Define Mini C
Power Supply Seasonic SS-660XP2
Software Windows 10 Pro
Benchmark Scores http://hwbot.org/user/kinski/ http://valid.x86.fr/qfxqhj https://goo.gl/uWkw7n
RTX 2080 and 2070 data is a bloodbath. They post even bigger % losses than the 2080 Ti. ETA on our review is ~1 hr.
That seems to suggest there's a memory bottleneck...
 
Joined
Oct 15, 2010
Messages
208 (0.04/day)
I love to say it... I told you so!

nGreedia... The way you're meant to be played! Fools.

Hope you 2080 Ti owners enjoy the 1080p 30 to 60 FPS gaming on your shiny 4K 144Hz monitors! And what about the experts here saying what value for money the 2070 is? Yeah, real value, right there! A 2070 at 15-35 FPS 1080p sounds great for a $600+ “value”.

Can't wait to hear how wrong I am! nGreedia will be sending the software update out to the green NPC army right about now, so that should make this fun.

Yep, we are still 1 to 2 generations away from a ray tracing implementation across the whole range of cards, with even the lowest-end cards doing 1080p@60 with RT enabled. AMD knows this; that's why they said they're going to take care of ray tracing only when all their cards can support it.

And this is not everything, since you can increase or decrease the number of rays that are to be processed. So even later on, with 5 times the performance we have now, add enough rays and performance will still be 30 FPS.

The question is, how many rays are enough? Perhaps an adaptive algorithm that increases and decreases the ray count to maintain a target frame rate would be in order? That way lower-end cards would use fewer rays, and higher-end cards more rays, for the same performance level. I would rather have something smart and done right like this than the shitshow we have now. 1400-euro cards doing 30 FPS at 1080p is pretty much outrageous. You could get an Xbox, a PS4, a Nintendo Switch, and maybe have leftovers for a laptop for that money.
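That adaptive idea could be sketched as a simple feedback loop. This is purely a hypothetical illustration: the function names, the ray-budget bounds, and the linear frame-cost model are all made up, not any real engine API.

```python
TARGET_FRAME_MS = 16.7  # ~60 FPS budget

def adjust_ray_count(rays_per_pixel, last_frame_ms, min_rays=1.0, max_rays=8.0):
    """Scale the ray budget by how far the last frame missed the target."""
    scale = TARGET_FRAME_MS / last_frame_ms  # >1 means there is headroom
    return max(min_rays, min(max_rays, rays_per_pixel * scale))

def simulate_frame_ms(rays_per_pixel, raster_cost_ms=8.0, ms_per_ray=4.0):
    """Toy cost model: fixed raster cost plus a linear per-ray cost."""
    return raster_cost_ms + ms_per_ray * rays_per_pixel

rays = 8.0  # start at the maximum budget, as a high-end card might
for _ in range(20):
    rays = adjust_ray_count(rays, simulate_frame_ms(rays))
# settles near 2.2 rays/pixel, where 8 ms + 4 ms/ray * 2.175 rays ≈ 16.7 ms
```

Under this toy model a slower card (higher ms_per_ray) converges to a smaller ray budget at the same frame rate, which is exactly the behaviour described above; a real engine would add smoothing and hysteresis to avoid visible flicker in image quality.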
 
Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.



So how exactly did NVIDIA introduce this that was so wrong in your opinion?
I stopped spoon-feeding my daughter when she was 2; tell me why I should be spoon-feeding you?

Stop making up arguments. I never said anything of the sort. You really are not good at this; you really need to connect to the nGreedia server and get your software updated... You said that we have to wait for game devs to write an engine that supports RTX from the ground up. I told you the timeframe that would happen in.

I guess you have AMD and Trump to bring up next, followed by a third course of white-male-privilege and homophobia baiting?
 
Joined
Jan 23, 2012
Messages
374 (0.08/day)
Location
South Africa
Processor Pentium II 400 @ 516MHz
Motherboard AOpen AX6BC EZ
Cooling Stock
Memory 192MB PC-133
Video Card(s) 2x Voodoo 12MB in SLI, S3 Trio64V+
Storage Maxtor 40GB
Display(s) ViewSonic E90
Audio Device(s) Sound Blaster 16
Software Windows 98 SE
I stopped spoon-feeding my daughter when she was 2; tell me why I should be spoon-feeding you?

Stop making up arguments. I never said anything of the sort. You really are not good at this; you really need to connect to the nGreedia server and get your software updated... You said that we have to wait for game devs to write an engine that supports RTX from the ground up. I told you the timeframe that would happen in.

I guess you have AMD and Trump to bring up next, followed by a third course of white-male-privilege and homophobia baiting?

and made sure that there was no RTX content for just long enough to get a few tens of thousands of very expensive cards out there to people with more money than sense.

So then what exactly are you saying here?
 
Joined
Sep 10, 2016
Messages
34 (0.01/day)
Yep, we are still 1 to 2 generations away from a ray tracing implementation across the whole range of cards, with even the lowest-end cards doing 1080p@60 with RT enabled. AMD knows this; that's why they said they're going to take care of ray tracing only when all their cards can support it.

And this is not everything, since you can increase or decrease the number of rays that are to be processed. So even later on, with 5 times the performance we have now, add enough rays and performance will still be 30 FPS.

The question is, how many rays are enough? Perhaps an adaptive algorithm that increases and decreases the ray count to maintain a target frame rate would be in order? That way lower-end cards would use fewer rays, and higher-end cards more rays, for the same performance level. I would rather have something smart and done right like this than the shitshow we have now. 1400-euro cards doing 30 FPS at 1080p is pretty much outrageous. You could get an Xbox, a PS4, a Nintendo Switch, and maybe have leftovers for a laptop for that money.
For me, 0 rays are enough; make separate cards for the RTX-only tensors.
 

pky

Joined
May 24, 2014
Messages
41 (0.01/day)
RTX 2080 and 2070 data is a bloodbath. They post even bigger %losses than 2080 Ti. ETA on our review is ~1 hr.
I was thinking... if you're using ray tracing, do the "regular" shadow/lighting/reflection settings still affect visual quality, and do they affect FPS? If they don't affect quality (considering RT takes care of that) but they do affect FPS, maybe lowering them or turning them off would boost FPS without quality loss?
 

INSTG8R

Vanguard Beta Tester
Are you honestly saying that devs should make a game for hardware that doesn't exist? Are you REALLY saying that? Please tell me that's not what you're saying.



So how exactly did NVIDIA introduce this that was so wrong in your opinion?
So then what exactly are you saying here?
Well, you’re the one trying to justify it in its current state despite the fact that it’s not the amazing miracle tech you continue to claim it is. What part of “it’s not ready” do you need explained to you again? The numbers don’t lie, and it’s only going to get worse with the 2080 and 2070. How are you going to try to spin those numbers favourably?
Bottom line: the hardware DOESN’T exist.
 
Joined
Feb 3, 2017
Messages
3,812 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos

AMX85

New Member
Joined
May 29, 2018
Messages
23 (0.01/day)
Looking at the all-settings performance, it looks like a GPU engine bug; it isn't using all cores or something. Drivers and the game will fix it, but I still expect a performance loss, maybe 20-30%.


greetings
 

GFalsella

New Member
Joined
Nov 14, 2018
Messages
4 (0.00/day)
Actually I'm more inclined to say this is Hairworks 2.0. Barely visible improvements at a major performance hit, but don't worry, you can make it super low quality to have decent performance. PhysX on GPU has almost always been pretty flawless and barely impacted FPS, if at all.
Are you stupid, legally blind, or just trolling? I get all the skepticism towards ray tracing (I think it's way too early for it too), but comparing a totally game-changing tech with Hairworks and saying it's barely visible is not a statement one can take seriously. This is more like 2D-to-3D than Hairworks, but OK, I guess hating is easier. (Don't get me wrong, it's right to hate on NVIDIA given their shitty business practices, but RTX != NVIDIA.)
 
Joined
Jun 15, 2016
Messages
1,042 (0.34/day)
Location
Pristina
System Name My PC
Processor 4670K@4.4GHz
Motherboard Gryphon Z87
Cooling CM 212
Memory 2x8GB+2x4GB @2400MHz
Video Card(s) XFX Radeon RX 580 GTS Black Edition 1425MHz OC+, 8GB
Storage Intel 530 SSD 480GB + Intel 510 SSD 120GB + 2x500GB hdd raid 1
Display(s) HP envy 32 1440p
Case CM Mastercase 5
Audio Device(s) Sbz ZXR
Power Supply Antec 620W
Mouse G502
Keyboard G910
Software Win 10 pro
How about performance @720p? It should be OK, I guess, and next-gen cards will do 1080p, and after that 3rd-gen cards will do 1440p, and then...
 
Joined
Nov 4, 2005
Messages
12,006 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Why didn't you buy two cards? Seriously, did you peons expect this to work with just one? If you can afford one $1300 card that means you can afford two, and what's money anyway when it comes to having the best of the best so your epeen is huge....
 
Joined
Mar 23, 2012
Messages
570 (0.12/day)
Processor Intel i9-9900KS @ 5.2 GHz
Motherboard Gigabyte Z390 Aorus Master
Cooling Corsair H150i Elite
Memory 32GB Viper Steel Series DDR4-4000
Video Card(s) RTX 3090 Founders Edition
Storage 2TB Sabrent Rocket NVMe + 2TB Intel 960p NVMe + 512GB Samsung 970 Evo NVMe + 4TB WD Black HDD
Display(s) 65" LG C9 OLED
Case Lian Li O11D-XL
Audio Device(s) Audeze Mobius headset, Logitech Z906 speakers
Power Supply Corsair AX1000 Titanium
I feel bad for anyone who got suckered by Nvidia's marketing and bought 20 series specifically for ray tracing.

Only one card in the 20 series - the 2080 Ti - has a real reason to be purchased, and that is specifically for maximum possible 4k rasterized gaming performance. All the 20 series cards are (and always were going to be) hot garbage for ray tracing.

If you want ray tracing, wait 1-3 years for 2nd and 3rd generation GPUs designed for it.
 