
Battlefield V with GeForce RTX DirectX Raytracing

Joined
May 2, 2017
Messages
7,762 (2.83/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
@W1zzard I asked @btarunr this in the original news post thread, but it doesn't seem to have made it through the noise, so I'll try again:

Can you please do some power/clock-speed monitoring while running RTX loads? Given that all RTX cards use 100% of their power targets doing rasterized rendering, I'm wondering just how much of this performance drop is due to power throttling of the CUDA cores to let the RT cores work, and, as a consequence, how much power the RT cores require for their current performance levels. This would be an invaluable insight, and far more interesting than pure performance numbers.

You wouldn't even have to do this across cards - just do a simple A/B comparison with RTX on and off on a single card at a single detail level.
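For anyone who wants to try the A/B comparison themselves in the meantime, here's a rough sketch of the kind of logging I mean, assuming a recent NVIDIA driver with nvidia-smi on the PATH (the query fields, one-second interval, and file names are just illustrative):

    import csv, subprocess, time

    # Poll power draw (W), SM clock (MHz) and GPU utilization (%) once per second
    # while the game runs, and dump the samples to a CSV for later comparison.
    QUERY = "power.draw,clocks.sm,utilization.gpu"

    def log_gpu(outfile, seconds=120):
        with open(outfile, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["power_w", "sm_clock_mhz", "gpu_util_pct"])
            for _ in range(seconds):
                sample = subprocess.run(
                    ["nvidia-smi", "--query-gpu=" + QUERY,
                     "--format=csv,noheader,nounits"],
                    capture_output=True, text=True, check=True,
                ).stdout.strip()
                writer.writerow([value.strip() for value in sample.split(",")])
                time.sleep(1)

    if __name__ == "__main__":
        log_gpu("dxr_on.csv")  # run again as "dxr_off.csv" for the second pass

Run it once during a DXR-on pass and once during a DXR-off pass at the same settings, then compare average power draw and SM clocks between the two logs.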
 
Joined
Mar 31, 2009
Messages
14 (0.00/day)
Processor Intel Coffee Lake, Core i9 9900K 3.6GHz box
Motherboard ASUS PRIME Z390-A
Cooling Noctua NH-D15
Memory 2x Corsair Vengeance LPX Black 32GB DDR4 3000MHz CL16 Dual Channel Kit
Video Card(s) GIGABYTE GeForce RTX 2080 Windforce OC 8GB GDDR6 256-bit
Storage SSD Samsung 970 EVO 500GB PCI Express 3.0 x4 M.2 2280
Display(s) DELL Gaming S2716DG 27 inch 2K 1ms Black G-Sync 144Hz
Case NZXT H440 Matte Black Green New Edition
Audio Device(s) Creative Sound Blaster AE-9
Power Supply Corsair RMi Series RM1000i 1000W, 80 PLUS Gold
Mouse Logitech G604 Lightspeed Wireless
Keyboard Corsair K70 RGB MK.2 Rapidfire Cherry MX Speed
VR HMD Oculus Quest 2
Just upgraded to 1809 and bought BFV specifically to test out this new feature (RTX). It's impressive, as it adds a lot to scene realism. I game at 1080p with an RTX 2080, so I was able to set RTX to Ultra. I didn't see any noticeable framerate decrease.

I like how some people agree and some disagree with this new tech; each side has its arguments, but those who disagree especially put a lot of frustration into describing their disagreement.

I can't say it's worth the money, since it's only some eye candy added on top, and, as someone else mentioned, in a fast-paced shooter you wouldn't be able to tell the difference.

All in all, I disagree with the price and how it was introduced, but I'm also for the evolution of technology. Is it going in the right direction? Only time - and passionate engineers and scientists - will tell. Personally, I like it and I hope it will be used in more titles, where it could really make a difference.
 
Joined
Nov 20, 2012
Messages
422 (0.10/day)
Location
Hungary
System Name masina
Processor AMD Ryzen 5 3600
Motherboard ASUS TUF B550M
Cooling Scythe Kabuto 3 + Arctic BioniX P120 fan
Memory 16GB (2x8) DDR4-3200 CL16 Crucial Ballistix
Video Card(s) Radeon Pro WX 2100 2GB
Storage 500GB Crucial MX500, 640GB WD Black
Display(s) AOC C24G1
Case SilentiumPC AT6V
Power Supply Seasonic Focus GX 650W
Mouse Logitech G203
Keyboard Cooler Master MasterKeys L PBT
Software Win 10 Pro
Just upgraded to 1809 and bought BFV specifically to test out this new feature (RTX). It's impressive, as it adds a lot to scene realism. I game at 1080p with an RTX 2080, so I was able to set RTX to Ultra. I didn't see any noticeable framerate decrease.

Unless you are on a 30Hz panel with V-Sync on, I highly doubt that...
 
Joined
May 2, 2017
Messages
7,762 (2.83/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Unless you are on a 30Hz panel with V-Sync on, I highly doubt that...
Yeah, dropping from 138 to 58 fps ought to be noticeable no matter your display. Even on a 60Hz panel you'd notice a clear drop in fluidity and smoothness. I should know: I play Rocket League on a 60Hz display, and locked to 120 fps it's far smoother than at 60 fps, even though half of those frames are discarded.
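The rough numbers behind that, as an idealized sketch that ignores V-Sync buffering and scanout and only asks how stale the newest finished frame can be when a 60Hz panel refreshes:

    # On a 60Hz panel, each refresh shows the most recently finished frame.
    # Rendering faster makes that frame newer, even if half the frames are never shown.
    def worst_case_frame_age_ms(render_fps):
        return 1000.0 / render_fps

    for fps in (60, 120, 138):
        print(f"{fps:>3} fps -> displayed frame is at most {worst_case_frame_age_ms(fps):.1f} ms old")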
 
Joined
Nov 13, 2007
Messages
10,683 (1.72/day)
Location
Austin Texas
System Name Planet Espresso
Processor 13700KF @ 5.5GHZ 1.285v - 235W cap
Motherboard MSI 690-I PRO
Cooling Thermalright Phantom Spirit EVO
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
New tech is always great. Personally, I'm glad I held off this round, but I'll definitely enjoy it once they get the kinks out.

I get that it's a technological marvel and whatnot, but from a gaming point of view, and after watching a boatload of "on vs off" videos - it's just another eye candy element - and a rather subtle one at that.

If I owned one of these cards and I was playing this game at 1440p, I would leave it in the 'off' position.
 
Joined
Apr 30, 2012
Messages
3,881 (0.85/day)
@W1zzard I asked @btarunr this in the original news post thread, but it doesn't seem to have made it through the noise, so I'll try again:

Can you please do some power/clock-speed monitoring while running RTX loads? Given that all RTX cards use 100% of their power targets doing rasterized rendering, I'm wondering just how much of this performance drop is due to power throttling of the CUDA cores to let the RT cores work, and, as a consequence, how much power the RT cores require for their current performance levels. This would be an invaluable insight, and far more interesting than pure performance numbers.

You wouldn't even have to do this across cards - just do a simple A/B comparison with RTX on and off on a single card at a single detail level.

I was interested in knowing that too. So far this is the closest answer:

TechSpot said:
It is interesting to note across these tests that we are being RT core limited here. The higher the resolution, the higher the performance hit using the Ultra DXR mode, to the point where playing at 4K is more than 4x faster with DXR off. This also plays out when we spot checked power consumption: the cards were running at consistently lower power with DXR on, because the regular CUDA cores are being underutilized at such a low framerate.
 
Joined
May 9, 2012
Messages
8,508 (1.86/day)
Location
Ovronnaz, Wallis, Switzerland
System Name main/SFFHTPCARGH!(tm)/Xiaomi Mi TV Stick/Samsung Galaxy S23/Ally
Processor Ryzen 7 5800X3D/i7-3770/S905X/Snapdragon 8 Gen 2/Ryzen Z1 Extreme
Motherboard MSI MAG B550 Tomahawk/HP SFF Q77 Express/uh?/uh?/Asus
Cooling Enermax ETS-T50 Axe aRGB /basic HP HSF /errr.../oh! liqui..wait, no:sizable vapor chamber/a nice one
Memory 64gb Corsair Vengeance Pro 3600mhz DDR4/8gb DDR3 1600/2gb LPDDR3/8gb LPDDR5x 4200/16gb LPDDR5
Video Card(s) Hellhound Spectral White RX 7900 XTX 24gb/GT 730/Mali 450MP5/Adreno 740/RDNA3 768 core
Storage 250gb870EVO/500gb860EVO/2tbSandisk/NVMe2tb+1tb/4tbextreme V2/1TB Arion/500gb/8gb/256gb/2tb SN770M
Display(s) X58222 32" 2880x1620/32"FHDTV/273E3LHSB 27" 1920x1080/6.67"/AMOLED 2X panel FHD+120hz/FHD 120hz
Case Cougar Panzer Max/Elite 8300 SFF/None/back/back-front Gorilla Glass Victus 2+ UAG Monarch Carbon
Audio Device(s) Logi Z333/SB Audigy RX/HDMI/HDMI/Dolby Atmos/KZ x HBB PR2/Moondrop Chu II + TRN BT20S
Power Supply Chieftec Proton BDF-1000C /HP 240w/12v 1.5A/4Smart Voltplug PD 30W/Asus USB-C 65W
Mouse Speedlink Sovos Vertical-Asus ROG Spatha-Logi Ergo M575/Xiaomi XMRM-006/touch/touch
Keyboard Endorfy Thock 75% <3/none/touch/virtual
VR HMD Medion Erazer
Software Win10 64/Win8.1 64/Android TV 8.1/Android 13/Win11 64
Benchmark Scores bench...mark? i do leave mark on bench sometime, to remember which one is the most comfortable. :o
If I was playing this game at 1440p I would leave it in the 'off' position.
So do I, but... not with an RTX 20XX... rather with a Vega 64, given how it's dropping toward what I actually paid for my 1070 (around $520 at the time), how a 2070 is also overpriced for no reason ($650-790 for me), and how little difference there is in performance (RTX off, of course) while the price difference is around $100-150...



Ah, a month and a half to wait... harsh end of year for me :laugh:
 
Joined
Nov 13, 2007
Messages
10,683 (1.72/day)
Location
Austin Texas
System Name Planet Espresso
Processor 13700KF @ 5.5GHZ 1.285v - 235W cap
Motherboard MSI 690-I PRO
Cooling Thermalright Phantom Spirit EVO
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
The sad thing is the amount of technology this takes, yet humans generally overlook shadows and lighting - we don't wander around the real world paying attention to light interactions and shadows; if anything, we actively ignore them so we can focus on what we're doing.

When a game is below the FPS level that I want, shadow detail and occlusion effects are the first things that get turned down. I feel like, for most people, they just don't make a huge difference - as long as they're there in semi-realistic detail, we pay about as much attention to them in game as we do in the real world: as little as humanly possible.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.37/day)
I don't get it. Why is performance taking a hit - any hit whatsoever? Either those RT cores are making all the other cores sit idle two thirds of the time, and we need 3x more of them to make up for the lack of RT power, or they are only doing part of the RTRT job and offloading the rest to the CUDA cores. This is not good.
Turns out that DLSS at 4K is actually 1440p with upscaling, and they call that a 35% performance increase when it is actually a drop, because going from 4K to 1440p should yield a 100% performance increase. It is not doing DLSS for free by offloading all the work to the tensor cores; it is cheating. Anyway, I'm using a 2070 instead of a 1080 with RTX off and DLSS off for the time being, lol.
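For what it's worth, the pixel math behind that complaint, assuming the usual 3840x2160 and 2560x1440 resolutions:

    # Native 4K pushes 2.25x the pixels of the 1440p internal resolution DLSS renders at,
    # which is why a ~35% uplift looks modest next to simply dropping to 1440p.
    native_4k = 3840 * 2160        # 8,294,400 px
    internal_1440p = 2560 * 1440   # 3,686,400 px
    print(f"4K / 1440p pixel ratio: {native_4k / internal_1440p:.2f}x")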
 
Joined
Dec 18, 2005
Messages
8,253 (1.20/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transend sata.. 1T 970 evo nvme m 2..
Display(s) 27" Asus PG279Q ROG Swift 165Hrz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestike 29500.. timepsy 14000..
I just bought an "upgrade" card.. it just happened to have ray tracing.. a single 2080 Ti.. coming from a pair of 1070 cards in SLI, I only had a couple of options: a single 2080 Ti or a pair of 1080 Ti cards in SLI..

No way on this planet did I buy anything because it had ray tracing abilities.. I don't know how I fit into the general scheme of things as regards ray tracing, but for me any upgrade option would have cost a lot, ray tracing or not..

My next GPU upgrade will also cost a lot.. it will be another 2080 Ti to match the one I already have..

trog
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.37/day)
No trog, you don't. When 7nm hits the market, you sell the 2080 Ti, because it will be much slower than a next-gen ~~70 card - just like the 1070 versus the 980 Ti, and the 970 beating the first Kepler TITAN.

A single RTX 3070 will make the RTX TITAN 4608-core 12GB edition look like a toy - and then you buy a single 3080 Ti card.
 
Joined
Apr 12, 2017
Messages
147 (0.05/day)
System Name Dell Dimension P120
Processor Intel Pentium 120 MHz 60Mhz FSB
Motherboard Dell Pentium
Memory 24 MB EDO
Video Card(s) Matrox Millennium 2MB
Storage 1 GB EIDE HDD
Display(s) Dell 15 inch crt
Case Dell Dimension
Audio Device(s) Sound Blaster
Mouse Microsoft mouse, no scroll wheel
Keyboard Dell 1995
Software Windows 95 + Office 95
If I'd paid $1,300 for a GPU, I'd expect flawless RTX gaming at 4K, not a slideshow. Maybe the RTX 4080 Ti can deliver.
By then you will have to pay $4,445 for the RTX 4080 Ti, assuming the MSRP keeps increasing by roughly 85% per generation, as it did when the RTX 2080 Ti was released (rough compounding sketch below the list)...
GTX 1080 Ti MSRP: $699
RTX 2080 Ti MSRP: $1,299
RTX 3080 Ti MSRP (projected): $2,405
RTX 4080 Ti MSRP (projected): $4,445
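Here's the compounding, if you take that tongue-in-cheek +85% per generation at face value and start from the 2080 Ti's $1,299 (the rounding in the list above is slightly loose):

    # Tongue-in-cheek projection: compound the 2080 Ti's $1,299 MSRP by +85% per generation.
    msrp = 1299
    for card in ("RTX 3080 Ti", "RTX 4080 Ti"):
        msrp *= 1.85
        print(f"{card} (projected): ${msrp:,.0f}")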

I bet it will be sold out before it's even released.
 
Joined
Jul 19, 2016
Messages
481 (0.16/day)
The sad thing is the amount of technology this takes, yet humans generally overlook shadows and lighting - we don't wander around the real world paying attention to light interactions and shadows; if anything, we actively ignore them so we can focus on what we're doing.

When a game is below the FPS level that I want, shadow detail and occlusion effects are the first things that get turned down. I feel like, for most people, they just don't make a huge difference - as long as they're there in semi-realistic detail, we pay about as much attention to them in game as we do in the real world: as little as humanly possible.

That's it; this is why RT on these cards right now is, quite frankly, pathetic. It halves performance, all for the sum total of... better reflections. And here's the kicker: you won't notice them as you stomp around in multiplayer. Worse, you'll very likely turn it off!

Hence Nvidia trying to charge a premium for RT is pathetic too.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,978 (2.35/day)
Location
Louisiana
Processor Core i9-9900k
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax ETS-T50 Black CPU cooler
Memory 32GB (2x16) Mushkin Redline DDR-4 3200
Video Card(s) ASUS RTX 4070 Ti Super OC 16GB
Storage 1x 1TB MX500 (OS); 2x 6TB WD Black; 1x 2TB MX500; 1x 1TB BX500 SSD; 1x 6TB WD Blue storage (eSATA)
Display(s) Infievo 27" 165Hz @ 2560 x 1440
Case Fractal Design Define R4 Black -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic Focus GX-1000 Gold
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
No trog, you don't. When 7nm hits the market, you sell the 2080 Ti, because it will be much slower than a next-gen ~~70 card - just like the 1070 versus the 980 Ti, and the 970 beating the first Kepler TITAN.
Wow, still peddling your BS, huh? Your example of the 1070 beating the 980Ti was a one-off, a singular event.

Two, you don't ever advise someone to keep buying down from whatever tier they are at. What happens is that in 2026 they end up with an RTX 5030 entry-level card that of course will decimate any of today's games, but can only play that year's new games as a slideshow.
 
Joined
Dec 31, 2009
Messages
19,371 (3.57/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Any word yet on the scene/method TPU tested? I feel more annoying than usual asking for a third time... but how many days should one wait for a simple answer? Why does this feel like it is some kind of secret?
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,978 (2.35/day)
Location
Louisiana
Processor Core i9-9900k
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax ETS-T50 Black CPU cooler
Memory 32GB (2x16) Mushkin Redline DDR-4 3200
Video Card(s) ASUS RTX 4070 Ti Super OC 16GB
Storage 1x 1TB MX500 (OS); 2x 6TB WD Black; 1x 2TB MX500; 1x 1TB BX500 SSD; 1x 6TB WD Blue storage (eSATA)
Display(s) Infievo 27" 165Hz @ 2560 x 1440
Case Fractal Design Define R4 Black -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic Focus GX-1000 Gold
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
Any word yet on the scene/method TPU tested? I feel more annoying than usual asking for a third time... but how many days should one wait for a simple answer? Why does this feel like it is some kind of secret?
That’s actually a good request. People can compare their own results, as well as account for differences or similarities with other review sites.
 
Joined
Feb 6, 2017
Messages
9 (0.00/day)
System Name Workhorse
Processor Ryzen 1800X
Motherboard Asus Crosshair VI Hero
Cooling Noctua NH-D15
Memory 32GB
Video Card(s) Gainward GTX1080
Storage 3xSSDs, 2xHDD
Display(s) PG279Q
Case NZXT H2
Audio Device(s) Powercolor Devil HDX
Power Supply Dark Power Pro 10
Mouse G700
Keyboard Logitech Illuminated Kb
we weren't aware DICE added another setting called "DXR reflections quality," which by default was set to "Ultra" (other settings include Low, Medium, and High).

Do I get it right that the settings below Ultra were not tested?
So maybe "DXR Reflections Quality: Medium" looks OK, with a 60% smaller performance penalty?
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.37/day)
Lol, the regular CUDA cores are underutilized and power consumption falls with RT on because not enough RT cores are present in the chip. They need 4 RT cores per SM of 64 CUDA cores, not just the 1 RT core per SM it currently has. Well, that was a bad move, Nvidia.
 
Joined
Dec 31, 2009
Messages
19,371 (3.57/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
That’s actually a good request. People can compare their own results, as well as account for differences or similarities with other review sites.
Right. This isn't a canned benchmark, so nearly everyone is doing it differently, I would imagine.

I used the @wiz mention twice... no response, though he posted in threads afterward. I know he is extremely busy, which is why I don't want to ping him again... yet I, and I'm sure many others, would still like an answer.

Part of the reason I am asking is that in SP mode there are three campaigns, and in my testing the first two (Under No Flag and Nordlys) aren't very hard on the card with RT enabled. In fact, with Ultra settings and RT enabled, I pulled over 60 FPS in them with an RTX 2070. The third campaign (Tirailleur), in the forest with water on the ground, KILLS the card (around 30 FPS). So I am wondering how he got those numbers. It LOOKS like it is an average of the three??? I don't know if...

A. My testing is off...
B. How this testing is done in the first place... all 3 scenes and an average? (rough illustration below)
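Purely as an illustration of option B: if the published number really is a straight average of the three campaigns, my rough 2070 figures from above (using 60, 60, and 30 FPS as stand-ins) would land around here:

    # Stand-in numbers from my own runs: two easy campaigns at ~60 FPS, Tirailleur at ~30.
    scenes = {"Under No Flag": 60, "Nordlys": 60, "Tirailleur": 30}
    average = sum(scenes.values()) / len(scenes)
    print(f"straight average across campaigns: {average:.0f} FPS")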


The other thing is, I can't even play the god damn game now. I swapped GPUs and it doesn't work: I double-click the icon, bfv.exe starts in Task Manager, gets to around 216MB, and quits. I was on chat with EA for over an hour yesterday and they escalated the issue. I can't even friggin play the game now........ I recall W1z mentioning something about swapping parts and limits, but I don't have a message or anything, and one would think ONE of the three EA reps I chatted with would have picked that out, as I intentionally mentioned it in each chat so they were aware.

EDIT: Put the 2070 back in and it works... WTF?!!!

we weren't aware DICE added another setting called "DXR reflections quality," which by default was set to "Ultra" (other settings include Low, Medium, and High).

Do I get it right that the settings below Ultra were not tested?
So maybe "DXR Reflections Quality: Medium" looks OK, with a 60% smaller performance penalty?
If you look at the results, you will see Low/Medium/High. Pretty sure that is the RT setting being tested there.

I don't know; so much confuses me (maybe it's only me) about how this was actually tested here at TPU.......
 
Joined
Oct 15, 2018
Messages
43 (0.02/day)
Location
EU
Processor Ryzen 1700 @3.8
Motherboard Asus Crosshair 6 Hero
Cooling Corsair H100i v1
Memory 16GB G.Skill F4-3200C14-8GFX
Video Card(s) Asus R9 380 4GB
Storage Samsung 840EVO 250GB, Crucial MX500 500GB, 2xWD Black 2T
Display(s) Benq 24" 144Hz 1080p
Case Antec P280
Power Supply Corsair AXi 860
Mouse Logitech G402
Keyboard Logitech G110
Software Win10 Pro
Nice review! Cool tech, won't pay extra for it though.
On the game side of things, immersion in those screenshots is instantly ruined by the realistic reflections, which make all the models look bad.
 
Joined
Sep 17, 2014
Messages
22,313 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
In non-RTRT gaming you get that. Going from the 1080 to the 2080 I got an instant 40-50% boost in performance across my existing library of games. Not everyone is buying the Ti model. I spent much less than $1,000 for my 2080 and offset that cost with the sale of my 1080. Keep things in the proper perspective and you'll see the big picture.

RTRT is brand new and it will continue to advance and evolve. In the meantime, non-RTRT gaming is getting big boosts in performance.

Ok then, don't.

Yes they do..

Thanks for the not-too-subtle insult. What I am happy with is what I mentioned just above: the big boost the 2080 gives to all existing games. I'm also happy to be an early adopter for this run of GPUs, because I understand in reasonable detail how RTRT works and what it has to offer the future of gaming.

Your opinion. Not everyone agrees. The benchmarks in this very review do not support that statement.

Your 2080 "big boost" was already on the shelves for two years, they called it a 1080ti and it was cheaper. It seems your 'big picture' needs to get a bit bigger than it is.

You have it back to front.. the so-called high prices are all about shareholders and stock prices.. the company has no obligation to its customers, other than in the sense that it needs them as cash cows.. if they get it wrong and lose the cash cows, they have f-cked up.. time will tell on that one..

trog

Spot on! That is why many cash cows are now up in arms against RTX. See, you do get it; it just takes a while.
 
Joined
Jul 5, 2013
Messages
27,430 (6.62/day)
Your 2080 "big boost" was already on the shelves for two years, they called it a 1080ti and it was cheaper. It seems your 'big picture' needs to get a bit bigger than it is.
Maybe, but it didn't have RTRT. And that is something I'm excited about and looking forward to!
 
Joined
Jan 8, 2017
Messages
9,391 (3.29/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
And that is something I'm excited about and looking forward to!

I'm sure you are, you bought a 2080 after all. :rolleyes:

Your militant praise is staggering; after entire pages you're still here replying to everyone with the same standard response: that RTX is great and we aren't enlightened enough to realize this astonishing feat that Nvidia has brought to us.

Whatever floats your boat, but you're wasting your time doing that; literally no one believes you. Not that your words surprise me - few would have the boldness required to admit that the rather expensive product they bought isn't as stellar as they initially thought. Your determination to defend your purchase to the bitter end is admirable, though.
 