
AMD Radeon RX 6800 XT

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,569 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Is it just me or does the infinity cache(?) seem to help with a more consistent frame rate & a lot less spikes wrt 3080 ~
Yeah IQR values seem to be a bit lower in most games. Not sure if it's the L3 cache, but could be, because it reduces latency for many memory fetches
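To make that concrete, frame-time consistency is often summarized as the IQR (interquartile range) of frame times: a tight middle 50% means steady pacing, spikes widen it. A rough sketch in Python; the frame-time numbers are made up purely for illustration, not measured data:

```python
def iqr(samples):
    """Interquartile range: spread of the middle 50% of the samples."""
    s = sorted(samples)

    def quantile(p):
        # Linear interpolation between the two closest ranks
        idx = p * (len(s) - 1)
        lo, hi = int(idx), min(int(idx) + 1, len(s) - 1)
        frac = idx - lo
        return s[lo] * (1 - frac) + s[hi] * frac

    return quantile(0.75) - quantile(0.25)

# Hypothetical frame times in milliseconds (illustrative only)
consistent = [16.5, 16.7, 16.6, 16.8, 16.6, 16.7]   # tight spread -> low IQR
spiky      = [14.0, 22.0, 15.0, 25.0, 16.0, 21.0]   # spikes -> high IQR

print(f"consistent IQR: {iqr(consistent):.2f} ms")
print(f"spiky IQR:      {iqr(spiky):.2f} ms")
```

Both runs could have the same average fps; only the spread (and thus the perceived smoothness) differs, which is why IQR shows things an fps bar chart hides.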
 
Joined
Mar 20, 2019
Messages
556 (0.28/day)
Processor 9600k
Motherboard MSI Z390I Gaming EDGE AC
Cooling Scythe Mugen 5
Memory 32GB of G.Skill Ripjaws V 3600MHz CL16
Video Card(s) MSI 3080 Ventus OC
Storage 2x Intel 660p 1TB
Display(s) Acer CG437KP
Case Streacom BC1 mini
Audio Device(s) Topping MX3
Power Supply Corsair RM750
Mouse R.A.T. DWS
Keyboard HAVIT KB487L / AKKO 3098 / Logitech G19
VR HMD HTC Vive
Benchmark Scores What's a "benchmark"?
yeah no, if you want 4k at ultra, 10GB won't be enough. DLSS, much like PhysX, will probably only be available on a handful of titles. And it isn't slower, it's about the same while overclocking much more and using much less energy. I also have a Sony X900H 4K@120Hz to feed, and a 16GB, highly overclockable 6800 XT is a much better deal, now and even more so in the future.
Well, I have the 2080 Ti, and a 1080 Ti in my wife's computer. I never saw GPU memory usage above 8GB, and it doesn't often go above 5GB, aside from Horizon: Zero Dawn, which seems to reserve the whole GPU memory pool immediately after running the game. I also don't have the "ultra" fetish; I care more about the framerate, and it seems that the 3080's faster memory works better at high resolutions. Even if it's just 5%, slower is slower, and there is no point in spending money on an inferior product when the difference in price is equivalent to a good meal in a restaurant.
 
Joined
Jul 19, 2016
Messages
480 (0.16/day)
This thread has clearly been bombarded by trolls.
Clearly, DLSS is an Nvidia-exclusive trademark, they're comparing RT performance (20 games? hello...) and somehow ignoring their old pride: performance, power consumption, and heat.

oh yes. winter is coming.

Testament to how worried the Nvidia shills are about their cards...power efficiency of Ampere looks awful compared to RDNA2, how the tables have turned :)

Also, RT performance on the level of 2080 Ti is absolutely fantastic, even if RT for me is almost a total irrelevance because of how few games use it. Nice to see though.
 

mlambert890

New Member
Joined
Sep 29, 2020
Messages
3 (0.00/day)
Testament to how worried the Nvidia shills are about their cards...power efficiency of Ampere looks awful compared to RDNA2, how the tables have turned :)

Also, RT performance on the level of 2080 Ti is absolutely fantastic, even if RT for me is almost a total irrelevance because of how few games use it. Nice to see though.

Shills complaining about shills is ridiculous. Performance in the most important new technology on par with the competitor's *last gen* isn't impressive. Come on! And once you're arguing "fps/watt" on *enthusiast* forums, it's a lost battle. Nvidia is in no way "in trouble". At all. The 6800 is decent, and competitive, but objectively slower *where it matters*, is even *more* unbuyable, and is nearly as expensive. And getting the most out of it depends on proprietary tricks that require an unbuyable CPU and a 500-series chipset. Only a true blind fanboy can call this some kind of huge win.

This gen is a mess on *both* sides. Too expensive. Still not significant enough gains. Paper launches. "Team anything" = idiocy

most new games, based on consoles, will have very light RT effects, and will all be using the "AMD standard" as used on consoles. I wouldn't be worried

That's not how this works at all. There is no "AMD standard on consoles". The Xbox is using DirectX 12 DXR just like PCs. The PS5 uses a version of the Vulkan API they licensed. AMD supports both by running RT on the card's compute units. RTX is dedicated hardware which also supports DXR (obviously). Why people think this is "all the same because AMD" is a really fundamental misunderstanding of platform architecture, APIs, and hardware integration. PC will get *Xbox* ports, therefore DXR, therefore compatible with Nvidia with little to no effort. And given Nvidia is far stronger, where there *are* delays, I will bet you money that when the "Nvidia optimized" version follows, it will be noticeably superior. Because the RT hardware is better. And the effects come via the API. And tuning them isn't expensive. And lots of RT enthusiasts *already own Nvidia*.
 
Joined
Apr 1, 2015
Messages
25 (0.01/day)
Location
Athens Greece
System Name The Sentinel Reloaded
Processor AMD Ryzen 7 2700X
Motherboard Asus Rog Strix X470-F Gaming
Cooling CoolerMaster MasterLiquid Lite 240
Memory 24GB Patriot Viper (2X8) + HyperX Predator 16GB (2X4) DDR4-3000MHz
Video Card(s) Sapphire Radeon RX 570 4GB Nitro+
Storage WD Μ.2 Black NVME 500Gb, Seagate Barracuda 500GB (2.5"), Seagate Firecuda 2TB (2.5")
Display(s) LG 29UM59-P
Case Lian-Li PC-011 Dynamic Black
Audio Device(s) Onboard
Power Supply Super Flower Leadex II 80 Plus Gold 1000W Black
Mouse Logitech Marathon M705
Keyboard Logitech K330
Software Windows 10 Pro
Benchmark Scores Beyond this Galaxy!
Woe to You Oh Earth (NVidia) and Sea(Intel),

for the Devil (PowerColor Radeon RX Red Devil) sends the beast with wrath,

because he knows the time is short (Cyberpunk 2077 December Launch).

Let him who hath understanding (AMD Fanboys like me..) :oops:

reckon the number of the beast,

for it is a human number..

its number is 6900XT!!!!!!!!!! :rockout::rockout::rockout::clap::clap::peace:
 
Joined
Mar 22, 2019
Messages
461 (0.23/day)
Location
Western NY, USA
Processor AMD Ryzen 7 3700x
Motherboard Asus ROG Strix X470-F Gaming
Cooling Scythe Ninja 5
Memory G.Skill Ripjaws V 16GB (2 x 8GB) (F4-3600C16D-16GVKC) @ 3733 MHz 16-19-19-19-36-56
Video Card(s) MSI RTX 2060 Super Armor OC 8GB
Storage 1x Samsung 970 EVO 500GB / 3x Crucial MX500 / 4 HDDs
Display(s) Dell 23" LCD S2316M
Case Rosewill Rise Glow
Power Supply CORSAIR RM650
Mouse Cooler Master MS120
Keyboard Cooler Master MS120
Software Windows 10 Pro x64
Woe to You Oh Earth (NVidia) and Sea(Intel),

for the Devil (PowerColor Radeon RX Red Devil) sends the beast with wrath,

because he knows the time is short (Cyberpunk 2077 December Launch).

Let him who hath understanding (AMD Fanboys like me..) :oops:

reckon the number of the beast,

for it is a human number..

its number is 6900XT!!!!!!!!!! :rockout::rockout::rockout::clap::clap::peace:
LMAO !!!!
 
Joined
Jan 8, 2009
Messages
549 (0.10/day)
System Name AMD RyZen PC
Processor AMD RyZen 5950x
Motherboard ASUS Crosshair VIII Hero 570x WIFI
Cooling Custom Loop
Memory 64GB G.Skill Trident Z DDR4 3200 MHz 14C x4
Video Card(s) Evga 3080 TI
Storage Seagate 8TB + 3TB + 4TB + 2TB external + 512 Samsung 980
Display(s) LG 4K 144Hz 27GN950-B
Case Thermaltake CA-1F8-00M1WN-02 Core X71 Tempered Glass Edition Black
Audio Device(s) XI-FI 8.1
Power Supply EVGA 700W
Mouse Microsoft
Keyboard Microsoft
Software Windows 10 x64 Pro
My takeaway from it is this: is it really a problem? Whether you buy anything from a 3060 Ti to a 3090, or a 6800 to a 6900 XT, they're ALL pretty damn fast.

I won't be losing any sleep over $50 here, or 20% FPS difference there... Next year, then the year after there will be something faster again and I'll buy that.

If you think you're future-proofing at the moment by buying a 3080 - then you're probably mistaken as there are probably going to be some big leaps over the next few years and you can't be worrying about that. If you need an upgrade now, buy one. If you don't then do you just want to splash some cash - and if so, just buy whatever you want.

I am losing my sleep over ray tracing, not $50... if AMD had good RT performance, it would have been an easy choice. Now I have to wait till Dec 8th to see what the 6900 XT can do, and what the 3080 Ti can do in Jan. I have waited 1 year to replace my Vega 64; 2 more months I can surely wait instead of buying the wrong card and regretting it again..
 
Joined
Mar 18, 2015
Messages
2,960 (0.85/day)
Location
Long Island
1. Regarding the conclusions, I don't really see the 6800 XT ($650) as delivering a 100% generational improvement ... nor was it AMD's fastest card, a designation that better fit the Radeon VII.

2. AMD has achieved as close to parity here as we have seen in a long time .... Figuring that a system w/ the 6800 XT might cost say $1650 at the point the snipers are no longer dictating prices, that equates to a 3080 system costing $1700. Outta the box, that's a 3% increase at 1440p as compared to a 3% increase in price. That's pretty much a wash.

3. I don't quite understand the testing comparisons here; the article could use clarification. It's stated that the testing was done at the max power setting, but I wasn't sure if this was Rage Mode or some other setting. Rage Mode was identified by a blue color, but in overclocking, Max Power mode was purple. Both are shown in the fps overclocking graph, but do not appear in the temp, sound, or power consumption tables or performance graphs. "At first I was surprised that I saw no performance gains from overclocking, it seems the power limit is to blame here. Even when overclocked, the power limit will cap the maximum frequencies, so I've set it to the maximum for a second round of OC results." So, for me anyway, I was not sure which performance and other numbers were associated with which of the 3 modes. Perhaps that could be addressed in an update.

4. Comparing OC performance, the 6800 XT "outta the box" did poorly when overclocked, but in "Max Power" mode the 6800 XT did better %-wise than the reference 3080, though the 3080 FE still hit the highest OC in fps.

5. The shocker here is that the AMD card hit 78°C w/ load and OC (and I assume not Max Power), 2°C better than the 3080 FE .... and in an even bigger surprise, AMD hit 31 dBA versus 35 dBA .... This is a significant win for AMD. However, again, it might not be a fair comparison since I'm not sure what operational mode is represented in each graph.

6. Until we see some of the AIB cards reviewed, there's no way to really know which offers the best overclocked performance / power-sound-temp ratio.

7. At this point, if ya could purchase either at MSRP, sound arguments could be made for either card .... I always look at price as a secondary factor because any conclusion drawn based upon "value" is out the window when price adjustments are made. Nvidia would have been foolish to price the card below $700 when they had the only card available. But I wouldn't expect them to match AMD's MSRP until supply catches up w/ demand. I don't expect to be in a position to make a logical recommendation until after the holidays.

Still, most purchases will be decided by "brand" rather than "the numbers". Features or advantages that one side has will be deemed "don't matter" by the other, and vice versa. To my eyes, I prefer to play games using motion blur reduction (ULMB) over adaptive sync, and 17 of 23 games in the test suite can use ULMB at 120 Hz on Nvidia cards with G-Sync monitors. It also makes ray tracing easily obtainable at 60+ fps.

In short, H U G E kudos to AMD here. After not really having a horse in the race other than the 5600 XT (and lower price segments) last generation, they have achieved parity in the top "consumer gaming" segment, something they have not done since 2012. Hopefully they can continue up and down the line. Anxious to see the reviews of the AIB cards that our users will actually buy next year.
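The system-cost math in point 2 can be checked quickly. A sketch in Python; the system prices are the poster's own estimates, not quotes:

```python
# System-price estimates from point 2 above (poster's figures, not quotes)
amd_system = 1650   # system built around a 6800 XT
nv_system  = 1700   # same system with a 3080 instead

# Relative price premium for the 3080 build
price_delta_pct = (nv_system - amd_system) / amd_system * 100

# The review's ~3% average 1440p lead for the 3080, per the post
perf_delta_pct = 3.0

print(f"price delta: {price_delta_pct:.1f}%, perf delta: {perf_delta_pct:.1f}%")
```

With both deltas around 3%, the perf-per-dollar of the two whole systems is effectively a wash, which is the point being made.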
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,569 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
3. I don't quite understand the testing comparisons here; the article could use clarification. It's stated that the testing was done at the max power setting, but I wasn't sure if this was Rage Mode or some other setting. Rage Mode was identified by a blue color, but in overclocking, Max Power mode was purple. Both are shown in the fps overclocking graph, but do not appear in the temp, sound, or power consumption tables or performance graphs. "At first I was surprised that I saw no performance gains from overclocking, it seems the power limit is to blame here. Even when overclocked, the power limit will cap the maximum frequencies, so I've set it to the maximum for a second round of OC results." So, for me anyway, I was not sure which performance and other numbers were associated with which of the 3 modes. Perhaps that could be addressed in an update.
Only the purple bar in the OC gains chart was done at maximum power limit and manual OC, just like it says in the title.
In that same chart, Rage Mode was not used at all; the green bar is the manual OC with the power limit at stock (so the purple bar, but with the stock power limit).
The Rage Mode results in the rest of the review were with everything stock, just Rage Mode activated.

Under the hood, Rage Mode is a profile that's stored in the BIOS. It overrides 4 vendor-defined values: power limit, temperature target, RPM target, and acoustic limit. Note: no change in clocks or voltage
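One way to picture that profile is as a plain struct of limits. A rough sketch in Python; the field names and numbers below are invented for illustration and are not AMD's actual BIOS layout or measured values:

```python
from dataclasses import dataclass

@dataclass
class PowerProfile:
    """Illustrative model of a vendor-defined GPU power profile.

    Field names are invented for this sketch; they mirror the four
    values described above that Rage Mode overrides."""
    power_limit_w: int
    temp_target_c: int
    fan_rpm_target: int
    acoustic_limit_rpm: int

# Hypothetical numbers, purely for illustration
stock = PowerProfile(power_limit_w=255, temp_target_c=90,
                     fan_rpm_target=1500, acoustic_limit_rpm=2000)
rage  = PowerProfile(power_limit_w=280, temp_target_c=95,
                     fan_rpm_target=1800, acoustic_limit_rpm=2400)

# Note what is *not* here: no clock or voltage fields --
# Rage Mode relaxes limits, it does not touch the frequency/voltage curve.
print(stock)
print(rage)
```

That is why Rage Mode alone gains so little: the card still boosts along its stock clock/voltage curve, it just hits the limits later.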
 
Joined
Mar 23, 2005
Messages
4,079 (0.57/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 180Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
No, Nvidia is not in trouble; AMD is just where they should be, or should have been years ago. Thanks to every single Zen generation being successful, with future Zen designs lined up and ready to go, AMD has given its Radeon Technology Group new life with the success of RDNA2. RDNA3 is looking very good, and speculators are already talking about RDNA4. It seems AMD is following its Zen style of releases: complete design overhauls for each generation. :peace:
 
Joined
Feb 11, 2009
Messages
5,514 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
bit late to the party, but anyone have some info about possible RT performance improvements in the future?
See, my thinking is that games today have been made using Nvidia's RT implementation because... well, it's the only option...

And now AMD is out with their take, and performance is a bit lackluster. Now this might be due to just the hardware, but could it be that in the future, when devs build games with RT based on AMD's take on it (also for the consoles) rather than Nvidia's, performance in future games will be better?
 
Joined
Nov 4, 2005
Messages
11,913 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
bit late to the party, but anyone have some info about possible RT performance improvements in the future?
See, my thinking is that games today have been made using Nvidia's RT implementation because... well, it's the only option...

And now AMD is out with their take, and performance is a bit lackluster. Now this might be due to just the hardware, but could it be that in the future, when devs build games with RT based on AMD's take on it (also for the consoles) rather than Nvidia's, performance in future games will be better?

Much like tessellation, I'm sure we will see ray tracing become adjustable, and future hardware implementations become faster with less overhead; more shared resources between geometry and ray tracing will make it almost penalty-free as game engines and hardware both improve.

But for a few years, either implementation has a performance cost, and neither is truly pure ray tracing.
 
Joined
Jul 5, 2013
Messages
26,682 (6.50/day)
bit late to the party, but anyone have some info about possible RT performance improvements in the future?
This is AMD's first go at RTRT, and while it's respectable, they need to and will improve. Despite the naysaying and whining by some, RTRT is the future of lighting FX, and AMD will improve as they refine.
 
Joined
Sep 17, 2014
Messages
22,009 (6.01/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
This is AMD's first go at RTRT, and while it's respectable, they need to and will improve. Despite the naysaying and whining by some, RTRT is the future of lighting FX, and AMD will improve as they refine.

If you ask me, they should be improving on the software side, not the hardware side. They balanced their raster and RT performance per shader quite right for this moment in time; if they can just scale it up in future generations along with shader count/die size, there will be sufficient RT perf on tap. I mean, how much are we prepared to lose over those stupid rays? There is a point of diminishing returns, and it's not like Nvidia is winning the efficiency crown with their larger reserved die space for RT/Tensor as it is. Part of that is node, but not all of it. Nvidia has a larger die, but is still less efficient (520 vs 620 mm², 6800 XT vs 3080).

In the end it's really a balancing act: how much raster perf will you sacrifice per shader to enable ray calculations? It's one or the other; the ideal situation would be a new type of shader that could happily switch between operations. A step closer to a CPU...
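The die-size argument above can be framed as performance per mm² and per watt. A quick sketch in Python; the die sizes are the ones quoted in the post, but the relative-performance and board-power numbers are placeholders for illustration, not review data:

```python
# Die sizes (mm²) from the post above; perf and power figures are
# placeholders, not measurements -- the point is the metric, not the values.
cards = {
    "6800 XT": {"die_mm2": 520, "rel_perf": 100.0, "board_power_w": 300},
    "3080":    {"die_mm2": 620, "rel_perf": 103.0, "board_power_w": 320},
}

for name, c in cards.items():
    perf_per_mm2  = c["rel_perf"] / c["die_mm2"]
    perf_per_watt = c["rel_perf"] / c["board_power_w"]
    print(f"{name}: {perf_per_mm2:.3f} perf/mm², {perf_per_watt:.3f} perf/W")
```

Under these assumed numbers, a small absolute fps lead does not offset a ~20% larger die, which is the efficiency point being argued; swap in real review figures and the same two ratios tell you who is spending silicon and watts more effectively.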
 
Joined
Jul 5, 2013
Messages
26,682 (6.50/day)
In the end it's really a balancing act: how much raster perf will you sacrifice per shader to enable ray calculations?
IMHO, 100%. RT is far better than raster lighting where quality and realism are concerned. I would personally love to see non-RT lighting disappear.
 
Joined
Sep 17, 2014
Messages
22,009 (6.01/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
IMHO, 100%. RT is far better than raster lighting where quality and realism are concerned. I would personally love to see non-RT lighting disappear.

But that's just lighting, and it's not exactly expensive to do that with raster. So you're going to trade something somewhere for high-cost raycasting, and may end up with great lighting over shitty environments, low draw distance, heavy LOD, etc.

This is why I'm advocating a slow approach over a fast one. We've seen it already with Turing: a big part of the die reserved, high cost for those GPUs, barely a perf/dollar advancement from that gen, and barely a handful of RT titles to use it for.

1. Regarding the conclusions, I don't really see the 6800 XT ($650) as delivering a 100% generational improvement ... nor was it AMD's fastest card, a designation that better fit the Radeon VII.

2. AMD has achieved as close to parity here as we have seen in a long time .... Figuring that a system w/ the 6800 XT might cost say $1650 at the point the snipers are no longer dictating prices, that equates to a 3080 system costing $1700. Outta the box, that's a 3% increase at 1440p as compared to a 3% increase in price. That's pretty much a wash.

3. I don't quite understand the testing comparisons here; the article could use clarification. It's stated that the testing was done at the max power setting, but I wasn't sure if this was Rage Mode or some other setting. Rage Mode was identified by a blue color, but in overclocking, Max Power mode was purple. Both are shown in the fps overclocking graph, but do not appear in the temp, sound, or power consumption tables or performance graphs. "At first I was surprised that I saw no performance gains from overclocking, it seems the power limit is to blame here. Even when overclocked, the power limit will cap the maximum frequencies, so I've set it to the maximum for a second round of OC results." So, for me anyway, I was not sure which performance and other numbers were associated with which of the 3 modes. Perhaps that could be addressed in an update.

4. Comparing OC performance, the 6800 XT "outta the box" did poorly when overclocked, but in "Max Power" mode the 6800 XT did better %-wise than the reference 3080, though the 3080 FE still hit the highest OC in fps.

5. The shocker here is that the AMD card hit 78°C w/ load and OC (and I assume not Max Power), 2°C better than the 3080 FE .... and in an even bigger surprise, AMD hit 31 dBA versus 35 dBA .... This is a significant win for AMD. However, again, it might not be a fair comparison since I'm not sure what operational mode is represented in each graph.

6. Until we see some of the AIB cards reviewed, there's no way to really know which offers the best overclocked performance / power-sound-temp ratio.

7. At this point, if ya could purchase either at MSRP, sound arguments could be made for either card .... I always look at price as a secondary factor because any conclusion drawn based upon "value" is out the window when price adjustments are made. Nvidia would have been foolish to price the card below $700 when they had the only card available. But I wouldn't expect them to match AMD's MSRP until supply catches up w/ demand. I don't expect to be in a position to make a logical recommendation until after the holidays.

Still, most purchases will be decided by "brand" rather than "the numbers". Features or advantages that one side has will be deemed "don't matter" by the other, and vice versa. To my eyes, I prefer to play games using motion blur reduction (ULMB) over adaptive sync, and 17 of 23 games in the test suite can use ULMB at 120 Hz on Nvidia cards with G-Sync monitors. It also makes ray tracing easily obtainable at 60+ fps.

In short, H U G E kudos to AMD here. After not really having a horse in the race other than the 5600 XT (and lower price segments) last generation, they have achieved parity in the top "consumer gaming" segment, something they have not done since 2012. Hopefully they can continue up and down the line. Anxious to see the reviews of the AIB cards that our users will actually buy next year.

Well spoken, sir!
 
Joined
Jul 5, 2013
Messages
26,682 (6.50/day)
So you're going to trade something somewhere for high-cost raycasting, and may end up with great lighting over shitty environments, low draw distance, heavy LOD, etc.
No, what I'm saying is that as hardware performance improvements are made and optimizations in software are made, the hit to performance will become a non-issue, to the point where RT lighting is very much preferred. Also, there are varying degrees to which RT lighting can be applied currently, and to great effect. Improvements will only continue.
 
Joined
Apr 24, 2020
Messages
2,696 (1.67/day)
Hmmm, precomputing lighting clearly works and works well for static lights.

Where RT lights come in are:

1. Less work for the developer
2. Less precomputation
3. Dynamic and/or moving lights become possible (including light reflected off moving surfaces).

Those are the things that were demonstrated with the extra-shiny Stormtrooper demo.

It certainly adds an element of realism. But at the same time, there's an element of over-realism. Because we've never seen dynamic lighting in video games before, such demos overemphasize it... kinda like how "The Wizard of Oz" overemphasized the cartoony colors of color TV back when color TV became a thing.

We will need a few generations of video games before we see "realism". For now, we'll see overly shiny cars and overly shiny helmets that don't really have a realistic atmosphere. Until the artists figure out how to use raytracing correctly, anyway... I really find a lot of the recent demos to be ridiculous.

--------

The best lighting is lighting that you don't notice. Lighting that sets the mood, provides contrast, and draws the eye towards the important elements of the screen. Lighting doesn't necessarily have to be "realistic". Lighting just has to set the mood correctly.

See Pulp Fiction:

[Attached image: Pulp Fiction scene]


The light is behind Samuel L. Jackson's afro, a very unrealistic position when you consider what is going on in the room. But the lighting draws your attention to the scene (the guns, the faces, etc.) while drawing your eye away from the background. That's cinematography right there: not necessarily being "realistic", but using lights to accomplish a goal... a way for the director to communicate with their audience.

A "realistic" light setup for that room would be dimly lit from only the window in the background. It's clear that the room doesn't have any lights in it, so why can we clearly see their faces? Well, that's cinematography done right. It doesn't worry about the details; its #1 goal is communication with the audience. Realism be damned.

--------

Video game lighting gets better and better, and more realistic. But video game directors still don't know how to use lighting to communicate well. At some point, we have to recognize that it's the video game art direction that's the problem, as opposed to the technology.
 
Joined
Apr 24, 2020
Messages
2,696 (1.67/day)
I can't say I've played Control. But it seems like a good discussion point.

[Attached image: Control screenshot]


Let's look at this screen: why is the floor shiny? Well, it's a good demonstration of an RTRT effect, but... does the shininess of the floor really represent anything? (Aside from "My GPU can play this game and your GPU can't".)

[Attached image: second Control screenshot]


Why is there a giant shiny mirror in this room?

-----------

Technology demos are cool (and best demonstrated with something like Minecraft RT or Quake RT). But I feel like the next generation of video games needs to start thinking about the "interpretation" of the language of lighting effects. I still feel like this generation of video games errs on the side of 'Cool Graphics Demo' instead of actual cinematography.

With that being said: I'm looking at some screenshots of Control, and it seems like someone is thinking of cinematography.


This screenshot is pretty good: the reflection on the computer monitor draws your eyes towards it. It's a good use of reflective technology, and it sets the mood very well.
 
Joined
Jul 5, 2013
Messages
26,682 (6.50/day)
Let's look at this screen: why is the floor shiny? Well, it's a good demonstration of an RTRT effect, but... does the shininess of the floor really represent anything? (Aside from "My GPU can play this game and your GPU can't".)
Ah, but you're missing a slight point. Environmental lighting FX can be used as movement cues, i.e. enemies or objects in the environment approaching, and you see their approach before you actually see them. With RT lighting, this effect happens naturally, as it would IRL. But with non-RT lighting, creating that effect would be a serious task and create a great deal of system resource overhead, as has been shown in a multitude of past games that attempted it (to various levels of success).

And for the record, there are many business buildings in the world that have very shiny and reflective floors like what is shown in the picture above. It's not unusual.
 
Joined
Apr 24, 2020
Messages
2,696 (1.67/day)
And for the record, there are many business buildings in the world that have very shiny and reflective floors like what is shown in the picture above. It's not unusual.

I think where I'm going is that super-clean and shiny floors are unusual in the real world (as well as in cinema). Even if we were to go to a high-end business building scene, such as the "Lobby Shootout" scene from The Matrix... the floors are marble: matte and non-shiny.

Honestly, the only time a shiny floor happened in cinema, to my memory, is the "Little Princess" scene where she's mopping a floor. You can see the footsteps of the children in the floor she just mopped: the shininess explicitly calling out how much the other children don't care about the main character anymore.

[Attached image: mopped-floor scene]


I'm sure there are other ones. But... yeah, even looking back at "The Matrix" lobby shootout: that was a matte floor. Shininess is actually pretty rare in cinema and the real world. It's overrepresented in modern RTRT games.

I do admit that the "Lobby Shootout" in The Matrix had shiny columns (granite?) in the background. So shininess can be used to accent special scenes like that, but it shouldn't be used as willy-nilly as today's RTRT demos are doing.
 
Joined
Jul 5, 2013
Messages
26,682 (6.50/day)
Once again, we're at an impasse. However, the point was that AMD's first go at RTRT is solid, if less efficient than Nvidia's latest offerings. The landscape is rosy for RTRT going forward.
 
Joined
Feb 24, 2009
Messages
2,922 (0.51/day)
Location
Riverside, California
Processor AMD Ryzen 7 7800X3D
Motherboard AsRock X670E Phantom Gaming Lightning
Cooling Be Quiet! Dark Rock 4
Memory G.SKILL Trident Z5 Neo DDR5-6000 32GB (2 x 16GB)
Video Card(s) Sapphire Radeon RX 7900 XTX
Storage Samsung 980 PRO Series 1TB, Samsung 980 PRO Series 1TB, Crucial P3 NVMe M.2 2TB
Display(s) LG OLED55G2PUA
Case Lian Li O11 Dynamic XL ROG Certified
Audio Device(s) Digital out to high end dac and amps.
Power Supply EVGA GQ 1000W
Mouse Logitech G600
Keyboard Logitech G413 Carbon
VR HMD Oculus Rift CV1, Oculus Rift S, Quest 2, Quest 3
Software Windows 10 Pro
@$650 I may as well just buy another Nvidia for $49.99 more. I had really hoped they would come in better priced to tempt me.
 
Joined
Oct 15, 2010
Messages
951 (0.19/day)
System Name Little Boy / New Guy
Processor AMD Ryzen 9 5900X / Intel Core I5 10400F
Motherboard Asrock X470 Taichi Ultimate / Asus H410M Prime
Cooling ARCTIC Liquid Freezer II 280 A-RGB / ARCTIC Freezer 34 eSports DUO
Memory TeamGroup Zeus 2x16GB 3200Mhz CL16 / Teamgroup 1x16GB 3000Mhz CL18
Video Card(s) Asrock Phantom RX 6800 XT 16GB / Asus RTX 3060 Ti 8GB DUAL Mini V2
Storage Patriot Viper VPN100 Nvme 1TB / OCZ Vertex 4 256GB Sata / Ultrastar 2TB / IronWolf 4TB / WD Red 8TB
Display(s) Compumax MF32C 144Hz QHD / ViewSonic OMNI 27 144Hz QHD
Case Phanteks Eclipse P400A / Montech X3 Mesh
Power Supply Aresgame 850W 80+ Gold / Aerocool 850W Plus bronze
Mouse Gigabyte Force M7 Thor
Keyboard Gigabyte Aivia K8100
Software Windows 10 Pro 64 Bits
@$650 I may as well just buy another Nvidia for $49.99 more. I had really hoped they would come in better priced to tempt me.

You were never going to buy AMD. You expect an equal-performance GPU that consumes less power, has 6GB more VRAM, and overclocks better for what? $100 less? wtf are you smoking, dude. I'm waiting for the best $200 - $250 GPU from whatever company.
One problem I saw with the low mid-range last gen was that AMD did not have any GPU at $220 - $240 while Nvidia had the 1660, 1660 Super, and 1660 Ti; the next AMD card was the 5600 XT at $280.
It would be awesome if a GPU with 5700/2060 Super performance were released at $220.
 