
AMD Radeon RX 6800 XT

Is it just me, or does the Infinity Cache seem to help with a more consistent frame rate and a lot fewer spikes vs. the 3080?
Yeah, IQR values seem to be a bit lower in most games. Not sure if it's the L3 cache, but it could be, since it reduces latency for many memory fetches.
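For anyone unfamiliar with the metric being thrown around: the IQR (interquartile range) of frame times is one way to put a number on "consistency"; a tighter IQR means fewer spikes. Here's a minimal sketch with made-up frame-time samples (a real capture would come from a tool like CapFrameX or PresentMon; the numbers below are purely illustrative):

```python
# IQR of frame times: the spread between the 25th and 75th percentiles.
# Smaller IQR = more consistent frame pacing, fewer visible spikes.
def iqr(samples):
    s = sorted(samples)
    def quantile(p):
        i = p * (len(s) - 1)          # linear interpolation between ranks
        lo, hi = int(i), min(int(i) + 1, len(s) - 1)
        return s[lo] + (s[hi] - s[lo]) * (i - lo)
    return quantile(0.75) - quantile(0.25)

steady = [16.6, 16.7, 16.5, 16.8, 16.6, 16.7]   # tight pacing, ~60 fps
spiky  = [16.6, 14.0, 22.0, 15.5, 25.0, 16.0]   # similar average, visible stutter
```

Two runs can have nearly identical average fps while the spikier one feels far worse, which is exactly what IQR captures and average-fps charts hide.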
 
Yeah, no: if you want 4K at ultra, 10 GB won't be enough. DLSS, much like PhysX, will probably only be available in a handful of titles. And it isn't slower; it's about the same while overclocking much further and using much less energy. I also have a Sony X900H 4K@120 Hz to feed, and a 16 GB, highly overclockable 6800 XT is a much better deal, now and even more so in the future.
Well, I have a 2080 Ti, and a 1080 Ti in my wife's computer. I never saw GPU memory usage above 8 GB, and it doesn't often go above 5 GB, aside from Horizon: Zero Dawn, which seems to reserve the whole GPU memory pool immediately after launching the game. I also don't have the "ultra" fetish; I care more about the framerate, and it seems that the 3080's faster memory works better at high resolutions. Even if it's just 5%, slower is slower, and there is no point in spending money on an inferior product when the difference in price is equivalent to a good meal in a restaurant.
 
Confirmed: this thread has been bombarded by trolls.
Clearly, DLSS is an Nvidia-exclusive feature, yet they keep comparing RT performance (20 games? hello...) while somehow ignoring their earlier pride in performance, power consumption, and heat.
hello...

oh yes. winter is coming.

Testament to how worried the Nvidia shills are about their cards... the power efficiency of Ampere looks awful compared to RDNA2; how the tables have turned :)

Also, RT performance on the level of the 2080 Ti is absolutely fantastic, even if RT is almost totally irrelevant to me because of how few games use it. Nice to see, though.
 

Shills complaining about shills is ridiculous. Performance in the most important new technology on par with the competitor's *last gen* isn't impressive. Come on! And once you're arguing "fps/watt" on *enthusiast* forums, it's a lost battle. Nvidia is in no way "in trouble". At all. The 6800 is decent, and competitive, but objectively slower *where it matters*, is even *more* unbuyable, and is nearly as expensive. And getting the most out of it depends on proprietary tricks that require an unbuyable CPU and a 500-series chipset. Only a true blind fanboy can call this some kind of huge win.

This gen is a mess on *both* sides. Too expensive. Still not significant enough gains. Paper launches. "Team anything" = idiocy.

Most new games, being based on the consoles, will have very light RT effects, and will all be using the "AMD standard" as used on consoles. I wouldn't be worried.

That's not how this works at all. There is no "AMD standard on consoles". The Xbox uses DirectX 12's DXR, just like PCs. The PS5 uses a version of the Vulkan API they licensed. AMD supports both by running RT on the card's compute units; RTX is dedicated hardware which also supports DXR (obviously). Why people think this is "all the same because AMD" is a really fundamental misunderstanding of platform architecture, APIs, and hardware integration. PC will get *Xbox* ports, therefore DXR, therefore compatible with Nvidia with little to no effort. And given Nvidia is far stronger in RT, where there *are* delays, I will bet you money that when the "Nvidia optimized" version follows, it will be noticeably superior. Because the RT hardware is better. And the effects come via the API. And tuning them isn't expensive. And lots of RT enthusiasts *already own Nvidia*.
 
Woe to You Oh Earth (NVidia) and Sea(Intel),

for the Devil (PowerColor Radeon RX Red Devil) sends the beast with wrath,

because he knows the time is short (Cyberpunk 2077 December Launch).

Let him who hath understanding (AMD Fanboys like me..) :oops:

reckon the number of the beast,

for it is a human number..

its number is 6900XT!!!!!!!!!! :rockout::rockout::rockout::clap::clap::peace:
 
LMAO !!!!
 
My takeaway from it is this: is it really a problem? Whether you buy anything from a 3060 Ti to a 3090, or a 6800 to a 6900 XT, they're ALL pretty damn fast.

I won't be losing any sleep over $50 here or a 20% FPS difference there... Next year, then the year after, there will be something faster again, and I'll buy that.

If you think you're future-proofing by buying a 3080 right now, then you're probably mistaken, as there are likely to be some big leaps over the next few years, and you can't be worrying about that. If you need an upgrade now, buy one. If you don't, then maybe you just want to splash some cash, and if so, just buy whatever you want.

I am losing sleep over ray tracing, not $50... if AMD had good RT performance, it would have been easy to choose. So I have to wait till Dec 8th to see what the 6900 XT can do, and what the 3080 Ti can do in January. I have waited a year to replace my Vega 64; two more months I can surely wait, instead of buying the wrong card and regretting it again.
 
1. Regarding the conclusions, I don't really see the 6800 XT ($650) as delivering a 100% generational improvement... nor is it AMD's fastest card, a designation that better fit the Radeon VII.

2. AMD has achieved as close to parity here as we have seen in a long time. Figuring that a system with the 6800 XT might cost, say, $1,650 at the point the snipers are no longer dictating prices, that equates to a 3080 system costing $1,700. Out of the box, that's a 3% increase in performance at 1440p as compared to a 3% increase in price. That's pretty much a wash.

3. I don't quite understand the testing comparisons here; the article could use clarification. It's stated that the testing was done at the max power setting, but I wasn't sure if this was Rage Mode or some other setting. Rage Mode was identified by a blue color, but in overclocking, Max Power mode was purple. Both are shown in the fps overclocking graph, but they do not appear in the temperature, sound, or power consumption tables or performance graphs. "At first I was surprised that I saw no performance gains from overclocking, it seems the power limit is to blame here. Even when overclocked, the power limit will cap the maximum frequencies, so I've set it to the maximum for a second round of OC results." So, for me anyway, I was not sure which performance and other numbers were associated with which of the three modes. Perhaps that could be addressed in an update.

4. Comparing the OC performance, the 6800 XT out of the box did poorly when overclocked, but in "Max Power" mode the 6800 XT did better percentage-wise than the reference 3080, though the 3080 FE still hit the highest OC in fps.

5. The shocker here is that the AMD card hit 78°C with load and OC (and, I assume, not Max Power), 2°C better than the 3080 FE... and in an even bigger surprise, AMD hit 31 dBA versus 35 dBA. This is a significant win for AMD. However, again, it might not be a fair comparison, since I'm not sure which operational mode is represented in each graph.

6. Until we see some of the AIB cards reviewed, there's no way to really know which offers the best overclocked performance / power-sound-temperature ratio.

7. At this point, if you could purchase either at MSRP, sound arguments could be made for either card. I always treat price as a secondary factor, because any conclusion drawn based upon "value" is out the window when price adjustments are made. Nvidia would have been foolish to price the card below $700 when theirs was the only card available, but I wouldn't expect them to match AMD's MSRP until supply catches up with demand. I don't expect to be in a position to make a logical recommendation until after the holidays.

Still, most purchases will be decided by "brand" rather than "the numbers". Features or advantages that one side has will be deemed "don't matter" by the other, and vice versa. To my eyes, I prefer to play games using motion blur reduction (ULMB) over adaptive sync, and 17 of the 23 games in the test suite can use ULMB at 120 Hz on Nvidia cards with G-Sync monitors. It also makes ray tracing easily obtainable at 60+ fps.

In short, H U G E kudos to AMD here. After not really having a horse in the race other than the 5600 XT (and lower price segments) last generation, they have achieved parity in the top "consumer gaming" segment, something they have not done since 2012. Hopefully they can continue that up and down the line. Anxious to see the reviews of the AIB cards that our users will actually buy next year.
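The "wash" in point 2 checks out with quick arithmetic. The system prices here are the post's hypotheticals, and the ~3% 1440p gap is the figure cited above, so treat all the numbers as illustrative:

```python
# Point 2's claim: ~3% more money buys ~3% more performance, i.e. a wash.
amd_system = 1650   # hypothetical 6800 XT build cost
nv_system  = 1700   # hypothetical 3080 build cost

price_premium = nv_system / amd_system - 1   # ~3.0% more expensive
perf_premium  = 0.03                         # ~3% faster at 1440p, per the post

# A ratio near 1.0 means neither build is the clear perf-per-dollar winner.
value_ratio = (1 + perf_premium) / (1 + price_premium)
```

Once street prices drift from MSRP, both premiums move and the "wash" conclusion goes with them, which is exactly the point about value arguments being fragile.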
 
Only the purple bar in the OC gains chart was done at the maximum power limit with a manual OC, just like it says in the title.
In that same chart, Rage Mode was not used at all; the green bar is the manual OC with the power limit at stock (i.e., the purple bar's OC, but with the stock power limit).
The Rage Mode results in the rest of the review were with everything at stock, just Rage Mode activated.

Under the hood, Rage Mode is a profile that's stored in the BIOS. It has four vendor-defined values that Rage Mode overrides: power limit, temperature target, RPM target, and acoustic limit. Note: no change to clocks or voltage.
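To make that override behavior concrete, here's a toy sketch of the description above. The field names and numbers are illustrative only, not AMD's actual BIOS layout:

```python
# Stock profile: four vendor-defined limits plus clock/voltage settings.
stock = {
    "power_limit_w":      300,
    "temp_target_c":       90,
    "fan_rpm_target":    1500,
    "acoustic_limit_rpm": 2000,
    "boost_clock_mhz":   2250,   # NOT touched by Rage Mode
    "voltage_mv":        1150,   # NOT touched by Rage Mode
}

# Rage Mode overrides exactly the four vendor-defined limits, nothing else.
rage_overrides = {
    "power_limit_w":      330,
    "temp_target_c":       95,
    "fan_rpm_target":    1800,
    "acoustic_limit_rpm": 2300,
}

rage_profile = {**stock, **rage_overrides}
```

The key point from the post survives the sketch: clocks and voltage come through unchanged, so Rage Mode is a limits tweak, not an overclock.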
 
No, Nvidia is not in trouble; AMD is just where they should be, or should have been, years ago. Thanks to every single Zen generation being successful, and with future Zen designs all lined up and ready to go, AMD has given its Radeon Technologies Group new life with the success of RDNA2. RDNA3 is looking very good, and speculators are already talking about RDNA4. It seems AMD is following its Zen style of releases: a complete design overhaul for each generation. :peace:
 
Bit late to the party, but does anyone have info about possible RT performance improvements in the future?
See, my thinking is that games today have been made using Nvidia's RT implementation because... well, it was the only option.

And now AMD is out with their take, and performance is a bit lackluster. This might just be down to the hardware, but could it be that in the future, when devs build games with RT based on AMD's take on it (also used in the consoles), performance in future games will be better relative to Nvidia?
 

Much like tessellation, I'm sure we will see ray tracing become adjustable, and future hardware implementations become faster with less overhead; more shared resources between geometry and ray tracing will make it almost penalty-free as game engines and hardware both get better.

But for a few years yet, either implementation has a performance cost, and neither is truly pure ray tracing.
 
Bit late to the party, but does anyone have info about possible RT performance improvements in the future?
This is AMD's first go at RTRT, and while it's respectable, they need to and will improve. Despite the naysaying and whining by some, RTRT is the future of lighting FX, and AMD will improve as they refine.
 
This is AMD's first go at RTRT, and while it's respectable, they need to and will improve. Despite the naysaying and whining by some, RTRT is the future of lighting FX, and AMD will improve as they refine.

If you ask me, they should be improving on the software side, not the hardware side. They balanced their raster and RT performance per shader quite well for this moment in time; if they can just scale it up in future generations along with shader count and die size, there will be sufficient RT perf on tap. I mean, how much are we prepared to lose over those stupid rays? There is a point of diminishing returns, and it's not like Nvidia is winning the efficiency crown with the larger slice of die space they reserve for RT/Tensor as it is. Part of that is node, but not all of it: Nvidia has the larger die, but is still less efficient (520 mm² vs. 628 mm², 6800 XT vs. 3080).

In the end it's really a balancing act: how much raster perf will you sacrifice per shader to enable ray calculations? It's one or the other; the ideal situation would be a new type of shader that could happily switch between operations. A step closer to a CPU...
 
In the end it's really a balancing act: how much raster perf will you sacrifice per shader to enable ray calculations?
IMHO, 100%. RT is far better than raster lighting where quality and realism are concerned. I would personally love to see non-RT lighting disappear.
 
IMHO, 100%. RT is far better than raster lighting where quality and realism are concerned. I would personally love to see non-RT lighting disappear.

But that's just lighting, and it's not exactly expensive to do that with raster. So you're going to trade something somewhere for high-cost raycasting, and you may end up with great lighting over shitty environments, low draw distance, heavy LOD, etc.

This is why I'm advocating a slow approach over a fast one. We've already seen it with Turing: a big part of the die reserved, a high cost for those GPUs, barely a perf/dollar advancement from that gen, and barely a handful of RT titles to use it on.

In short, H U G E Kudos to AMD here. After not really having a horse in the race other than the 5600 XT (and lower price segments) last generation, they have achieved parity in the top "consumer gaming" segment, something they have not done since 2012.

Well spoken, sir!
 
So you're going to trade something somewhere for high cost raycasting, and may end up with great lighting over shitty environments, low draw distance, heavy LOD, etc.
No, what I'm saying is that as hardware performance improvements are made, and optimizations in software are made, the hit to performance will become a non-issue, to the point where RT lighting is very much preferred. Also, there are varying degrees to which RT lighting can be applied currently, and to great effect. Improvements will only continue.
 
Hmmm, precomputing lighting clearly works, and works well, for static lights.

Where RT lights come in are:

1. Less work from the developer
2. Less precomputing
3. Dynamic and/or moving lights can become possible (including reflected lights off of moving surfaces).

Those are the things that were demonstrated with the extra-shiny Stormtrooper demo.

It certainly adds an element of realism. But at the same time, there's an element of over-realism. Because we've never seen dynamic lighting like this in video games before, such demos overemphasize it... kinda like how "The Wizard of Oz" overemphasized the cartoony colors of color TV back when color TV became a thing.

We will need a few generations of video games before we see "realism". For now, we'll see overly shiny cars and overly shiny helmets that don't really have a realistic atmosphere, until the artists figure out how to use ray tracing correctly anyway... I really find a lot of the recent demos ridiculous.

--------

The best lighting is lighting that you don't notice. Lighting that sets the mood, provides contrast, and draws the eye towards the important elements of the screen. Lighting doesn't necessarily have to be "realistic"; lighting just has to set the mood correctly.

See Pulp Fiction:

(attached image: still from Pulp Fiction)


The light is behind Samuel L. Jackson's afro, a very unrealistic position when you consider what is going on in the room. But the lighting draws your attention to the scene (the guns, the faces, etc.) while drawing your eye away from the background. That's cinematography right there: not necessarily being "realistic", but using lights to accomplish a goal... a way for the director to communicate with their audience.

A "realistic" light setup for that room would be dimly lit from only the window in the background. It's clear that the room doesn't have any lights in it, so why can we clearly see their faces? Well, that's cinematography done right. It doesn't worry about the details; its #1 goal is communication with the audience. Realism be damned.

--------

Video game lighting gets better and better, and more realistic. But video game directors still don't know how to use lighting to communicate well. At some point, we have to recognize that it's the video game art direction that's the problem, as opposed to the technology.
 
I can't say I've played Control. But it seems like a good discussion point.

(attached image: screenshot from Control)


Let's look at this screen: why is the floor shiny? Well, it's a good demonstration of an RTRT effect, but... does the shininess of the floor really represent anything? (Aside from "My GPU can play this game and your GPU can't.")

(attached image: screenshot from Control)


Why is there a giant shiny mirror in this room?

-----------

Technology demos are cool (and best demonstrated with something like Minecraft RTX or Quake II RTX). But I feel like the next generation of video games needs to start thinking about the "interpretation" of the language of lighting effects. I still feel like this generation of video games errs on the side of "cool graphics demo" instead of actual cinematography.

With that being said: I'm looking at some screenshots of Control, and it seems like someone is thinking of cinematography.


This screenshot is pretty good: the reflection on the computer monitor draws your eyes towards it. It's a good use of reflective technology, and it sets the mood very well.
 
Let's look at this screen: why is the floor shiny? Well, it's a good demonstration of an RTRT effect, but... does the shininess of the floor really represent anything? (Aside from "My GPU can play this game and your GPU can't.")
Ah, but you're missing a slight point. Environmental lighting FX can be used as movement cues, i.e., enemies or objects in the environment approaching, so you see their approach before you actually see them. With RT lighting, this effect happens naturally, as it would IRL. But with non-RT lighting, creating that effect would be a serious task and would create a great deal of system resource overhead, as has been shown in a multitude of past games that attempted it (to various levels of success).

And for the record, there are many business buildings in the world that have very shiny and reflective floors like what is shown in the picture above. It's not unusual.
 
And for the record, there are many business buildings in the world that have very shiny and reflective floors like what is shown in the picture above. It's not unusual.

I think where I'm going is that super-clean, shiny floors are unusual in the real world (as well as in cinema). Even if we go to a high-end business building scene, such as the "Lobby Shootout" from The Matrix, the floors are marble: matte and non-shiny.

Honestly, the only time I remember a shiny floor in cinema is the scene in "A Little Princess" where she's mopping a floor. You can see the children's footsteps in the floor she just mopped: the shininess explicitly calling out how much the other children don't care about the main character anymore.

(attached image: still from A Little Princess)


I'm sure there are other ones. But... yeah, even looking back at The Matrix lobby shootout: that was a matte floor. Shininess is actually pretty rare in cinema and the real world. It's overrepresented in modern RTRT games.

I do admit that the lobby shootout in The Matrix had shiny columns (granite?) in the background. So shininess can be used to accent special scenes like that, but it shouldn't be used as willy-nilly as today's RTRT demos use it.
 
Once again, we're at an impasse. However, the point was that AMD's first go at RTRT is solid, if less efficient than Nvidia's latest offerings. The landscape is rosy for RTRT going forward.
 
At $650, I may as well just buy another Nvidia for $49.99 more. I had really hoped they would come in better priced, to tempt me.
 
At $650, I may as well just buy another Nvidia for $49.99 more. I had really hoped they would come in better priced, to tempt me.

You were never going to buy AMD. You expect an equal-performance GPU that consumes less power, has 6 GB more VRAM, and overclocks better, for what? $100 less? Wtf are you smoking, dude. I'm waiting for the best $200-$250 GPU from whichever company.
One problem I saw with the low-midrange last gen was that AMD did not have any GPU at $220-$240, while Nvidia had the 1660, 1660 Super, and 1660 Ti; the next AMD card up was the 5600 XT at $280.
It would be awesome if a GPU with 5700/2060 Super performance were released at $220.
 