
AMD Announces the Radeon RX 6000 Series: Performance that Restores Competitiveness

I keep being surprised by how people are comparing image quality in games - especially fast-paced ones - by cropping out tiny areas, often from the corners of the screen or similar places where the player is unlikely to focus for much of the time, and often upscaling them just to shout "look, this one is a bit less sharp!"

If the cropped-out area looks worse, isn't it a logical conclusion that the full image looks worse as well? The problem is that even a tiny bit of blur gets amplified by post-processing, and by the fact that most displays have lower effective resolution when the image is moving.
 
Cool! I can't wait for W1zzard's review where he demonstrates how inferior these cards are compared to NVIDIA.
 
I happen to dislike paid shills.


You mean, when was it that they got caught?
Easy.
The Doom Eternal demo from the Ampere super-exclusive-totallynotforshilling preview.
... so getting exclusive access to hardware for a limited preview makes them shills? They were very explicit about the limitations placed on them for that coverage, and that it in no way amounted to a full review. Heck, it was repeated several times. Getting an exclusive preview of a new technology is a great scoop for any tech journalist, and they would all do it if they were asked - possibly with the exception of GN. This does tell us that they don't have extremely strict editorial standards, but it in no way qualifies them as paid shills. That you go that far in your judgement just makes it clear that you're seeing black and white where there are tons of shades of gray. That's your problem, not DF's.
The stone that is at the same focal distance as the bush is sharper in the below pic.
Like I said: the below pic has sharper grass (and the stone that's sitting in the grass), the top pic has more detailed leaves. I have no idea which of the two is DLSS, so I'm not arguing for or against either, just that your example is rather poor. You could say the bottom pic has slightly better sharpness overall, but it's essentially impossible to judge as it's clear that the focal points of the two shots are different. As for the stone being "at the same focal distance" - what? Do you mean the same distance from the camera? That's not the same thing ...
 
The Xbox One X works with HDMI VRR just like FreeSync, even with LFC.


I guess it will be the same with the next consoles and video cards based on the same RDNA 2 GPU.
It works, but it is not the same. FreeSync offers more, like Low Framerate Compensation (FreeSync Premium). HDMI VRR doesn't have that.
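For context on what LFC actually adds: when the game's frame rate drops below the display's minimum VRR refresh rate, the driver repeats each frame enough times that the panel's effective refresh stays inside the VRR window. A minimal sketch of that idea, assuming a hypothetical display with a 48-120 Hz VRR range (illustrative numbers, not tied to any specific TV):

```python
# A minimal sketch of the idea behind Low Framerate Compensation (LFC).
# Assumed, illustrative numbers: a display with a 48-120 Hz VRR window.
# When the game frame rate falls below the window's minimum, each frame is
# repeated enough times that the effective refresh rate lands back inside
# the window, avoiding the fallback to stutter or tearing.

VRR_MIN_HZ = 48
VRR_MAX_HZ = 120

def lfc_refresh(game_fps: float) -> tuple[int, float]:
    """Return (frame repeat multiple, effective refresh rate in Hz)."""
    if game_fps >= VRR_MIN_HZ:
        return 1, game_fps  # already inside the VRR window, nothing to do
    multiple = 1
    while game_fps * (multiple + 1) <= VRR_MAX_HZ and game_fps * multiple < VRR_MIN_HZ:
        multiple += 1
    return multiple, game_fps * multiple

for fps in (100, 45, 30, 20):
    m, hz = lfc_refresh(fps)
    print(f"{fps} fps -> each frame shown {m}x, panel refreshes at {hz:.0f} Hz")
```

This is also why LFC needs the maximum refresh rate to be at least roughly double the minimum: otherwise there is no whole multiple of a low frame rate that lands back inside the window.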
 
If the cropped-out area looks worse, isn't it a logical conclusion that the full image looks worse as well? The problem is that even a tiny bit of blur gets amplified by post-processing, and by the fact that most displays have lower effective resolution when the image is moving.
Not necessarily: most game cameras have some form of distortion towards the edges of the image (they are made to emulate real lenses to some degree, and the image is a 2D projection of a more or less spherical view, so distortion towards the edges is unavoidable), meaning the centre of the image is likely to be a tad sharper. Your point about displays having lower effective resolution also goes against this mattering - if your display can't show the difference, it won't matter whether the source image is sharper. It's obvious that post-processing exacerbates blur, but again the question becomes whether or not it's noticeable.
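To put a rough number on that edge stretching: with the standard rectilinear projection games use, a point at field angle θ from the screen centre lands at x = f·tan θ on the image plane (f being the projection's focal length in pixel units), so the local magnification grows towards the edges. This is plain projection geometry, nothing specific to these screenshots:

```latex
x = f\tan\theta, \qquad \frac{dx}{d\theta} = \frac{f}{\cos^{2}\theta}
```

At θ = 30° that derivative is f/0.75 ≈ 1.33f, i.e. the same one-degree slice of the scene is spread over roughly a third more pixels than at the centre. Edge content is effectively magnified, which, for the same underlying texture detail, tends to read as slightly softer.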
 
Performance is good, price not so much, sigh.

Take the RX 6800: it beats the 2080 Ti, so it will be a bit faster than an RTX 3070, but it also costs $580 vs $500 for the RTX 3070, so not really a clear winner in the "what to buy" discussion.

And it has 16 GB of VRAM compared to the 3070's 8 GB ... I see nothing but a clear winner here.
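For what it's worth, the value arithmetic in the quoted post is easy to sanity-check. A quick sketch with the two list prices; the performance gap is an explicit placeholder here, since that is exactly the number the independent reviews still need to establish:

```python
# Rough perf-per-dollar check using the list prices from the post.
# The performance delta is a placeholder assumption, NOT a measured result.
price_rx6800 = 580          # USD, AMD's announced price
price_rtx3070 = 500         # USD, Nvidia's announced price
assumed_perf_delta = 0.10   # "a bit faster" taken as +10% (placeholder)

price_premium = price_rx6800 / price_rtx3070 - 1
relative_value = (1 + assumed_perf_delta) / (1 + price_premium)

print(f"price premium over the 3070: {price_premium:.0%}")    # 16%
print(f"perf per dollar vs the 3070: {relative_value:.2f}x")  # below 1.0 = worse value
```

With that placeholder the 6800 lands slightly behind on raw performance per dollar, which is the quoted poster's point; the 16 GB vs 8 GB of VRAM is the counterweight raised in the reply above.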
 
It works, but it is not the same. FreeSync offers more, like Low Framerate Compensation (FreeSync Premium). HDMI VRR doesn't have that.
... that video is explicitly about how, on the LG CX, you don't need FreeSync Premium activated for LFC to work. Did you even read the title, let alone watch it?
 
Unless Nvidia comes out with a Super edition that burns AMD both in RAM capacity and with a boost in performance, I don't see it winning this one. 300 watts for the same or better performance than Ampere... paired with a Ryzen 5, these will be overclocking beasts. It only goes to show how much Ampere is underperforming and how much it was actually a calculated cut from what it should have been.
 
LOL, I'm in trouble! When AMD released the Radeon VII, it was the only model with 16 GB, and that for sure was marketing. But when they offer the entire mid and high-end classes with 12 GB and 16 GB of VRAM, that is not just marketing!
 
... that video is explicitly about how, on the LG CX, you don't need FreeSync Premium activated for LFC to work. Did you even read the title, let alone watch it?
Apparently I need to catch up since I've read different news saying it can't do it.
 
What is with you and hating on Digital Foundry? Can you explain how they in any way whatsoever are "shills" for anyone? They do in-depth technical reviews with a much higher focus on image quality and rendering quality than pretty much anyone else. Are they enthusiastic about new features that have so far been Nvidia exclusive? Absolutely, but that proves nothing more than that they find these features interesting.
I'll bite for discussion's sake and because their paid review after the Nvidia launch has really upset me.

They are smart people who understand very well how what they are saying might influence the buying decisions of their audience.
Look at this video:
Instead of having a disclaimer that the exact settings for the comparisons were given by Nvidia, and that with any other combination of settings (resolution, RTX/DLSS, detail) the results may be quite different, he's simply hyping up the card most of the time, talking about feelings and stuff. That's nowhere near the moral standards of Hardware Unboxed, Gamers Nexus or TPU.
Hate is a very strong word, but I dislike the fact that they, like many other review sites and YouTubers, are becoming influencers instead of reviewers.
 
Put simply: if you have to do that, you also need a very good screen
No.

You can "feel" things before you can easily point them out on screen, for starters.

In the Ars Technica "quickly moving mouse" example, the entire screen is blurred; no "zoom in" is required. It just helps to focus attention on a particular area of the screen.



... so getting exclusive access to hardware for a limited preview makes them shills?
That made them suspect.
The actual misleading video confirmed that they are indeed shills.



Like I said: the below pic has sharper grass
The below picture is a blurred-out mess, of the kind that you could also see in the other examples.
 
I'll bite for discussion's sake and because their paid review after the Nvidia launch has really upset me.

They are smart people who understand very well how what they are saying might influence the buying decisions of their audience.
Look at this video:
Instead of having a disclaimer that the exact settings for the comparisons were given by Nvidia, and that with any other combination of settings (resolution, RTX/DLSS, detail) the results may be quite different, he's simply hyping up the card most of the time, talking about feelings and stuff. That's nowhere near the moral standards of Hardware Unboxed, Gamers Nexus or TPU.
Hate is a very strong word, but I dislike the fact that they, like many other review sites and YouTubers, are becoming influencers instead of reviewers.
I agree that the video was a bit dubious, but no disclaimer? Hm. I think I read their piece rather than watched the video, but their written pieces are typically identical to their on-screen scripts. That piece contains this (second) paragraph:
Full disclosure: I can bring you the results of key tests today, but there are caveats in place. Nvidia has selected the games covered, for starters, and specified 4K resolution to remove the CPU completely from the test results and in all cases, settings were maxed as much as they could be. The games in question are Doom Eternal, Control, Shadow of the Tomb Raider, Battlefield 5, Borderlands 3 and Quake 2 RTX. Secondly, frame-time and frame-rate metrics are reserved for the reviews cycle, meaning our tests were limited to comparisons with RTX 2080 (its last-gen equivalent in both naming and price) and differences had to be expressed in percentage terms, meaning some slight re-engineering of our performance visualisation tools. This work actually proved valuable and the new visualisations will be used elsewhere in our reviews - differences in GPU power do tend to be expressed as percentages, after all.
 
Uh ... only if you actually care about that kind of stuff. Which the average gamer does not whatsoever. This is quite limited enthusiast knowledge. Most gamers' level of knowledge about technical features is more or less on the level of "does the game run on my hardware, y/n?".

You are heavily underestimating young people here (and to be clear, I'm 48 years old). We are speaking about computer gamers here, and many of them know about their hardware.
By the way, my entire point was that speaking about "14-year-old players" is silly...

No cherry picking here. The only reason not to trust AMD's data here is that they themselves did the testing. The games used, settings used and hardware setups used are all publicized in the slide deck, if you bothered to read. There's no indication that the games they picked are disproportionately favoring AMD GPUs - shown at least partly by the relatively broad span of results in their testing, including games where they lose. Could you imagine Nvidia showing a performance graph from a game where they weren't the fastest? Yeah, there's a difference of attitude here, and AMD marketing for the past couple of years (since Raja Koduri left RTG and Dr. Su took over control there, at least) has been impressively trustworthy for first-party benchmarks.

I didn't even write about not trusting them :rolleyes:
The level of aggressiveness in this thread by AMD supporters is staggering.
I just said we need the independent reviews to actually understand the real performance of the hardware, because the data shown IS cherry-picked (the whole situation is cherry-picked, being a marketing presentation) and far from complete.
I didn't say they were lying. They are just showing one part of the story, and it is perfectly understandable. To know how much the 6800 is better than the 3070 on an average system (one without a Zen 3), we need the reviews.

As for Fortnite being full of 14-year-olds: that's a given. No, the majority aren't 14 - that would be really weird. But there are tons of kids playing it, and likely the majority of players are quite young. Add to that the broad popularity of the game, and it's safe to assume that the average Fortnite player has absolutely no idea what DLSS is. Heck, there are likely more people playing Fortnite on consoles and phones than people in the whole world who know what DLSS is.

Being full of 14-year-olds doesn't mean the majority are 14, as claimed above.
That was my point.
Clearly we are speaking just about PC players here, because people playing Fortnite on a PlayStation, a Nintendo console or a smartphone could well be totally unaware of PC technologies.
 
I agree that the video was a bit dubious, but no disclaimer? Hm. I think I read their piece rather than watched the video, but their written pieces are typically identical to their on-screen scripts. That piece contains this (second) paragraph:
Ahh, yeah, watch the first 5 minutes of the video ;) - the tone and wording are quite different, IMHO.
 
Videos, with known paid shills like DF hyping it.
And then there are eyes to see and brains to use, god forbid.
And even some articles to be found, with, god forbid, actual pics:



At the end of the day, it is a cherry picking game:

View attachment 173721

DLSS 2.0 (these are shots from a picture that hypes DLSS, by the way):

View attachment 173722


But it demonstrates how delusional the DLSS hypers are. Like any other image upscaling tech, it has its ups and downs.
With 2.0 it is largely the same as with the TAA it is based on: it gets blurry, it wipes out small details, and it struggles with quickly moving objects.
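For anyone wondering why a TAA-derived technique behaves like that: these methods blend each new frame into a reprojected history buffer, so detail accumulates over several frames when the content holds still, and lags behind when it doesn't. The single-value sketch below is purely illustrative (it is not DLSS's or any engine's actual code), but it shows both the convergence and the smear-on-change behaviour:

```python
# Illustrative single-pixel sketch of TAA-style temporal accumulation
# (not any specific engine's or DLSS's actual implementation).
# Each frame, the new sample is blended with the reprojected history. With
# static content the value converges (aliasing and noise average out); when
# the content changes quickly, the stale history lags behind, which shows up
# as blur or ghost trails until the history catches up or is rejected.

def accumulate(history: float, current: float, alpha: float = 0.1) -> float:
    """Exponential blend: keep 90% of history, take 10% of the new sample."""
    return (1.0 - alpha) * history + alpha * current

# Static content: the history converges towards the true value.
value = 0.0
for _ in range(30):
    value = accumulate(value, 1.0)
print(f"static pixel after 30 frames: {value:.3f}")   # close to 1.0

# Sudden change (fast motion / disocclusion): the blend lags behind.
for frame in range(5):
    value = accumulate(value, 0.0)
    print(f"frame {frame} after the change: {value:.3f}")
```

Real implementations fight that lag with motion-vector reprojection and history clamping or rejection, but the trade-off the post describes remains: either stale detail ghosts in, or it gets thrown away and the moving region looks softer.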
According to your specs list you DON'T have any hardware that supports DLSS.
I have.
I have seen it with my own eyes in several titles. DLSS + RT are something I like to have on my GPU, not the blurry mess you are trying to demonstrate.
 
Being full of 14-year-olds doesn't mean the majority are 14, as claimed above.
That was my point.
Clearly we are speaking just about PC players here, because people playing Fortnite on a PlayStation, a Nintendo console or a smartphone could well be totally unaware of PC technologies.
I'm pretty sure most people playing FN are kids.

I play with my kid and his friends all the time, and I can see by the nicknames that most of our adversaries are very young, too. The statistics are skewed, because my kid and many of his friends play on their parents' accounts.

Anyway, the discussion about RTX in FN is pointless: the game is very playable on old hardware, and anyone playing it remotely competitively puts shadows and post effects on "LOW", otherwise they cannot aim accurately while in the thick of the battle.
 
I'm pretty sure most people playing FN are kids.

I play with my kid and his friends all the time, and I can see by the nicknames that most of our adversaries are very young, too. The statistics are skewed, because my kid and many of his friends play on their parents' accounts.

Anyway, the discussion about RTX in FN is pointless: the game is very playable on old hardware, and anyone playing it remotely competitively puts shadows and post effects on "LOW", otherwise they cannot aim accurately while in the thick of the battle.

Yeah, FN has become my bonding time with my kids as well due to Covid. My son plays on a 1050 Ti, and when he comes over to my PC (RTX 2080), he asks why his graphics look so much worse. I just told him recently that he can have my PC soon, as I am getting a 3080.
 
You are heavily underestimating young people here (and to be clear, I'm 48 years old). We are speaking about computer gamers here, and many of them know about their hardware.
By the way, my entire point was that speaking about "14-year-old players" is silly...
Sorry, but you are very much overestimating the knowledge of the average PC gamer. As people who frequent this and other forums will no doubt agree, the average non-hardware-enthusiast neither knows nor cares about features like this, and if they know anything, it is typically a poorly informed opinion mostly based on marketing and/or a reddit-based game of telephone where what comes out the other end is rather inaccurate.
I didn't even write about not trusting them :rolleyes:
You said they were cherry-picked benchmarks. That means the benchmarks were picked to make them look good, not to be accurate. That makes them unreliable, which means you can't trust them. So yes, you did write about not trusting them.
The level of aggressiveness in this thread by AMD supporters is staggering.
I just said we need the independent reviews to actually understand the real performance of the hardware, because the data shown IS cherry-picked (the whole situation is cherry-picked, being a marketing presentation) and far from complete.
I didn't say they were lying. They are just showing one part of the story, and it is perfectly understandable. To know how much the 6800 is better than the 3070 on an average system (one without a Zen 3), we need the reviews.
I don't mean to come off as aggressive, so sorry about that. But IMO you're using "cherry picked" wrong. It means to pick what is/looks best or most desirable, so that implies that they are leaving out (potentially a lot of) worse-looking results. Going by recent history from AMD product launches (Zen+, Zen 2, RDNA 1, Renoir), their data has been relatively reliable and in line with reviews. Their numbers also include games where they tie or lose to Nvidia, which while obviously not any kind of proof that these aren't best-case numbers, is a strong indication that they're not just picking out results that make them look good. As I said, there is one reason to not trust these numbers: the fact that they weren't produced by a reliable third party. Beyond that, recent history, the selection of games (broad, including titles where they both win and lose), the relatively detailed test setup notes, and the use of standard settings levels rather than weirdly tuned "AMD optimal" settings (see the Vega launch) are all reasons why one could reasonably expect these numbers to be more or less accurate. Of course the use of games without built-in benchmarks means that numbers aren't directly comparable to sites using the same games but different test scenarios, but that doesn't make the numbers unreliable, just not comparable. I am obviously still not for pre-ordering or even taking this at face value, but your outright dismissal is too harsh. I would be very surprised if these numbers (non-SAM, non Rage mode) were more than 5% off any third-party benchmarks.
Being full of 14-year-olds doesn't mean the majority are 14, as claimed above.
That was my point.
Clearly we are speaking just about PC players here, because people playing Fortnite on a PlayStation, a Nintendo console or a smartphone could well be totally unaware of PC technologies.
Saying the majority are 14 was obviously an intentional exaggeration, and taking it that literally is a bit too much for me. Besides that, even the average PC gamer knows very little about hardware or software features. Remember, the average PC gamer plays games on a laptop. The biggest group after that uses pre-built desktops. Custom, self-built or built-to-order desktops are a distant third. And even among that group, I would be surprised if the majority knew anything detailed about what DLSS or any comparable feature is - most gamers spend more time playing games than reading about this kind of stuff.
 
I wonder when/if 30, 36 and 40 CU cards and other models will be released?
 
I wonder when/if 30, 36 and 40 CU cards and other models will be released?
I'm expecting at least a mention of further RDNA 2 GPUs at CES - IIRC there's a Lisa Su keynote on the books there. Anything earlier than that would be rather weird, given just how close these announced GPUs are launching to the holiday season ending.
 
I'm pretty sure most people playing FN are kids.

I play with my kid and his friends all the time, and I can see by the nicknames that most of our adversaries are very young, too. The statistics are skewed, because my kid and many of his friends play on their parents' accounts.

Anyway, the discussion about RTX in FN is pointless: the game is very playable on old hardware, and anyone playing it remotely competitively puts shadows and post effects on "LOW", otherwise they cannot aim accurately while in the thick of the battle.

Oh RT on Fortnite is pointless for me too, since I'm not planning to play that game on my PC.
But it still is a popular game, so it is not pointless for others.

I wonder when/if 30, 36 and 40 CU cards and other models will be released?
Most probably early next year.
 
When is the review expected?
 
AMD... they have to get their drivers working properly. I had four Polaris cards (2 x 480, 2 x 580), and one major issue was always software problems. Game-breaking bugs, crashes, black screens, you name it; all of it went away by switching to the green team. Again, AMD gfx gear always looks "great!!!!" on paper, with a crapload of software issues in tow though. They have to fix their drivers, and considering how their Adrenalin suite looks and behaves these days, they are going the wrong way.


...
..
.
 
AMD... they have to get their drivers working properly. I had four Polaris cards (2 x 480, 2 x 580), and one major issue was always software problems. Game-breaking bugs, crashes, black screens, you name it; all of it went away by switching to the green team. Again, AMD gfx gear always looks "great!!!!" on paper, with a crapload of software issues in tow though. They have to fix their drivers, and considering how their Adrenalin suite looks and behaves these days, they are going the wrong way.


...
..
.

I have an RX 480 and never had any issues. Then again, I usually don't update the driver, and have left it alone for over a year. I kind of agree on the Adrenalin suite; I preferred the older version. Some of the older versions had the option to install only the driver and not the suite. Many times I did just that to avoid all those extras I never used or needed. They need to do that with Adrenalin: have a custom install with a good selection to pick from, such as driver-only. The only issue is the occasional Wattman crash, which doesn't really seem to affect anything. I also found Windows 1709 and 1809 to be pretty good stability-wise, but they are EOL now.
 