
MSI Radeon RX 6950 XT Gaming X Trio

Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
and if that does not break the forum rules nothing will. What a bang out of order post. At some point they have to stop flaming/trolling I would guess. Just quoting this reply, not responding.
Literally nobody here is flaming, nobody (besides possibly you, though I prefer not to presume malicious intent on the part of people I disagree with) is trolling. My concern is genuine, and not meant as derogatory in any way. I simply see you repeating a destructive pattern that I genuinely hope you can get out of. If that is out of order to you... well, getting things across in writing online can be a challenge. I can only try to make you see things from another perspective. But nobody here is out to get you, and nobody here is interested in inserting/upholding any kind of bias in the benchmark suite.
 
Joined
Jan 11, 2005
Messages
1,491 (0.20/day)
Location
66 feet from the ground
System Name 2nd AMD puppy
Processor FX-8350 vishera
Motherboard Gigabyte GA-970A-UD3
Cooling Cooler Master Hyper TX2
Memory 16 Gb DDR3:8GB Kingston HyperX Beast + 8Gb G.Skill Sniper(by courtesy of tabascosauz &TPU)
Video Card(s) Sapphire RX 580 Nitro+;1450/2000 Mhz
Storage SSD :840 pro 128 Gb;Iridium pro 240Gb ; HDD 2xWD-1Tb
Display(s) Benq XL2730Z 144 Hz freesync
Case NZXT 820 PHANTOM
Audio Device(s) Audigy SE with Logitech Z-5500
Power Supply Riotoro Enigma G2 850W
Mouse Razer copperhead / Gamdias zeus (by courtesy of sneekypeet & TPU)
Keyboard MS Sidewinder x4
Software win10 64bit ltsc
Benchmark Scores irrelevant for me
@ zx128k

sorry, but I must ask; do you have an RT ray fetish?
 
Joined
Feb 20, 2022
Messages
175 (0.17/day)
System Name Custom Watercooled
Processor 10900k 5.1GHz SSE 5.0GHz AVX
Motherboard Asus Maximus XIII hero z590
Cooling XSPC Raystorm Pro, XSPC D5 Vario, EK Water Blocks EK-CoolStream XE 360 (Triple Fan) Radiator
Memory Team Group 8Pack RIPPED Edition TDPPD416G3600HC14CDC01 @ DDR4-4000 CL15 Dual Rank 4x8GB (32GB)
Video Card(s) KFA2 GeForce RTX 3080 Ti SG 1-Click OC 12GB LHR GDDR6X PCI-Express Graphics Card
Storage WD Blue SN550 1TB NVME M.2 2280 PCIe Gen3 Solid State Drive (WDS100T2B0C)
Display(s) LG 3D TV 32LW450U-ZB and Samsung U28D590
Case Full Tower Case
Audio Device(s) ROG SupremeFX 7.1 Surround Sound High Definition Audio CODEC ALC4082, ESS SABRE9018Q2C DAC/AMP
Power Supply Corsair AX1000 Titanium 80 Plus Titanium Power Supply
Mouse Logitech G502SE
Keyboard Logitech Y-BP62a
Software Windows 11 Pro
Benchmark Scores https://valid.x86.fr/2rdbdl https://www.3dmark.com/spy/27927340 https://ibb.co/YjQFw5t
@ zx128k

sorry, but I must ask; do you have an RT ray fetish?
His system "2nd AMD puppy"
FX-8350 vishera
Sapphire RX 580 Nitro+;1450/2000 Mhz

Come now, there is no bait; there is just your own personal prejudice. DX12 has been updated with RT, ML and faster storage. Raster benchmarks are also DX12. There is no valid reason to split the raster and ray tracing benchmarks, given that 11 of the 25 games in the sample are ray traced at maximum settings.

Ray Tracing games and raster games should be treated as DX12 games, and the result as DX12 performance. Everyone gets why you are up in arms against this happening. AMD ignored RT performance and has slowly adopted upscaling because of necessity. These features are core to DX12 which supports Ray Tracing, Machine Learning and faster storage. This is the future of graphics; raster games are legacy and are being replaced with ray tracing, as shown by the fact that DX12 has moved to Ray Tracing, machine learning and faster storage.

One manufacturer has good performance for the past and the other better performance for now and the future. Ray Tracing is here; it's now mainstream. In the past we never treated new technology like this as some special case. It made sense when the 5700 XT had no support for RT, but now there is no justification.

The only reason RT is a special subsection now is to protect AMD from the perceived weakness it has in RT performance. That is just bias, which should be treated for what it is: with contempt.

The playing down of some DX12 features just to make one manufacturer look stronger in performance needs to stop. There should be only DX12 games and their features. If one GPU tanks more in performance because Ray Tracing is enabled, then really they should have spent more time designing a better product. It does not mean benchmarks need to be split into sections, raster and then Ray Tracing, to show their GPU in the best light. The benchmark should state the obvious: they have too much raster performance (which benefits older games) and far too little ray tracing performance (current and future games suffer). Thus, in modern AAA games, which sell based on their graphics, performance will be subpar.

Given that the 6900 XT is basically a 3070 Ti in Ray Tracing performance in heavy ray tracing games, we all get what this will show for the AMD 6000 series. This is the central reason the 6000 series is not selling and Nvidia is selling out. Reviews need to change; they don't match the market. Ray Tracing performance is more important, DLSS is a killer feature, and this let Nvidia take most of the gamer market.
 
Joined
Jun 2, 2017
Messages
9,380 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
That is the most propaganda-influenced version of the history of the GPU Wars. Do you even know why ATI/AMD survived the last GPU War? Instead of watching YouTube, do yourself a favor and expand your sources of knowledge. It is actually sad, because you are so convinced of your own belief instead of the truth.
 
Joined
Feb 20, 2022
Messages
175 (0.17/day)
System Name Custom Watercooled
Processor 10900k 5.1GHz SSE 5.0GHz AVX
Motherboard Asus Maximus XIII hero z590
Cooling XSPC Raystorm Pro, XSPC D5 Vario, EK Water Blocks EK-CoolStream XE 360 (Triple Fan) Radiator
Memory Team Group 8Pack RIPPED Edition TDPPD416G3600HC14CDC01 @ DDR4-4000 CL15 Dual Rank 4x8GB (32GB)
Video Card(s) KFA2 GeForce RTX 3080 Ti SG 1-Click OC 12GB LHR GDDR6X PCI-Express Graphics Card
Storage WD Blue SN550 1TB NVME M.2 2280 PCIe Gen3 Solid State Drive (WDS100T2B0C)
Display(s) LG 3D TV 32LW450U-ZB and Samsung U28D590
Case Full Tower Case
Audio Device(s) ROG SupremeFX 7.1 Surround Sound High Definition Audio CODEC ALC4082, ESS SABRE9018Q2C DAC/AMP
Power Supply Corsair AX1000 Titanium 80 Plus Titanium Power Supply
Mouse Logitech G502SE
Keyboard Logitech Y-BP62a
Software Windows 11 Pro
Benchmark Scores https://valid.x86.fr/2rdbdl https://www.3dmark.com/spy/27927340 https://ibb.co/YjQFw5t
That is the most propaganda-influenced version of the history of the GPU Wars. Do you even know why ATI/AMD survived the last GPU War? Instead of watching YouTube, do yourself a favor and expand your sources of knowledge. It is actually sad, because you are so convinced of your own belief instead of the truth.
source
As a publication that reviews GPUs, we have some recourse, as well. One of our options is to cap the tessellation factor on Radeon cards in future testing. Another is simply to skip Crysis 2 and focus on testing other games. Yet another is to exclude Crysis 2 results from our overall calculation of performance for our value scatter plots, as we’ve done with HAWX 2 in the past.
  • l33t-g4m3r
  • 11 years ago
Trolls trolling trolls. If you don’t like my post editing, don’t argue with me. I’m just clarifying my views and fixing grammar. Neither of you have anything good to say anyway, since your points depend on shaky evidence, or don’t matter since Ageia is defunct. Once you look at the whole picture, the arguments fall through. All I gotta do is point out the hole in the boat, and voila, it sinks.
Physx is a joke, and a detriment to the community, so I don’t get why you are bothering to defend it. The cat’s been out of the bag for a while too, so arguing about it now is like beating a dead horse.
Whether or not Physx had potential doesn’t matter as long as it’s being manipulated to sell video cards. You should support open alternatives instead.

  • ElMoIsEviL
  • 11 years ago

Your post was incredibly silly to read fyi.
People want a “better” implementation of DX11 and thus Tessellation. People do not want the usage of such features used to unreasonable/irrational degrees just to get a one up on the competition as is the case here.

Give it a rest; I can go back over 10 years' worth of the crap you guys pull. You are stating the same arguments that have been used for decades to protect Radeon cards. Thus a Tessellation Performance subsection was born in reviews, PhysX was killed off, and innovation died.

Catalyst Hotfix 11.1a: AMD admits defeat in tessellation

It’s official: AMD is going for the driver hack to alleviate their tessellation deficiencies. Basically what they’ve said above is that the driver can artificially limit the tessellation factors. This classifies as a driver cheat for the simple reason that it breaks the DirectX 11 and OpenGL 4.0 APIs. Namely, the application explicitly passes tessellation factors to the tessellation units via the shaders. What AMD is doing here is short-changing the application. The application shader will say: “I want tessellation factor 15”, and the driver says “Sure… but I’m just going to use tessellation factor 5”.

The hack still remains in AMD's drivers.

source

Tessellation Mode

Tessellation Mode enhances the detail of objects by adjusting the number of polygons used for rendering.
Limiting the level of Tessellation can provide higher FPS in games that use high levels of tessellation.
In the example below, the image on the left has x64 Tessellation applied, increasing the detail of the bricks. The image on the right has no Tessellation applied and has less detail.
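To make the mechanism described above concrete, here is a minimal C++ sketch of what a driver-side tessellation override amounts to conceptually. The type and function names are hypothetical and for illustration only; this is not actual driver, D3D11, or OpenGL code.

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical types for illustration; a real driver does this inside the
// shader/command pipeline, invisibly to the application.
struct TessFactors {
    float edge[4];    // per-edge factors the hull shader asked for
    float inside[2];  // interior factors
};

// The application requests its factors; the override silently clamps them.
TessFactors applyDriverOverride(TessFactors requested, float cap)
{
    for (float &f : requested.edge)   f = std::min(f, cap);
    for (float &f : requested.inside) f = std::min(f, cap);
    return requested;
}

int main()
{
    TessFactors app = {{15, 15, 15, 15}, {15, 15}};   // shader asks for factor 15
    TessFactors hw  = applyDriverOverride(app, 5.0f); // driver caps it at 5
    std::printf("requested %.0f, hardware tessellates at %.0f\n",
                app.edge[0], hw.edge[0]);
}
```

The "Tessellation Mode" setting quoted above exposes essentially this cap to the user; the controversy was about the driver applying such a cap behind the application's back.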
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
His system "2nd AMD puppy"
FX-8350 vishera
Sapphire RX 580 Nitro+;1450/2000 Mhz

Come now, there is no bait; there is just your own personal prejudice. DX12 has been updated with RT, ML and faster storage. Raster benchmarks are also DX12. There is no valid reason to split the raster and ray tracing benchmarks, given that 11 of the 25 games in the sample are ray traced at maximum settings.
Are you aware that DX12 and DX12 Ultimate are two separate and different standards? DX12 launched in July 2015 (announced March 2014); DX12U launched in November 2020 (announced March 2020). Ultimate is treated as an extension of 12, but they are separate standards, and many, many GPUs have full DX12 support but not DX12U support. Making a "DX12" benchmark suite that includes DX12U-exclusive features would thus be misleading and unrepresentative of overall DX12 performance. You'll also do well to note that there are both DX11 and Vulkan games in the rasterization suite. It is not, nor should it be, a DX12 suite, as that would make it unrepresentative of gaming overall. Now, you could always argue that DX11 and Vulkan benchmarks should be separated out, which would to some extent be a valid argument - but by that logic, DX12U games would also need to be separated, as they are materially different from non-Ultimate DX12 titles. Different APIs, different featuresets, different hardware requirements.

Also, for the record, at least one of the RT-enabled games ... uses Vulkan. Doom Eternal, that is. That definitely makes it a poor fit for a DX12 or DX12U test suite ....
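To make the DX12 / DX12 Ultimate distinction concrete, here is a minimal C++ sketch of how an application can create a plain DX12 device and then separately query the DX12 Ultimate features (DXR 1.1, variable-rate shading tier 2, mesh shaders, sampler feedback). It assumes a recent Windows SDK and omits error handling; plenty of GPUs pass the first step and fail the second.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Creating a device at feature level 12_0 makes this a "DX12 GPU";
    // it says nothing about Ultimate support.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device))))
        return 1;

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 o5 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 o6 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 o7 = {};
    // Return values ignored for brevity; zero-initialized structs read as "unsupported".
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &o5, sizeof(o5));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &o6, sizeof(o6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &o7, sizeof(o7));

    bool ultimate =
        o5.RaytracingTier          >= D3D12_RAYTRACING_TIER_1_1 &&
        o6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2 &&
        o7.MeshShaderTier          >= D3D12_MESH_SHADER_TIER_1 &&
        o7.SamplerFeedbackTier     >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;

    std::printf("DX12 device created; DX12 Ultimate support: %s\n", ultimate ? "yes" : "no");
}
```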
Everyone gets why you are up in arms against this happening. AMD ignored RT performance and has slowly adopted upscaling because of necessity.
Like ... what? "Ignored" RT performance? They delivered a reasonably competitive solution one generation after Nvidia first launched the feature at all. That is a very fast turnaround for adding support for a feature that up until then didn't exist at all. It would literally not have been possible for them to respond more quickly than they did - it would necessarily need to arrive with their next generation after Turing, which was RDNA2. Now, Ampere's RT is significantly faster - again, nobody here is denying that in any way - but that's not what we're discussing here. Your description of events here just demonstrates a massive and plain-faced irrational level of bias.
These features are core to DX12 which supports Ray Tracing, Machine Learning and faster storage.
What you are describing here is DX12U, not DX12. Also, "faster storage"? You mean DirectStorage, right? That is not "faster storage"; it is a technology for allowing direct data transfer from an SSD to VRAM without a round trip through the CPU and system memory, and, when it gets implemented, also in-GPU decompression of assets. "Faster storage" is not a fitting description of DirectStorage.
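As a rough illustration of the data path DirectStorage changes, here is a conceptual C++ sketch. The names are hypothetical placeholders, not the real DirectStorage API (which lives in dstorage.h); the point is that the win comes from removing the CPU-side staging copy and CPU decompression from the load path, not from making the SSD itself faster.

```cpp
#include <cstdio>

// Hypothetical placeholder types and functions, for illustration only.
struct File {};
struct CpuBuffer {};
struct GpuBuffer {};

static CpuBuffer readToSystemRam(File&)                     { std::puts("SSD -> system RAM (CPU copy)"); return {}; }
static CpuBuffer cpuDecompress(const CpuBuffer& b)          { std::puts("decompress on CPU");            return b;  }
static void      uploadToVram(GpuBuffer&, const CpuBuffer&) { std::puts("system RAM -> VRAM upload"); }
static void      dmaReadToVram(GpuBuffer&, File&)           { std::puts("SSD -> VRAM (no CPU-side copy)"); }
static void      gpuDecompress(GpuBuffer&)                  { std::puts("decompress on GPU"); }

// Traditional path: every byte is staged in system RAM and decompressed by the CPU.
static void loadAssetClassic(File& f, GpuBuffer& dst)
{
    CpuBuffer staged = readToSystemRam(f);
    CpuBuffer raw    = cpuDecompress(staged);
    uploadToVram(dst, raw);
}

// DirectStorage-style path: the request goes to VRAM and the GPU decompresses,
// so the CPU mostly just issues the request. The SSD itself is no faster.
static void loadAssetDirect(File& f, GpuBuffer& dst)
{
    dmaReadToVram(dst, f);
    gpuDecompress(dst);
}

int main()
{
    File f; GpuBuffer g;
    loadAssetClassic(f, g);
    loadAssetDirect(f, g);
}
```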
This is the future of graphics; raster games are legacy and are being replaced with ray tracing.
The future? I thought you were saying it was currently the norm, that rasterization was dead? Now you've got me all confused.
One manufacturer has good performance for the past and the other better performance for now and the future. Ray Tracing is here; it's now mainstream. In the past we never treated new technology like this as some special case. It made sense when the 5700 XT had no support for RT, but now there is no justification.
Yes. Yes we did. Dedicated testing for new features that stand out distinctly from others has been the norm across essentially all good benchmarking sites. This has been true for things like PhysX. This has been true for bespoke smaller features like TressFX and HairWorks (though mostly they are just explicitly disabled). This is true for essentially every comparable technology across pretty much every respectable review site out there.

Has any reviewer, ever, two years after the launch of a new API, dedicated the entirety of their test suite to that API?

There may well be a tipping point where RT-enabled games should be blended into the overall test suite, but that point is not now. IMO, that point would be when RT benchmarks are relevant for all GPUs across all product stacks - i.e. where including them wouldn't break overall performance charts because half the GPUs on the chart can't even run half the benchmarks.
The only reason RT is a special subsection now is to protect AMD from the perceived weakness it has in RT performance. That is just bias, which should be treated for what it is: with contempt.
Seriously, your conspiratorial logic here is outright disturbing. There are perfectly reasonable arguments for separating these two out, as have been presented to you at length over these past four pages of discussion. You are refusing to even engage in any kind of discussion, just repeating hollow non-arguments centered around the contradictory pairing of "RT is the norm now"/"RT is the future of graphics". You're welcome to disagree with people's judgements, but for that to be taken seriously you need to actually present reasonable, on-topic, impersonal arguments, and not start accusing everyone of bias and conspiracy right out of the gate. All you're achieving by that is antagonizing everyone - even the people inclined to agree with you on some or all points - and making yourself look entirely irrational and unreasonable.
The playing down of some DX12 features just to make one manufacturer look stronger in performance needs to stop.
- These are not DX12 features, they are DX12 Ultimate features.
- This does in no way make one manufacturer look stronger, as the "downplayed" features are tested and the results of that testing are included in the conclusion (=overall summary) of the review.
There should be only DX12 games and their features.
So ... you want a test suite that is fundamentally unrepresentative of current game development? Doesn't that seem ... biased to you? Because while DX11 adoption is waning, and Vulkan is relatively niche, both are still relevant, both for (some) new games as well as legacy titles. What you are asking for is a test suite that inherently prioritizes a specific subset of features found in games, because you think those features are more important. The thing about that: your opinions are not universal, and the reviews are not written for you personally. They are meant to paint a broadly representative picture of these products. Limiting the test suite to only DX12 would make the test suite much worse at what it is supposed to do.
If one GPU tanks more in performance because Ray Tracing is enabled, then really they should have spent more time designing a better product. It does not mean benchmarks need to be split into sections, raster and then Ray Tracing, to show their GPU in the best light. The benchmark should state the obvious: they have too much raster performance (which benefits older games) and far too little ray tracing performance (current and future games suffer). Thus, in modern AAA games, which sell based on their graphics, performance will be subpar.
The issue with this is that in quite a few titles, RT makes very little difference in terms of graphics. This is entirely dependent on the implementation - and as with all new tools, learning to use them well takes time, while you might be able to do a comparable job with the tools you're familiar with despite them being much older and technically less capable. So in many titles, baked lighting and reflections can look very good, while a low quality RT implementation can look worse. That obviously isn't the reality in even a majority of games, but it is a relevant issue. Far Cry 6, for example, has been near universally criticized for its RT implementation being ... well, essentially unnoticeable outside of the performance drop.
Given that the 6900 XT is basically a 3070 Ti in Ray Tracing performance in heavy ray tracing games, we all get what this will show for the AMD 6000 series. This is the central reason the 6000 series is not selling and Nvidia is selling out. Reviews need to change; they don't match the market. Ray Tracing performance is more important, DLSS is a killer feature, and this let Nvidia take most of the gamer market.
You seem to have a rather odd view of both the relative marketshare and mindshare of these companies, as well as recent sales. Radeon GPUs have been just as sold out as Geforce GPUs across every price bracket except for ultra-premium until the past couple of months. It's true that Radeon supplies improved before Geforce supplies did, which is likely due to the same reason that flagship RTX cards have persistently been selling out: they're much better at cryptomining.

Other than that: Nvidia outselling AMD ~4:1 is a continuation of the status quo. There is nothing new about this. On top of this, AMD has been far more supply constrained than Nvidia, due to there being less pressure on Samsung's 8nm node than TSMC 7nm, and AMD on top of this needing to split wafer supplies between CPUs, APUs, GPUs, and console chips. AMD has, put simply, not had the wafer capacity to deliver very high volumes of GPUs since the launch of RDNA2 - which also obviously plays into them being sold out. Which makes it all the more understandable if Nvidia is gaining market share. This is not due to the market prioritizing RT performance above all else, it is down to cryptomining + Nvidia's massive mindshare advantage + AMD supply constraints + Nvidia's economics and their deals with OEMs (which goes some way towards explaining why it's so hard for AMD to get a real foothold in the laptop space, for example, despite delivering better efficiency than Ampere).


Nobody here is denying that if RT is what you're looking for, Nvidia delivers the best performance. Heck, you don't even need benchmarks within this generation to tell that - it's a completely established fact, beyond any doubt, and no tweaking from AMD's side will change that. If anything, this means highlighting RT benchmarks is less important (until the next generation from both sides comes around, as that'll make them interesting again): that side of the picture is fixed, it isn't changing, it is well established and not subject to debate. If RT is what you're looking for, Nvidia GPUs are clearly superior. If you don't care about RT - which is the reality for many, many people, especially those in the market for more affordable GPUs that don't really handle RT well today (say, the RTX 3050 or RX 6600, both of which deliver passable performance at 1080p in very lightweight RT titles but unplayable performance in heavier ones) - then separating RT performance out from the general performance assessment lets you better judge how the GPU will perform in the games you'll be playing, at the settings you'll actually be using.

There is also a discussion to be had about whether high end/flagship cards should have different test routines than midrange and low end ones. I tend to think so (and, for example, TechSpot/Hardware Unboxed generally does this), but it also results in (much) more work for the reviewer, and is thus less feasible for many sites. Tradeoffs always need to be made. But I also acknowledge that testing on a level playing field is valuable in and of itself - even if it's not the choice I would make myself in an unconstrained situation. Luckily there are still quite a few GPU reviewers out there, so we can check multiple to ensure that the perspectives of one aren't skewing our impressions.

You, on the other hand, are arguing that your specific perspective - which on top of being yours, rather than universal, has quite a few logical flaws, inconsistencies, and seems to be based on a factually untrue understanding of current reality - should be the only one presented. I see no reason why such an argument should be accepted, or even let stand uncontested, as it inherently makes testing less valuable for everyone else. This isn't bias. It's broad representativity, including a broad featureset to ensure that as many scenarios as possible are tested. You are explicitly arguing for more limited and myopic testing. Remember: you're always allowed to read a review and choose which parts of the results are the most important to you. That's how reviews are supposed to work. You do not, on the other hand, have the right to dictate that only what you see as important should be tested, and nothing else. If you want that, go start your own review site.
 
Joined
Feb 20, 2022
Messages
175 (0.17/day)
System Name Custom Watercooled
Processor 10900k 5.1GHz SSE 5.0GHz AVX
Motherboard Asus Maximus XIII hero z590
Cooling XSPC Raystorm Pro, XSPC D5 Vario, EK Water Blocks EK-CoolStream XE 360 (Triple Fan) Radiator
Memory Team Group 8Pack RIPPED Edition TDPPD416G3600HC14CDC01 @ DDR4-4000 CL15 Dual Rank 4x8GB (32GB)
Video Card(s) KFA2 GeForce RTX 3080 Ti SG 1-Click OC 12GB LHR GDDR6X PCI-Express Graphics Card
Storage WD Blue SN550 1TB NVME M.2 2280 PCIe Gen3 Solid State Drive (WDS100T2B0C)
Display(s) LG 3D TV 32LW450U-ZB and Samsung U28D590
Case Full Tower Case
Audio Device(s) ROG SupremeFX 7.1 Surround Sound High Definition Audio CODEC ALC4082, ESS SABRE9018Q2C DAC/AMP
Power Supply Corsair AX1000 Titanium 80 Plus Titanium Power Supply
Mouse Logitech G502SE
Keyboard Logitech Y-BP62a
Software Windows 11 Pro
Benchmark Scores https://valid.x86.fr/2rdbdl https://www.3dmark.com/spy/27927340 https://ibb.co/YjQFw5t
Another wall of nonsense. UserBenchmark states it well. Their RX 6950 XT is 5th.
Whilst the drought in the GPU market continues, street prices for AMD cards are around 50% lower than comparable (based on headline average fps figures) Nvidia cards. Many experienced users simply have no interest in buying AMD cards, regardless of price. AMD’s Neanderthal marketing tactics seem to have come back to haunt them. Their brazen domination of social media platforms including youtube and reddit resulted in millions of users purchasing sub standard products. Be wary of sponsored reviews with cherry picked games that showcase the wins and ignore the losses. Experienced gamers know all too well that headline average fps are worthless when they are accompanied with stutters, random crashes, excessive noise and a limited feature set.
Their RX 6900 XT review.
The RX 6900-XT assumes the flagship position in AMD’s latest RX 6000 series of GPUs which deliver a huge generational jump in performance. The $1,000 USD 6900-XT offers a small improvement (11% more compute units) over the already launched $650 USD RX 6800-XT. AMD have upgraded the single fan cooler to a more efficient triple fan solution, perhaps indicating a shift in focus from benchmark busting headlines to user experience. Following the widespread issues that users faced with the 5000 and Vega series, we are cautiously optimistic that AMD have taken steps to ensure driver and hardware stability. Given the value for money now offered by both Nvidia and AMD, with the 3070 and 6800-XT, it is difficult to recommend any thousand dollar graphics cards to most gamers. 16GB of VRAM is a key feature of the 6900-XT. At higher resolutions and detail settings, performance can bottleneck without sufficient GPU memory. AMD's marketers often cherry pick obscure games with high res/settings, the details of which are rarely disclosed, then compare the results with cards that have less memory. In that scenario, the cards with less memory look weaker than they would at 1080p. The 1080p results are sometimes omitted, or worse, partially omitted and frame drops are conveniently ignored. Most users will see little benefit in gaming at high resolutions. Without drastic price cuts (MSRP $1000 USD) and miraculous marketing via countless promo videos and sponsored reviews, the 6900 XT will struggle to compete, partly because it lacks RTX+DLSS which is required for the best gaming experience in class leading titles such as Cyberpunk 2077. Users should be wary of AMD’s army of social media accounts, they aim to dupe shoppers any way they can.

Why are AMD cards being outsold over 14:1 by Nvidia, and why is the Steam hardware survey just Nvidia 3000 series? For the very thing that most reviews ignore. Gamers don't care about the raster performance in Cyberpunk 2077, Metro Exodus (and, more importantly, the free Enhanced Edition upgrade) or any other DX12 (DXR-using) ray tracing-supporting games. They care about Ray Tracing performance and DLSS. Thus, reviewers need to change their reviews to meet the expectations of gamers. In the past maximum settings were used in benchmarks; it's time to return to that and not omit ray tracing or DLSS. We should not respect limiting the DX12 version so that results look better.

This happened with AMD's poor Tessellation performance and lack of a PhysX feature, and it is now happening because of AMD's poor Ray Tracing performance. AMD's complete lack of any tensor or XMX cores should not be argued away with sophistry but explained as a lack of innovation. No special sections for DXR games. They are DX12 games just like any other DX12 games. No cherry-picking of games that perform best on AMD hardware.

Just maximum settings, and AMD forum trolls can learn to live with it.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Another wall of nonsense.
You're welcome to present some arguments as to why it is nonsense, if you find it so disagreeable.
UserBenchmark states it well. Their RX 6950 XT is 5th.

Their RX 6900 XT review.
The fact that you're quoting UserBenchmark - the only benchmarking suite that explicitly states that it is fundamentally biased against AMD in both CPU and GPU reviews - isn't helping you.
Why are AMD cards being outsold over 14:1 by Nvidia, and why is the Steam hardware survey just Nvidia 3000 series?
a) Steam Hardware Surveys aren't representative of actual usage. An example, though obviously anecdotal: I haven't gotten a Hardware Survey prompt a single time on my main PC. On the other hand, I have gotten one on every other PC I own or have access to - my HTPC, my laptop, and my travel desktop. That is in spite of the main PC running Steam probably 30x as much as the other PCs combined. Steam crucially doesn't claim that their survey is representative of marketshares overall.

This obviously doesn't mean Nvidia isn't outselling AMD - it would be shocking if they weren't.
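To illustrate why an opt-in prompt like that can skew reported shares even with a huge sample, here is a tiny C++ sketch with made-up numbers: if machines of one type are simply prompted more often than the other, the survey share drifts away from the installed base.

```cpp
#include <cstdio>

int main()
{
    // Hypothetical installed base: 70% brand A, 30% brand B.
    double installedA = 0.70, installedB = 0.30;
    // Hypothetical prompt rates: brand B machines get the survey twice as often.
    double promptA = 0.10, promptB = 0.20;

    double sampledA = installedA * promptA;   // share of prompts landing on brand A
    double sampledB = installedB * promptB;
    double surveyShareA = sampledA / (sampledA + sampledB);

    std::printf("installed share A = %.0f%%, survey share A = %.0f%%\n",
                installedA * 100.0, surveyShareA * 100.0);  // 70% vs ~54%
}
```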
For the very thing that most reviews ignore. Gamers don't care about the raster performance in Cyberpunk 2077, Metro Exodus (and, more importantly, the free Enhanced Edition upgrade) or any other DX12 (DXR-using) ray tracing-supporting games. They care about Ray Tracing performance and DLSS. Thus, reviewers need to change their reviews to meet the expectations of gamers. In the past maximum settings were used in benchmarks; it's time to return to that and not omit ray tracing or DLSS. We should not respect limiting the DX12 version so that results look better.
Lol, no. Gamers care about "can I play this game, and on what settings and with what FPS". RT might be an aspirational feature, but for the vast majority of gamers it is entirely out of reach currently, and will continue to be so for quite a while. Most gamers don't even have an RT-capable GPU!
This happened with AMD's poor Tessellation performance and lack of a PhysX feature, and it is now happening because of AMD's poor Ray Tracing performance.
AMD's tessellation performance was always quite poor; nobody is denying that. And their driver workaround is ... well, kinda crap. But ... who cares at this point? This is not comparable. AMD still supports RT to the exact same levels as Nvidia, just with lower performance. There is no feature difference, just a performance difference, and not one that seems likely to result in some kind of driver or software trickery to make it appear better than it is.
AMD's complete lack of any tensor or XMX cores should not be argued away with sophistry but explained as a lack of innovation.
How so? How are tensor cores improving gaming performance? Tensor cores are mainly in RTX GPUs because those same chips are sold as the RTX A series of pro GPUs, where those cores are used for neural network workloads. They are used a bit in the RT denoising process, which AMD does in shaders instead (which might go some way towards explaining their performance deficit). Still, I don't see the problem. It is quite natural and logical that the company that is literally 1/10th the size of the other has less money for R&D towards new and exotic features. That is expected, and only a downside if those features have major utility. And, as we have seen with RDNA2, they are quite capable of delivering working solutions in short spans of time if necessary.
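For context on what a tensor core actually accelerates, here is a scalar C++ sketch of the fused matrix multiply-accumulate (D = A x B + C on a small tile) that such units execute as a single hardware operation; illustrative only, not CUDA or any vendor API.

```cpp
#include <array>
#include <cstdio>

using Mat4 = std::array<std::array<float, 4>, 4>;

// The same math a matrix unit does in one go, written as plain scalar
// multiply-adds of the kind ordinary shader ALUs would have to execute.
Mat4 mma(const Mat4& A, const Mat4& B, const Mat4& C)
{
    Mat4 D = C;                                  // start from the accumulator
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k)
                D[i][j] += A[i][k] * B[k][j];    // 64 multiply-adds per 4x4 tile
    return D;
}

int main()
{
    Mat4 A{}, B{}, C{};
    for (int i = 0; i < 4; ++i) { A[i][i] = 1.0f; B[i][i] = 2.0f; C[i][i] = 0.5f; }
    Mat4 D = mma(A, B, C);
    std::printf("D[0][0] = %.1f\n", D[0][0]);    // 1*2 + 0.5 = 2.5
}
```

Upscalers and denoisers of the kind discussed above are largely long chains of exactly this operation, which is why dedicated matrix units help there and why doing the same work on shader ALUs costs shader time instead.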
No special sections for DXR games. They are DX12 games just like any other DX12 games.
No. DX12 and DX12U are not the same, they are different feature levels that are supported differently by different hardware. Please stop trying to make this into something that it isn't.
No cherry-picking of games that perform best on AMD hardware.
... yet you are arguing for cherry-picking games that perform the best on Nvidia hardware, and that is somehow less problematic? Uhhh... okay then. To be clear: I'm not even talking about RT here. You've argued for a DX12U-only test suite - no Vulkan, no DX11, nothing. That is explicitly arguing for a test suite that cherry picks examples where Nvidia will perform better.
Just maximum settings, and AMD forum trolls can learn to live with it.
That is literally what we have: maximum settings, plus maximum settings with RT. Both are already there. What more do you want?

It's quite telling that you keep insisting that everyone disagreeing with you is oh-so-biased, and clearly trolling, yet you, the one arguing for an inherently limited and biased test suite, one that is explicitly selected to not be broadly representative but specifically tuned towards specific features where Nvidia is clearly ahead (which, again, nobody is denying, including these reviews), cannot be biased at all. Yes, that makes perfect sense. Sure. Have fun with that one.
 
Joined
Feb 20, 2022
Messages
175 (0.17/day)
System Name Custom Watercooled
Processor 10900k 5.1GHz SSE 5.0GHz AVX
Motherboard Asus Maximus XIII hero z590
Cooling XSPC Raystorm Pro, XSPC D5 Vario, EK Water Blocks EK-CoolStream XE 360 (Triple Fan) Radiator
Memory Team Group 8Pack RIPPED Edition TDPPD416G3600HC14CDC01 @ DDR4-4000 CL15 Dual Rank 4x8GB (32GB)
Video Card(s) KFA2 GeForce RTX 3080 Ti SG 1-Click OC 12GB LHR GDDR6X PCI-Express Graphics Card
Storage WD Blue SN550 1TB NVME M.2 2280 PCIe Gen3 Solid State Drive (WDS100T2B0C)
Display(s) LG 3D TV 32LW450U-ZB and Samsung U28D590
Case Full Tower Case
Audio Device(s) ROG SupremeFX 7.1 Surround Sound High Definition Audio CODEC ALC4082, ESS SABRE9018Q2C DAC/AMP
Power Supply Corsair AX1000 Titanium 80 Plus Titanium Power Supply
Mouse Logitech G502SE
Keyboard Logitech Y-BP62a
Software Windows 11 Pro
Benchmark Scores https://valid.x86.fr/2rdbdl https://www.3dmark.com/spy/27927340 https://ibb.co/YjQFw5t
I have already shown, using results from another site, that almost all the games in this review are the ones that perform best in raster on AMD hardware. That would be this post.
DX12 supports RT, ML and faster storage. No reason to have another section for Ray Tracing. No reason at all. If a feature is new and slower, it's 100% what people want to see tested on their GPUs. Not "AMD cards are slower, so let's reduce the RT settings and have no DLSS." Let's have RT in another section and make sophistic arguments about why it's not important, like, for example, "you can't see the difference," and then focus on raster. Gamers bought their cards for the very settings that are being sidelined. There should be no separate sections for raster and DXR. If the game's maximum setting is DXR, then DXR it is, and to hell with the company that is slower.

I 100% didn't get a 3080 Ti to play the DX12 DXR games in raster. Why, like so many others, did I not get a 6900 XT? Easy: DXR performance. Why is AMD decimated in the Steam hardware survey? Poor DXR performance and no DLSS-like feature. The more you make these arguments against Nvidia, the fewer AMD cards sell. Nvidia completely controls the gamer market at this point. The cards that are selling all have tensor cores. Yet shills are arguing FSR will kill Nvidia's DLSS. The market is Nvidia's at this point; these people are divorced from reality.

It is clear you have no real argument to offer, just insults and sophistry, and thus this is my last post here.
 
Joined
Jun 2, 2017
Messages
9,380 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
If you are so pumped about DXR, why do you own a paltry 3080 Ti rather than a 3090?

I love the comment that Nvidia is running the gaming space, when every game from Xbox and Sony is made for AMD hardware. You are really showing that you must be an Nvidia employee or, dare I say, a fanboy. Do you really believe Nvidia released their drivers to bring some good to the world?
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I have already shown, using results from another site, that almost all the games in this review are the ones that perform best in raster on AMD hardware. That would be this post.
The problem is, those results don't really show that. Those results show that at 1080p and 1440p, nearly all games "favor AMD", with only two notable exceptions.

What does this tell us? That of the two GPUs tested, the AMD one is faster at those resolutions. Which is also what Techspot's own chart tells us - it says right at the top that the 3080 is 3% slower at 1080p, 5% slower at 1440p, and 5% faster at 2160p, in general. There are always outliers on both sides, but if anything, it's the titles where Nvidia has a large advantage that break from the norm. Using these charts to argue that picking games where a 6900 XT is faster than a 3080 - which is, in general, most games - is picking games that favor AMD is fundamentally disingenuous and a misrepresentation of reality. Below 2160p, the 6900 XT is generally faster than the 3080 across any reasonably broad selection of games. What you are arguing for here is selectively choosing games where Nvidia does better.
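To make the cherry-picking point concrete, here is a minimal Python sketch of how a headline "relative performance" figure is usually built from per-game results, and how much it moves when only the games that lean one way are kept. The game names and FPS ratios are made up purely for illustration - they are not data from TPU, Techspot, or any other review.

Code:
# Illustrative sketch only: the game names and ratios below are invented.
from math import prod

# Hypothetical per-game FPS ratios: (6900 XT fps) / (3080 fps) at 1440p.
# Above 1.0 = the AMD card leads, below 1.0 = the Nvidia card leads.
ratios = {
    "Game A": 1.08,
    "Game B": 1.05,
    "Game C": 1.03,
    "Game D": 1.02,
    "Game E": 0.97,
    "Game F": 0.88,  # one Nvidia-leaning outlier
}

def geomean(values):
    # Geometric mean: the usual way per-game ratios are averaged into one number.
    values = list(values)
    return prod(values) ** (1 / len(values))

full_suite = geomean(ratios.values())
nvidia_picks = geomean(ratios[g] for g in ("Game E", "Game F"))

print(f"Full 6-game suite:   {full_suite:.3f}x")    # ~1.00x, roughly a tie overall
print(f"Nvidia-leaning pair: {nvidia_picks:.3f}x")  # ~0.92x, looks like a clear Nvidia lead

The same per-game numbers, summarized two different ways, tell very different stories - which is exactly why picking only the titles where one vendor does well is not a neutral test suite.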
DX12 supports RT, ML and faster storage.
Again: DX12 Ultimate, not DX12.
No reason to have another section for Ray Tracing. No reason at all.
Except that there have been several reasons provided here, and you haven't presented a single compelling argument against those reasons, save for "RT is the norm, rasterization is dead", which is both not an argument and untrue.
If a feature is new and slower, it's 100% what people want to see tested on their GPUs.
100%? I would say 100% want to see games tested on their GPU. Quite broadly. Beyond that, people have all kinds of preferences. Would a significant number want to see RT tested? Absolutely. That's why it is tested! Testing it is useful and informative, hence its inclusion in the test suite.
Not "AMD cards are slower, so let's reduce the RT settings and skip DLSS."
Huh? DLSS is tested on all Nvidia cards. You literally can't benchmark DLSS on an AMD card, so including it would have been rather odd.
Let's have RT in another section and make sophistry arguments about why it's not important, like, for example, "you can't see the difference."
... is that argument intentionally misleading or based on deceptive logic? 'Cause otherwise it isn't sophistry ... And you still haven't presented a single coherent argument as to why your suggested test suite would be better for a GPU review.
Gamers bought their cards for the settings being sidelined. There should be no separate sections for raster and DXR. If the game's maximum setting is DXR, then DXR it is, and to hell with the company that is slower.
Can you show any kind of statistic - say, a representative poll of customer preferences - that supports gamers buying GPUs specifically for RT, above all else? Because other than that, all you have is circumstantial evidence interpreted through your own bias.
I 100% didn't get a 3080 Ti to play the DX12 DXR games in raster. Why, like so many others, did I not get a 6900 XT? Easy: DXR performance.
I have no problem playing DXR games on my 6900 XT, so ... I don't see the issue. It's slower, sure, but works perfectly fine. What you are demonstrating here is a preference. Reviewers have no obligation to cater reviews to your preferences - their job is to be neutral and present as broad and representative an overview as possible. You are arguing for promoting a niche interest above all else and forcing a much narrower focus on reviews. That will directly make them less useful to everyone who isn't a RT-or-nothing diehard like you seem to be.
Why is AMD decimated in the Steam hardware survey? Poor DXR performance and no DLSS-like feature.
No. While Nvidia has gained some marginal share on the Steam hardware survey in recent months, their relative strength is still mostly stable. Nothing has changed significantly since the introduction of RTX. If these two things were of notable importance, we would see a significant change towards Nvidia since their introduction. We haven't, outside of one blip in December 2020 that subsequent surveys have shown to have been erroneous.

Another issue is that the Steam hardware survey only deals in percentages and doesn't say anything about actual numbers, which is a significant issue when we know PC gaming has exploded in recent years. That sales have accelerated massively while the two major actors remain stable in market share clearly demonstrates that nothing significant is changing between them.
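The arithmetic behind this is trivial; a minimal sketch with invented market totals (not real shipment figures) shows why a flat share percentage is entirely compatible with both vendors selling far more cards than before:

Code:
# Illustrative numbers only: invented yearly dGPU totals (millions of units)
# with a constant 80/20 share split between two vendors.
def units(total_millions, share):
    # Absolute units shipped for a vendor at a given market share.
    return total_millions * share

for year, total in [(2018, 40), (2020, 50), (2021, 60)]:
    print(f"{year}: vendor A ships {units(total, 0.8):.0f}M, vendor B ships {units(total, 0.2):.0f}M "
          "(shares unchanged at 80%/20%)")

# Both vendors' absolute sales grow every year even though the percentage split
# never moves - which is why a stable survey share cannot tell you whether RT or
# DLSS is (or isn't) driving extra sales.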

Also, have you heard of FSR?
The more you make these arguments against Nvidia, the fewer AMD cards sell.
Lolwut? Are you actually claiming that our arguments here (which, for the record, aren't against Nvidia, they are in favor of broadly representative reviews) actually affect real-world GPU sales numbers? That is the funniest thing I've heard all day - please explain how that works. Please.
Nvidia completely controls the gamer market at this point.
Nvidia has the same ~80% dGPU marketshare they have had for most of the past decade. Nothing is changing significantly there.
The cards that are selling all have tensor cores.
Because Nvidia has the highest market share, a major mindshare advantage (which they have had for the past decade, if not longer), and those cores in all their midrange and up GPUs. People would need to go out of their way to not buy a GPU with tensor cores. That does not in any way support the claim that people are specifically buying GPUs because of tensor cores.
Yet shills are arguing FSR will kill Nvidia's DLSS.
I thought you said AMD had no "DLSS-like feature"? What is FSR then?
The market is Nvidia at this point; these people are divorced from reality.
The market has been mostly Nvidia for ... well, at least the past decade. Nothing has changed much. You're presenting this as if something has changed radically recently. It hasn't.
It is clear you have no real argument to offer, just insults and sophistry, and thus this is my last post here.
That's, what, your fourth last post? A bit inconsistent there. It's rather funny to see you keep saying we "have no real arguments" while you consistently offer no arguments beyond the hollow "RT is the future/the norm/dominant/whatever", with no follow-up or actual argumentation. I would recommend taking a look in a mirror, and maybe stop assuming that people who disagree with you are all irrational/shills/fanboys/trolls/crazy/conspiring against you - that's a much better and healthier approach to any discussion.
 
Joined
Dec 12, 2012
Messages
777 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
Good thing AMD launched these new cards. Maybe some people will now understand the power consumption of current and future NVIDIA cards and why it has almost nothing to do with architecture efficiency.

You could make the 6950 XT consume 1000 W for another 10% performance gain. Some people would probably buy that.
 
Joined
Feb 20, 2022
Messages
175 (0.17/day)
System Name Custom Watercooled
Processor 10900k 5.1GHz SSE 5.0GHz AVX
Motherboard Asus Maximus XIII hero z590
Cooling XSPC Raystorm Pro, XSPC D5 Vario, EK Water Blocks EK-CoolStream XE 360 (Triple Fan) Radiator
Memory Team Group 8Pack RIPPED Edition TDPPD416G3600HC14CDC01 @ DDR4-4000 CL15 Dual Rank 4x8GB (32GB)
Video Card(s) KFA2 GeForce RTX 3080 Ti SG 1-Click OC 12GB LHR GDDR6X PCI-Express Graphics Card
Storage WD Blue SN550 1TB NVME M.2 2280 PCIe Gen3 Solid State Drive (WDS100T2B0C)
Display(s) LG 3D TV 32LW450U-ZB and Samsung U28D590
Case Full Tower Case
Audio Device(s) ROG SupremeFX 7.1 Surround Sound High Definition Audio CODEC ALC4082, ESS SABRE9018Q2C DAC/AMP
Power Supply Corsair AX1000 Titanium 80 Plus Titanium Power Supply
Mouse Logitech G502SE
Keyboard Logitech Y-BP62a
Software Windows 11 Pro
Benchmark Scores https://valid.x86.fr/2rdbdl https://www.3dmark.com/spy/27927340 https://ibb.co/YjQFw5t
If he follows me home can I keep him?
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Oh no someone is arguing against me and I don't have any arguments to support my view, damn, how can I deflect this??? HELP!!!
FTFY
 
Joined
Feb 20, 2022
Messages
175 (0.17/day)
System Name Custom Watercooled
Processor 10900k 5.1GHz SSE 5.0GHz AVX
Motherboard Asus Maximus XIII hero z590
Cooling XSPC Raystorm Pro, XSPC D5 Vario, EK Water Blocks EK-CoolStream XE 360 (Triple Fan) Radiator
Memory Team Group 8Pack RIPPED Edition TDPPD416G3600HC14CDC01 @ DDR4-4000 CL15 Dual Rank 4x8GB (32GB)
Video Card(s) KFA2 GeForce RTX 3080 Ti SG 1-Click OC 12GB LHR GDDR6X PCI-Express Graphics Card
Storage WD Blue SN550 1TB NVME M.2 2280 PCIe Gen3 Solid State Drive (WDS100T2B0C)
Display(s) LG 3D TV 32LW450U-ZB and Samsung U28D590
Case Full Tower Case
Audio Device(s) ROG SupremeFX 7.1 Surround Sound High Definition Audio CODEC ALC4082, ESS SABRE9018Q2C DAC/AMP
Power Supply Corsair AX1000 Titanium 80 Plus Titanium Power Supply
Mouse Logitech G502SE
Keyboard Logitech Y-BP62a
Software Windows 11 Pro
Benchmark Scores https://valid.x86.fr/2rdbdl https://www.3dmark.com/spy/27927340 https://ibb.co/YjQFw5t
Wow, can't match an RTX 3080 10GB at any resolution. Source:
https://cdn.mos.cms.futurecdn.net/e9RMMVJfF3yyqQg9iRfFrk-970-80.png



https://cdn.mos.cms.futurecdn.net/Pc96jBc47moMGgofVf8J6K-1920-80.png


https://cdn.mos.cms.futurecdn.net/e9RMMVJfF3yyqQg9iRfFrk-1920-80.png

Guess reviews will just have to focus on raster performance, with this RTX 3090 Ti destroyer.


https://cdn.mos.cms.futurecdn.net/7pUD9NAadnSaGy4vhx9QqE-1920-80.png


Oh well, never mind. Why is the RX 6900 XT below the RTX 3080 12GB?
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
... so, more cherry-picking of results you like? Cool. Different games perform differently, so a relatively narrow selection (like the 8 games above) has a lot of room for variance. And Nvidia has a notable advantage in RT. I can't say I see anything here contradicting what anyone has been saying. AFAIK nobody here has been saying that the 6950 XT is faster than a 3090 Ti at 2160p, nor that Nvidia doesn't have a significant advantage in RT. This is well established and widely accepted, and is equally represented in TPU reviews. I still don't see what your overall problem is.
 
Joined
Jan 11, 2005
Messages
1,491 (0.20/day)
Location
66 feet from the ground
System Name 2nd AMD puppy
Processor FX-8350 vishera
Motherboard Gigabyte GA-970A-UD3
Cooling Cooler Master Hyper TX2
Memory 16 Gb DDR3:8GB Kingston HyperX Beast + 8Gb G.Skill Sniper(by courtesy of tabascosauz &TPU)
Video Card(s) Sapphire RX 580 Nitro+;1450/2000 Mhz
Storage SSD :840 pro 128 Gb;Iridium pro 240Gb ; HDD 2xWD-1Tb
Display(s) Benq XL2730Z 144 Hz freesync
Case NZXT 820 PHANTOM
Audio Device(s) Audigy SE with Logitech Z-5500
Power Supply Riotoro Enigma G2 850W
Mouse Razer copperhead / Gamdias zeus (by courtesy of sneekypeet & TPU)
Keyboard MS Sidewinder x4
Software win10 64bit ltsc
Benchmark Scores irrelevant for me
His system "2nd AMD puppy"
FX-8350 vishera
Sapphire RX 580 Nitro+;1450/2000 Mhz

Come now, there is no bait, just your own personal prejudice. DX12 has been updated with RT, ML, and faster storage. Raster benchmarks are also DX12. There is no valid reason to split the raster and ray tracing benchmarks, given that 11 of the 25 games in the sample have ray tracing at their maximum settings.

Ray-traced games and raster games should both be treated as DX12 games, and the result as DX12 performance. Everyone gets why you are up in arms against this happening: AMD ignored RT performance and has only slowly adopted upscaling out of necessity. These features are core to DX12, which supports ray tracing, machine learning, and faster storage. This is the future of graphics; raster games are legacy and are being replaced by ray tracing, as shown by the fact that DX12 has moved to ray tracing, machine learning, and faster storage.

One manufacturer has good performance for the past, and the other has better performance for now and the future. Ray tracing is here; it is now mainstream. In the past we never treated new technology like this as some special case. It made sense when the 5700 XT had no support for RT, but now there is no justification.

The only reason RT is a special subsection now is to protect AMD from the perceived weakness it has in RT performance. That is just bias, which should be treated for what it is: with contempt.

Playing down some DX12 features just to make one manufacturer look stronger needs to stop. There should be only DX12 games and their features. If one GPU tanks harder when ray tracing is enabled, then really they should have spent more time designing a better product. That does not mean benchmarks need to be split into sections (raster, then ray tracing) to show their GPU in the best light. The benchmark should state the obvious: they have too much raster performance (which benefits older games) and far too little ray tracing performance (so current and future games suffer). Thus, in modern AAA games, which sell based on their graphics, performance will be subpar.

Given that the 6900 XT is basically a 3070 Ti in ray tracing performance in heavy ray tracing games, we all get what this will show for the AMD 6000 series. This is the central reason the 6000 series is not selling while Nvidia is selling out. Reviews need to change; they don't match the market. Ray tracing performance is more important, and DLSS is a killer feature. That is what let Nvidia take most of the gamer market.
Thanks! You answered my question positively, I see!

I feel humbled that you mentioned my specs; thank you! I felt tears in my eyes!!
I can only imagine you sitting on the golden throne from which you look down on us, the raster peasants, opening your posts with system specs to show your matchless superiority!
What a good feeling that must be!! It must feel almost like orgasmic ray-traced pleasure, I think! I'm really glad my king found the best way to increase his e-peen, and I hope he can achieve infinite growth!
Good kings used to throw bones to the peasants, and I hope you'll do the same; we have a Buy/Sell/Trade/Giveaway Forum, so maybe you can sell or give away 1 mm of e-peen; I'm sure you'll have buyers!! At least you won't have to rent a storage unit to keep it all in; it will be cheaper, and you can make money for the next upgrade! Don't forget: when you post there, a picture of the product is mandatory!
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,654 (6.68/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Just ignore the pest
 