
GPU Test System Update March 2021

Have you ever done any internal testing on whether ray tracing and/or DLSS affects the power consumption of the video card? Since there is a lot of dedicated silicon (more so in Nvidia cards than AMD cards), I'm curious whether these features draw any more power when utilized, or whether it balances out with bottlenecks elsewhere in the GPU.
It does, but the card is running into its power limiter nearly all the time, so it's hard to isolate the measurement. Also, the change in FPS drastically alters the power consumption from the rest of the GPU.

slimmer bars
my charting engine (I wrote it, because nothing good exists out there) doesn't support slimmer bars, and I'm not sure it's worth it; how would these be sorted?
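Purely for illustration (this is matplotlib, not TPU's in-house engine, and all FPS numbers are hypothetical): slimmer bars come down to a smaller bar height, and the open question is the sort order, here done by value:

```python
import matplotlib.pyplot as plt

# Hypothetical FPS numbers, purely for illustration
data = {"GPU A": 95, "GPU B": 120, "GPU C": 88, "GPU D": 140}

# Sort descending by result, the usual convention in GPU bar charts
items = sorted(data.items(), key=lambda kv: kv[1], reverse=True)
names = [name for name, _ in items]
values = [value for _, value in items]

fig, ax = plt.subplots()
ax.barh(names, values, height=0.4)  # smaller than the 0.8 default = slimmer bars
ax.invert_yaxis()                   # fastest card at the top
ax.set_xlabel("FPS (hypothetical)")
plt.show()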

check that they aren't causing a performance degradation due to ECC
any suggestions how?
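One rough idea, sketched below: on cards that actually expose the ECC toggle (workstation/compute parts; consumer GeForce boards generally don't), flip ECC with nvidia-smi and time the same memory-bound workload in both states. The benchmark command is a placeholder for whatever memory-heavy test you trust:

```python
import subprocess
import time

def ecc_mode() -> str:
    # ecc.mode.current is a real nvidia-smi query field ("Enabled"/"Disabled"/"N/A")
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=ecc.mode.current", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

def timed_run(cmd):
    # Wall-clock the benchmark; in practice, average several runs
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

# Placeholder for any memory-bound workload, e.g. bandwidthTest from the CUDA samples
bench_cmd = ["./bandwidthTest"]

print(f"ECC mode: {ecc_mode()}")
print(f"Benchmark took {timed_run(bench_cmd):.2f} s")
# Toggle with `nvidia-smi -e 0|1` (needs admin rights and a reboot), re-run, compare.
```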

Thought about this, decided against it, because the DR gains make no sense considering the investment. You're almost never CPU limited enough for the memory speed to make enough of a difference; look at the GPU-limited results at Techspot. No game benefits from 32 GB, except for that horrible flight simulator, and noobs will end up overspending because my reviews use 32 GB

I wish you guys would add compare tools
Our charts are simple image files that everybody can link to, and everybody does. This greatly outweighs the benefits dynamic charts offer; it's purely a business decision.

Sorry for the world of hurt, but I'd love to see Energy Efficiency at 1080p as well, because the vast majority of gamers still game at this resolution, so it does matter.
There should be no significant difference because both are properly GPU limited now
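For context, the efficiency number boils down to performance divided by power draw, so once every card is properly GPU limited the ranking barely moves between resolutions. A minimal sketch with made-up numbers (illustrative values only, not real measurements):

```python
# efficiency = average FPS / average board power draw (W)
cards = {
    "Card A": {"fps_1080p": 144, "watts": 220},  # hypothetical
    "Card B": {"fps_1080p": 120, "watts": 170},  # hypothetical
}

for name, c in cards.items():
    print(f"{name}: {c['fps_1080p'] / c['watts']:.2f} FPS/W at 1080p")
```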

@mechtech @TheoneandonlyMrK @BSim500: Alright, I'll add RX 580 and GTX 1060 6 GB
 
Thought about this, decided against it, because the DR gains make no sense considering the investment. You're almost never CPU limited enough for the memory speed to make enough of a difference; look at the GPU-limited results at Techspot. No game benefits from 32 GB, except for that horrible flight simulator, and noobs will end up overspending because my reviews use 32 GB

G.Skill TridentZ 2x16 GB 3600 MHz CL16 at $220
G.Skill TridentZ 2x8 GB 4000 MHz CL19 at $150
That's only $70 more (roughly 45% more dough) for 100% more RAM capacity, and either way, spending another $70 is nothing on a $2,000+ rig. Also, people looking to add another 2x8 GB kit to their system later on will have a hard time finding the same ICs for all their modules, which creates the hassle of selling the old kit and buying the 2x16 GB instead.
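The price-per-GB math on those two kits, as a quick sanity check (prices as quoted above; the exact upcharge works out to about 47%):

```python
# Back-of-the-envelope math for the two kits quoted above
kits = {
    "2x16 GB 3600 CL16": {"price": 220, "capacity_gb": 32},
    "2x8 GB 4000 CL19":  {"price": 150, "capacity_gb": 16},
}

for name, k in kits.items():
    print(f"{name}: ${k['price']} -> ${k['price'] / k['capacity_gb']:.2f}/GB")

delta = kits["2x16 GB 3600 CL16"]["price"] - kits["2x8 GB 4000 CL19"]["price"]
pct = delta / kits["2x8 GB 4000 CL19"]["price"] * 100
print(f"Upcharge: ${delta} (+{pct:.0f}%) for double the capacity")
```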

I'm fairly certain that every GPU faster than the RTX 2080 is running into some sort of CPU/RAM bottleneck at 1080p Ultra in the TPU testing suite.
Just look at how the 6800 XT and 6900 XT fare against the 2080 Ti across all three resolutions, 1080p - 1440p - 4K:
6800 XT: 121% - 127% - 128%
6900 XT: 126% - 133% - 137%
RDNA2 cards are not supposed to get stronger at higher resolutions at all (compared to the 2080 Ti). This makes high-end GPUs look weaker than they actually are at 1080p; once Lovelace and RDNA3 come out, they will get even more CPU/RAM bottlenecked at 1080p or even 1440p.
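To make the argument explicit: if these cards were purely GPU limited at every resolution, their lead over the 2080 Ti should stay roughly flat from 1080p to 4K. A quick restatement of the numbers quoted above (an illustrative script, not TPU tooling):

```python
# Relative performance vs RTX 2080 Ti (= 100%), from the figures cited above
rel_perf = {
    "6800 XT": {"1080p": 121, "1440p": 127, "4K": 128},
    "6900 XT": {"1080p": 126, "1440p": 133, "4K": 137},
}

for card, res in rel_perf.items():
    missing = res["4K"] - res["1080p"]
    print(f"{card}: +{res['1080p'] - 100}% at 1080p vs +{res['4K'] - 100}% at 4K, "
          f"{missing} points of the lead missing at 1080p")
```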
 
Thanks very much for your hard work, and thanks to all TPU staff!

The choice to include every 20-series card in both regular and Super variants seems a bit odd to me. Just for diversity I would love to see the 980/Ti and 1080/Ti, because a lot of peeps will not buy a new card at these prices and will stay on their old tech.
But I know a 980 Ti should be at 1070 level and a 1080 Ti at 2070 level, if I'm correct.
 
Y'all not gonna wait for the Metro graphics update? Or is it already out?
Is there even a date? I'll rebench in a couple of months anyway, maybe it's out by then

The choice to include every 20-series card in both regular and Super variants seems a bit odd to me
I've always worked by generations. Given the feedback here I might change that going forward

is running into some sort of CPU/RAM bottleneck at 1080p Ultra
Still not convinced; maybe when we go DDR5 we go 32 GB. There are only a few tests that are somewhat CPU limited at 1080p, mostly due to DX11 and AMD architecture; DR won't make that much of a difference. BF5, Borderlands 3, DoS 2, Far Cry 5

But good to hear that the biggest issue is DR; that means the rest of my testing is good :)
 
Great update, and... insertion force... and great update.

Might be interesting to see how a couple of old high-end cards perform for comparison. But looking forward to seeing how it goes.


Are there going to be any ray tracing image quality comparisons coming up?
 
Bigger RAM capacity doesn't matter much, even less so when @W1zzard tests with Resizable BAR enabled, which lessens system RAM usage and uses more of the VRAM.
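A rough way to verify ReBAR is actually active (assuming an NVIDIA card and a recent driver): with Resizable BAR on, the BAR1 size reported by nvidia-smi should roughly match the full VRAM size instead of the classic 256 MB window.

```python
import subprocess

# Look for the "BAR1 Memory Usage -> Total" line in `nvidia-smi -q` output
out = subprocess.run(["nvidia-smi", "-q"], capture_output=True, text=True, check=True)

in_bar1 = False
for line in out.stdout.splitlines():
    if "BAR1 Memory Usage" in line:
        in_bar1 = True
    elif in_bar1 and "Total" in line:
        print("BAR1 total:", line.split(":", 1)[1].strip())
        break
```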
 
One more item of constructive feedback for your charts: I find that when there are many colours, I sometimes have trouble differentiating a few of them due to colour deficiency; I'm sure there are others in the same boat. Using higher-contrast colours could help.
 
First of all a big thank you for all the effort you put into benchmarking all these video cards again!

I do have a question to which I did not find the answer: what refresh rates are used for measuring the idle and multi-monitor power consumption?
 
This is very interesting

as switching the rig to Ryzen actually pushed the RX 6000 series ahead of Nvidia GPUs in non-RT games


AMD: [chart: relative performance, 1920x1080]

Intel: [chart: relative performance, 1920x1080]
 
what refresh rates are used for measuring the idle and multi-monitor power consumption?
Great question, 60 Hz, let me add that to the texts in all future reviews

This is very interesting

as switching the rig to Ryzen actually pushed the RX 6000 series ahead of Nvidia GPUs in non-RT games
ReBAR and game selection, too
 
Great question, 60 Hz, let me add that to the texts in all future reviews


ReBAR and game selection, too
Was the game selection really that big of a difference?
 
Well, maybe the AMD 5800X CPU is OK, but shouldn't you wait a few days to get the best CPU for gaming? Otherwise most of the scores are useless.
Also, regarding the motherboard that has to be used with the AMD CPU, I've heard it's low-end...

On the game choice, I wonder why Battlefield wasn't dropped; it's built so clearly the AMD way that it should be history, if the review wants to stay objective.
Please stop supporting the AMD camp... it's pointless to keep comparing AMD's 7 nm CPUs to Intel's 14 nm CPUs... I think we all know why... Anyway, no point in handing out too much glory before Intel gets its own 7 nm out, then let's see... soon.
At least Intel Alder Lake will show clearly which CPU is lousy or excellent... lol
But I must say, Intel's 14 nm is doing a good job... and sure, it eats more power than a 7 nm CPU... stop comparing... lol

As for power draw, it's nice that for the GPU, for example, we see the power draw of the GPU only, not the whole system. That's good.

I respect TechPowerUp reviews a lot; it's one of the best sites to check CPU and GPU battles and tests. I hope it stays that way.

All the best.

P.S. You should still wait for the Intel 11900K... well, you'll change it for sure when Intel's 10 nm Alder Lake is out.
 
It is very popular, why would you think otherwise? It's included in the benchmarks and will be there until 2022 at least. Number of Steam reviews, amount of noise on social media, personal dice roll
Oh sorry, I misunderstood; I thought it wasn't there.
I don't use any of those in bold, and with that I now know there will never be two 6700 XTs vs one 6900 XT, or two RTX 3060 Tis vs one RTX 3090 in mGPU DX12 with ray tracing on & off, because no one uses it.
 
I don't use any of those in bold, and with that I now know there will never be two 6700 XTs vs one 6900 XT, or two RTX 3060 Tis vs one RTX 3090 in mGPU DX12 with ray tracing on & off, because no one uses it.
Multi-GPU has been the subject of several previous special reviews, but nowadays it's simply not worth it; it's just a huge waste of money due to the lack of game support

You should still wait for the Intel 11900K... well, you'll change it for sure when Intel's 10 nm Alder Lake is out.
I'm totally open to changing back to Intel if they are good, we'll see. I just didn't want to wait any longer with the rebench. I have to rebench all my CPUs next, in just two weeks, and I still have to figure out some new tests.
 
Done like a true German :D ;)

I'd love for covid to be over so I could come "up there" (from Munich everything is "above" :P :D ) and take you out for a beer/coffee, whatever you prefer, for everything you do here. Your tests are always extremely professional, in-depth, and simply massive (the amount of pictures, detailed analysis, explanations; even the conclusion alone is sometimes as long as some "full tests/reviews" on other pages are), and very enjoyable to read.

Thanks for everything you do :)
 
@W1zzard Any particular reason why you guys went with the 5800X instead of the 5900X or 5950X? I know the differences are negligible in most game titles, but they're still there.

It would seem you have a lot of work to do with all these benchmarks and graphics cards. Thanks for doing this.
 
Any particular reason why you guys went with the 5800X instead of the 5900X or 5950X? I know the differences are negligible in most game titles, but they're still there.
This is literally explained in the text?

It would seem you have a lot of work to do with all these benchmarks and graphics cards
It's all finished already
 
Shows how useless RT still is for most people; crazy that using RT on a 3060 basically brings it down to the level of an old 1660.

Games looked great without it; I would be happy if it was ditched from future GPUs altogether.
 