
Which NVIDIA Ampere card would you buy?

  • RTX 3090: 5,036 votes (9.3%)
  • RTX 3080: 11,737 votes (21.7%)
  • RTX 3070: 6,502 votes (12.0%)
  • Will skip this generation: 3,868 votes (7.2%)
  • Will wait for RDNA2: 22,325 votes (41.3%)
  • Using AMD: 2,758 votes (5.1%)
  • Buying a next-gen console: 1,810 votes (3.3%)
  • Total voters: 54,036
  • Poll closed.
Higher power consumption is justified whenever you want it to be, because there is no objective correspondence between power draw and performance. People switch their opinions as often as the wind changes direction depending on whether it's about NVIDIA or about AMD.

If you argue that, say, a 300W AMD card heats up your room to an unbearable degree, then any 300W NVIDIA card would do the same no matter how fast it is. If you suddenly think it's not that bad because it reached a certain performance threshold, then you never actually cared about power. There is no problem with that, but I can't help but point out the hypocrisy.

LOL, you can choose how much power your GPU uses with Afterburner, but you can't really choose its performance level, can you?
No one said you have to run the 3090 at 100% power limit; my 2080 Ti has a max TGP of 310W, but I choose to undervolt to 250W because the extra 60W doesn't net any meaningful FPS gain.
Now, the 3090 at 250W TGP would still be 50% faster than the 2080 Ti, which I sorely need to maintain 100+ FPS in modern games.
Yes, I care about performance per watt because I live in a tropical climate where I use AC 24/7.
I don't really care much about performance per dollar as long as it fits within my budget.
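For anyone who'd rather script this than click around in Afterburner, here's a minimal sketch using NVIDIA's own nvidia-smi tool (assumes an NVIDIA card with the driver's nvidia-smi on the PATH and admin rights; the 250W figure is just the example from this post, and note this sets a power cap rather than a true undervolt):

import subprocess

def report_power(gpu: int = 0) -> str:
    # Query the current draw and the enforced power limit.
    return subprocess.check_output(
        ["nvidia-smi", "-i", str(gpu),
         "--query-gpu=power.draw,power.limit", "--format=csv,noheader"],
        text=True).strip()

def set_power_limit(watts: int, gpu: int = 0) -> None:
    # Cap the board power limit; needs admin/root and must stay within the card's allowed range.
    subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", str(watts)], check=True)

if __name__ == "__main__":
    print("before:", report_power())
    set_power_limit(250)  # e.g. the 250W target mentioned above
    print("after: ", report_power())

A proper undervolt (locking a lower voltage at a given clock) still needs Afterburner or a similar curve editor; the cap above just lets the card pick whatever clocks fit under the chosen wattage.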
 
LOL, you can choose how much power your GPU uses with Afterburner, but you can't really choose its performance level, can you?
No one said you have to run the 3080 at 100% power limit; my 2080 Ti has a max TGP of 310W, but I choose to undervolt to 250W because the extra 60W doesn't net any meaningful FPS gain.
Now, the 3090 at 250W TGP would still be 50% faster than the 2080 Ti, which I sorely need to maintain 100+ FPS in modern games.
Yes, I care about performance per watt because I live in a tropical climate where I use AC 24/7.
So what are your 400-watt comments then, hypocrisy? Do you think the Vega 64 really pulls that much power? It wasn't sold with a hyperbolic figure and doesn't draw that in actual use, but smearing bullshit is real easy, isn't it.
I'll throw a personally bitchy comment back at you; you asked for it.
I hope you like the resale value of that 2080 Ti :p:D
 
And the thread has been mostly sane by TPU standards...

They would rather use all those 7nm wafers making Zen 3 and the rest for consoles, where they make good profit.
The question is: are all 7nm wafers equal? In other words, do CPUs & GPUs use the same node/process? TAM/GM for server/mobile has caused a strategy pivot for AMD. I think AMD will walk back on desktop GPUs in a way that may surprise. Mobile/APU is another story.

The narrative I am hearing is totally different from what we heard when AMD pushed the bounds of power use on Polaris.
I think the RX 480 issue was more a case of drawing too much juice from the PCIe slot.

Somewhat related, I have a theory about the 12-pin Micro-Fit connector. It's not simply a case of saving board real estate in itself, but rather a combination of strategies to reduce PCB size, simplify topology, and increase component density while improving power distribution & stability. Multiple 8-pin connectors have been an issue for NVIDIA because they prefer a per-connector power limit setup. They've dealt with it previously with shunt resistors for each supply (e.g. 2x 8-pin & 1x PCIe slot), monitoring/feedback circuits, and buck-converter-esque power-balancing circuits that reduce any imbalance of power draw. Power throttling can happen if too much power is drawn from one 8-pin. The new 12-pin has fewer, heavier-gauge through-pins, meaning a single monitoring stage & a reduction in circuitry. This should have benefits for 3080/3090 stability, especially with increased power slider limits. Quite neat.
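To illustrate the per-connector limit idea in the simplest possible terms, here's a toy sketch; the connector names and wattages below are generic illustrative figures, not NVIDIA's actual budgets or firmware logic:

# Toy model of per-connector power limiting: the card throttles if ANY single
# input exceeds its own budget, even when the total draw is fine.
# Illustrative numbers only; not NVIDIA's actual monitoring/balancing scheme.
CONNECTOR_LIMITS_W = {"8pin_a": 150, "8pin_b": 150, "pcie_slot": 75}

def must_throttle(draw_w: dict) -> bool:
    # True if any one connector exceeds its individual limit.
    return any(draw_w.get(name, 0.0) > limit
               for name, limit in CONNECTOR_LIMITS_W.items())

print(must_throttle({"8pin_a": 130, "8pin_b": 130, "pcie_slot": 40}))  # balanced 300W: False
print(must_throttle({"8pin_a": 170, "8pin_b": 90, "pcie_slot": 40}))   # skewed 300W: True

A single 12-pin input collapses those separate budgets into one, which is why the monitoring and balancing circuitry can be simplified.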

I'll be interested in how/what TPU tests. Embargo on slide deck & gaming arch white paper ends today/tomorrow, I think.
 
So what are your 400-watt comments then, hypocrisy? Do you think the Vega 64 really pulls that much power? It wasn't sold with a hyperbolic figure and doesn't draw that in actual use, but smearing bullshit is real easy, isn't it.
I'll throw a personally bitchy comment back at you; you asked for it.
I hope you like the resale value of that 2080 Ti :p:D

Huh, in case you don't understand (which you don't), I was talking about performance per watt. The Vega 64 uses 400W but the performance sucks; it's even slower than the GTX 1080.

If it's any comfort to you, I just keep my current rig in my vacation home; my next rig will be a Zen 3 4960X + 3090 :)
 
LOL, you can choose how much power your GPU uses with Afterburner, but you can't really choose its performance level, can you?
No one said you have to run the 3090 at 100% power limit; my 2080 Ti has a max TGP of 310W, but I choose to undervolt to 250W because the extra 60W doesn't net any meaningful FPS gain.
Now, the 3090 at 250W TGP would still be 50% faster than the 2080 Ti, which I sorely need to maintain 100+ FPS in modern games.
Yes, I care about performance per watt because I live in a tropical climate where I use AC 24/7.
I don't really care much about performance per dollar as long as it fits within my budget.

It doesn't matter what you limit the power consumption to; N watts is still N watts irrespective of the performance per watt. Also, by definition, if you change the power you also change the performance level, and vice versa. Obviously...

You hear people say it all the time: "That card uses too much power, nah, that's too much, I don't want a furnace." That's clearly BS, because when a faster GPU comes along they suddenly, and rather happily, choose said furnace, as shown by these new cards which use a lot of power.
 
Huh, in case you don't understand (which you don't), I was talking about performance per watt. The Vega 64 uses 400W but the performance sucks; it's even slower than the GTX 1080.

If it's any comfort to you, I just keep my current rig in my vacation home; my next rig will be a Zen 3 4960X + 3090 :)
You're doubling down on bullshit in your bedroom, more like.
400 watts stock (https://www.techpowerup.com/gpu-specs/radeon-rx-vega-64.c2871)? Proof, or you're full of shit, and I own said card; good luck.
Logged as an e-peen PCMR guy btw, so this conversation is gonna be short.
 
It doesn't matter what you limit the power consumption to; N watts is still N watts irrespective of the performance per watt. Also, by definition, if you change the power you also change the performance level, and vice versa. Obviously...

You hear people say it all the time: "That card uses too much power, nah, that's too much, I don't want a furnace." That's clearly BS, because when a faster GPU comes along they suddenly, and rather happily, choose said furnace.

When you lower the power consumption by N%, it doesn't mean the performance drops by N%; check the performance/power curve.

[Attached image: Max_Q_4.png (performance vs. power curve)]


Without any overclocking, I can just limit my 2080 Ti to an 80% power target and the performance loss is only 5%.
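Here's a rough back-of-envelope sketch of why that works, assuming performance scales roughly with clock and power roughly with the cube of clock near the top of the voltage/frequency curve; a crude approximation for illustration, not measured data for any particular card:

# Crude model: perf ~ clock, power ~ clock^3 near the top of the V/F curve.
# Illustration only; real curves differ per card and per workload.
def est_perf_fraction(power_fraction: float) -> float:
    return power_fraction ** (1.0 / 3.0)

for pct in (100, 90, 80, 70):
    frac = pct / 100.0
    print(f"{pct}% power -> ~{est_perf_fraction(frac) * 100:.0f}% performance")

# 80% power comes out to ~93% performance in this toy model, in the same ballpark
# as the ~5% loss reported for the 2080 Ti above.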
 
When you lower the power consumption by N%, it doesn't mean the performance drops by N%; check the performance/power curve.

Of course the relation between the two is non-linear, but the point is that it changes nonetheless.

Without any overclocking, I can just limit my 2080 Ti to an 80% power target and the performance loss is only 5%.

Who cares about that other than you? I may want my card at its stock power target; someone else wants theirs overclocked. It's a choice based on purely subjective reasoning.
 
Of course the relation between the two is non-linear, but the point is that it changes nonetheless.



Who cares about that other than you? I may want my card at its stock power target; someone else wants theirs overclocked. It's a choice based on purely subjective reasoning.

I understand that AMD fans don't bother much with performance per watt; it's a metric that others care about besides performance per dollar :).
And I'm not alone, since NVIDIA pretty much dominates the market atm.
 
I understand that AMD fans

So that's what this is really about. I am an AMD fanboy for not adhering to your bizarre idea that there is some magical point at which power becomes justified. Whatever that even means.

Right.

I understand that AMD fans don't bother with performance per watt

You know how I know NVIDIA fans don't care about that either? Just look up a chart with perf/watt ratings; you won't find a 2080 Ti, or a 3090, or any other high-end card on top. You'll find stuff like the 1660 Ti or 1050, but you didn't buy either of those cards, did you? Huh.
 
I understand that AMD fans don't bother much with performance per watt; it's a metric that others care about besides performance per dollar :).
And I'm not alone, since NVIDIA pretty much dominates the market atm.
So all those buyers bought for performance per watt?
There are many variables to weigh when buying a GPU; few indeed pick that metric.

Performance, price, and features all weigh in as far more important for 95% of buyers.
 
Perf/W is a joke of a metric. So many people live in the weeds, man... so many. :(

All 99% of users want to know is how well it performs, how much power it uses (total, not some FPS/W bologna), and how noisy it is (price and looks too, of course).

Blame 'fanatics' for acting like wallpaper, changing their opinion with the times and whatever is going on with their precious brand instead of holding to their beliefs. The same jokers touting AMD/NV efficiency are the same clowns that cheer for AMD/NV when their power is higher and lay the blame on the other side. It's ridiculous how some of the users here (and on most forums) think.
 
Perf/W is a joke of a metric. So many people live in the weeds, man... so many. :(

All 99% of users want to know is how well it performs, how much power it uses (total, not some FPS/W bologna), and how noisy it is (price and looks too, of course).

Blame 'fanatics' for acting like wallpaper, changing their opinion with the times and whatever is going on with their precious brand instead of holding to their beliefs. The same jokers touting AMD/NV efficiency are the same clowns that cheer for AMD/NV when their power is higher and lay the blame on the other side. It's ridiculous how some of the users here (and on most forums) think.

Well, it's more about accepting that people just have different needs.
Some people want the best performance for their dollar.
Some just want the best performance per watt (which would always belong to the most expensive GPU).
The majority will look for the right balance between the two metrics.

Let's just say you have a laptop with limited cooling potential; then perf/watt becomes the limiting factor, and the more efficient GPU will give more FPS. You can look at 5600M vs 2060 laptops and see why efficiency matters.

So that's what this is really about. I am an AMD fanboy for not adhering to your bizarre idea that there is some magical point at which power becomes justified. Whatever that even means.
Right.
You know how I know NVIDIA fans don't care about that either? Just look up a chart with perf/watt ratings; you won't find a 2080 Ti, or a 3090, or any other high-end card on top. You'll find stuff like the 1660 Ti or 1050, but you didn't buy either of those cards, did you? Huh.

HUH?
[Attached image: performance-per-watt_3840-2160.png (TPU performance-per-watt chart, 3840x2160)]
 
Well, it's more about accepting that people just have different needs.
Some people want the best performance for their dollar.
Some just want the best performance per watt (which would always belong to the most expensive GPU).
The majority will look for the right balance between the two metrics.

Let's just say you have a laptop with limited cooling potential; then perf/watt becomes the limiting factor, and the more efficient GPU will give more FPS. You can look at 5600M vs 2060 laptops and see why efficiency matters.
I'm not concerned with the 1% who care about these contrived metrics. Nor does this context about discrete desktop GPUs have anything to do with laptops.

I'd take the time to address your other points, but I'll let the others who are already circling the chum in the water have at it. ;)
 
I'm not concerned with the 1% who care about these contrived metrics. Nor does this context about discrete desktop GPUs have anything to do with laptops.

I'd take the time to address your other points, but I'll let the others who are already circling the chum in the water have at it. ;)

Yeah, I would really like to see where you get your data saying only 1% care about perf/watt, knowing that we had a discussion about you and your friend who never bother to overclock your PCs :)
If you didn't know, air con also has limited cooling capacity, so my room acts the same as a laptop in a way.
 
Yeah, I would really like to see where you get your data saying only 1% care about perf/watt, knowing that we had a discussion about you and your friend who never bother to overclock your PCs :)
Que? Not sure what you're saying or which past conversation you mean, but... yeah. I'll let that marinate a minute.

Perf/W is, IMO, a metric few care about (or rather, a metric few SHOULD care about)... mmmkay? Take the big-picture view. :)
 
Que? Not sure what you're saying or which past conversation you mean, but... yeah. I'll let that marinate a minute.

Perf/W is, IMO, a metric few care about (or rather, a metric few SHOULD care about)... mmmkay? Take the big-picture view. :)

So, in your opinion, please share why people should care if the 3090 uses 350W when its performance per watt would still be 25% higher than the 2080 Ti's, unless those who complain are salty AMD fans of course :D
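To put rough numbers on that claim: performance is just perf-per-watt times watts, so the implied gap depends on what power you assume for the 2080 Ti. The ~260W figure below is an assumption (roughly stock Founders Edition board power, not stated in this thread); the 310W-TGP card mentioned earlier would make the gap larger still.

# Rough arithmetic behind the claim above: performance = (perf per watt) x (watts).
perf_per_watt_ratio = 1.25   # "25% higher perf/W" for the 3090, as stated above
power_3090_w = 350           # W, as stated above
power_2080ti_w = 260         # W, assumed stock board power (not from the post)

perf_ratio = perf_per_watt_ratio * (power_3090_w / power_2080ti_w)
print(f"Implied 3090 performance: ~{perf_ratio:.2f}x a 2080 Ti")  # ~1.68x

Under those assumptions the higher absolute draw buys roughly 1.7x the performance, which is the trade-off being pointed at.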
 
Behavior in this thread is turning into a schoolyard with name-calling, finger-pointing and childishness. Return to the topic at hand, respond civilly or not at all, or reply permissions will be removed for you. Only warning.
Guidelines for those that need a refresher are here.
 
So, in your opinion, please share why people should care if the 3090 uses 350W when its performance per watt would still be 25% higher than the 2080 Ti's, unless those who complain are salty AMD fans of course :D
Sorry, I think you read my post wrong. I don't care for this metric at all, nor for the waffling, good-for-the-goose-but-not-the-gander responses that inevitably follow.

Tying this back in for those who believe it may be off topic...

.......I wouldn't worry about performance per watt at all. Power use is power use is power use for a given card, period. Users don't need to play with numbers to find a card's worth. They aren't running hundreds of these in a data center where it really matters. People want to know how a card performs and how much power it uses. If there are a few users who want to do the math and play in the minutiae, that's OK, but to me it's a fairly useless metric in all of it. I'm not digging into the minutiae for Ampere when there will be other values that give the 30,000-foot view, and that's plenty for most users. We'll agree to disagree that Perf/W is a useful metric for buying Ampere (or RDNA2, or GPUs in general). :)

If you didn't know, air con also has limited cooling capacity, so my room acts the same as a laptop in a way.
:roll: o_O
 
I'm waiting to see what AMD has to offer. I want something that'll offer twice the performance of my 980 Ti but be priced at $500 or less. Right now with NVIDIA the 3070 looks like it could be that card for me, but I'm not in any rush to run out and dump a bunch of cash. I'll patiently wait for 3000-series reviews and to see what AMD has up their sleeve.
 
I'm waiting to see what AMD has to offer. I want something that'll offer twice the performance of my 980 Ti but be priced at $500 or less. Right now with NVIDIA the 3070 looks like it could be that card for me, but I'm not in any rush to run out and dump a bunch of cash. I'll patiently wait for 3000-series reviews and to see what AMD has up their sleeve.
I think I read that the 3070 has just a single 8-pin PCIe connector (although some heavily overclocked versions may require more). If it ends up around 2080 Ti performance, that is surprising, although welcome.
 
I'm waiting to see what AMD has to offer. I want something that'll offer twice the performance of my 980 Ti but be priced at $500 or less. Right now with NVIDIA the 3070 looks like it could be that card for me, but I'm not in any rush to run out and dump a bunch of cash. I'll patiently wait for 3000-series reviews and to see what AMD has up their sleeve.

Well, people are selling 2080 Tis for 500 USD now; might as well grab one. At least the 11GB framebuffer is more enticing than 8GB, and the 2080 Ti only uses about 40W more.
RTX and DLSS performance would likely be about the same between the two.
 
A full GA104 with 16GB of G6X at 600GB/s, only 20% slower than the 3080, six months from now. If the 3080 can hit 1.8-1.9x 2080 performance, that would be about 1.5-1.6x.
 
Well, people are selling 2080 Tis for 500 USD now; might as well grab one. At least the 11GB framebuffer is more enticing than 8GB, and the 2080 Ti only uses about 40W more.
RTX and DLSS performance would likely be about the same between the two.
A fair point; you might also have more luck buying one, given a sellout is likely.
 
Yeah, people were selling the GTX 780 for $500 a few months before it dropped to $300 brand new. Good on them.
 