
NVIDIA GeForce RTX 40 Series "AD104" Could Match RTX 3090 Ti Performance

Well, they almost matched a 2060 with the 128-bit 3050 (clocked at the same 14 Gbps).

If mainstream Ada cards jump up to 18-20 Gbps GDDR6, then they can easily match a 3070 on a 192-bit bus.
My point is, NVIDIA is slowing down the 4070 intentionally so they can keep selling their old stock at high prices. If you want a good GPU without restrictions, buy a 384-bit GDDR6X card, and that will be the 4090. Even the 4080 was slowed down; soon the xx80 that used to be the flagship will be midrange.
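For what it's worth, the memory-bandwidth arithmetic behind that 192-bit claim checks out. A minimal sketch, using the stock 3050/3070 specs and a hypothetical 192-bit Ada card at 18-20 Gbps:

```python
# Peak memory bandwidth in GB/s: bus width (bits) / 8 * per-pin data rate (Gbps)
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RTX 3050 (128-bit @ 14 Gbps)":         (128, 14.0),
    "RTX 3070 (256-bit @ 14 Gbps)":         (256, 14.0),
    "192-bit Ada @ 18 Gbps (hypothetical)": (192, 18.0),
    "192-bit Ada @ 20 Gbps (hypothetical)": (192, 20.0),
}

for name, (width, rate) in cards.items():
    print(f"{name}: {bandwidth_gb_s(width, rate):.0f} GB/s")
```

That lands at 432-480 GB/s for the 192-bit card versus the 3070's 448 GB/s, so at those memory speeds the narrower bus alone wouldn't hold it back.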
 
If you're truly after efficiency in the sub-100W range, you can actually go AMD, because raytracing is kind of irrelevant on these lower-end cards so AMD's disadvantage also becomes irrelevant.
My concern is noise, so I'm looking for something that is silent when playing a movie, reasonably quiet when playing a game, and powerful enough to run the games I want (which I haven't picked yet). I'm not going super compact; I will probably use a large HTPC case or a midi-tower placed horizontally, and since I'm designing and building the TV bench myself, I might even go for something like a Fractal Design 7 Compact. I was thinking of a mid-range GPU, but the high TDP has been a concern; with power limiting I might consider even an RTX 3070 if it suddenly turns up super cheap, or RTX 40 if I'm still looking this winter.

I think this fall and winter will be a good time to pick up some good deals with new CPUs and GPUs coming.
 
On that one review, for that specific game. For everything else, cards are tested with RTRT on. Why? Because Cyberpunk 2077 is the new Crysis. It will bring any system it runs on to its knees, and in the way that W1zzard usually conducts testing, it would bring the frame rates to a crawl, which would interfere with the power usage results. So for that ONE game, RTRT is turned off.

From every one of TPUs GPU reviews this year:

"Gaming: Cyberpunk 2077 is running at 2560x1440 with Ultra settings and ray tracing disabled. We ensure the card is heated up properly, which ensures a steady-state result instead of short-term numbers that won't hold up in long-term usage."

I guess you could be remembering it from another site (you could post a link to something backing up one of your claims once in a while), but TPU tests video card power consumption with no ray tracing enabled, unless @W1zzard has had a typo in there from the beginning.
 
I read in another article that the 4070 non-Ti will come with 7,040 CUDA cores and 12 GB GDDR6 at around 300 W. It would be easy to undervolt to 200-250 W. If the price is OK, it's an easy upgrade for my 2070, with no other hardware changes needed.
 
I heard we pay too much attention to this attention whore.



Criticize me away, I don't mind; it still won't change facts and tech limitations.
I didn't say you were wrong, but telling people to shut up about it defeats constructive criticism.
 
Oh?

Hmm.. Specifically, page 8.

No mention of power readings taken with ray tracing enabled. I looked in their methodologies and found no mention of power readings taken with ray tracing enabled. It's irrelevant by now but it's pretty clear that reviewers don't seem to enable RT when taking power readings on a video card. Maybe they do it once or twice to see what the power differences are, but it's not part of the regular review process.
 
No mention of power readings taken with ray tracing enabled.
No, but it didn't mention exclusion either and in power testing unless something is specifically excluded, it is to be presumed inclusive.
It's irrelevant by now but it's pretty clear that reviewers don't seem to enable RT when taking power readings on a video card.
That's an assumption on your part and a foolish one at best.

Quit nitpicking.
 
No, but it didn't mention exclusion either and in power testing unless something is specifically excluded, it is to be presumed inclusive.

That's an assumption on your part and a foolish one at best.

Quit nitpicking.

Your first post there is directly contradicted by your second one as you're making an assumption.

You made a claim way back in the thread about RT testing for power and I was trying to see if there's anything to it and how much of a difference it makes. There seems not to be much of one as nobody specifies it. So as I already said: It's irrelevant by now.
 
Same here, I draw the line around 220-230 W for any card I'm willing to buy/put in my PC.
Starting this August our electricity bill will cost double what it used to; even my low-power 12100F + undervolted GTX 1070 system will cost me around $10/month with my casual use case (barely 1-3 hours/day of gaming, the rest is light use).
So yeah, this kind of power draw is a big nope for me; most likely I will just upgrade to a 3060 Ti/6700 XT and be done with it.
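As a rough sanity check on that kind of monthly figure, here is a minimal sketch of the arithmetic. The wattages, hours, and the ~0.30 €/kWh rate are my own assumptions for illustration, not the poster's exact numbers:

```python
# Rough monthly electricity cost for a gaming PC under assumed usage.
def monthly_cost(watts: float, hours_per_day: float, price_per_kwh: float, days: int = 30) -> float:
    kwh = watts / 1000 * hours_per_day * days
    return kwh * price_per_kwh

PRICE = 0.30  # assumed €/kWh; spot prices mentioned later in the thread run up to ~0.50

gaming = monthly_cost(watts=250, hours_per_day=2, price_per_kwh=PRICE)  # whole system under load
light  = monthly_cost(watts=80,  hours_per_day=6, price_per_kwh=PRICE)  # browsing / video
print(f"gaming: ~{gaming:.2f}, light use: ~{light:.2f}, total: ~{gaming + light:.2f} per month")
```

With those assumptions the total lands around 9 €/month, in the same ballpark as the ~$10/month above; swap in a 300-450 W card and the gaming share roughly doubles.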
The 3060 Ti is just a cut-down 3070 but pushed really high on frequency, where it pulls ~250 watts.

Sold mine and now I run an undervolted 3080, which consumes 200-270 watts. While the 3060 Ti was always pegging itself at 250 watts, undervolting it really lowers the performance.

You are better off getting a 3070 and undervolting it even more.
 
Where is that "mainland Europe" where the electricity price tripled, please? The average EU price was somewhere in the 25-30 cent area; in which country would you pay 75-90 cents per kilowatt-hour???

Plenty of places have spot prices at ~0.50 €/kWh, and it'll get more expensive this winter.

And as for the Lex arguments: I think he thinks you guys think enabling RT does not increase power draw, a point no one has made.
 
Okay, this is getting tiresome.

W1zzard states in his testing protocol that Ray Tracing is not used in the gaming tests to show power consumption.

I don't see any other reference to RT in the power graphs.

Therefore, the power draw of Ampere, as shown by TPU charts, does not include the RT component of hardware.

@W1zzard -is this correct?
 
Okay, this is getting tiresome.

W1zzard states in his testing protocol that Ray Tracing is not used in the gaming tests to show power consumption.

I don't see any other reference to RT in the power graphs.

Therefore, the power draw of Ampere, as shown by TPU charts, does not include the RT component of hardware.

@W1zzard -is this correct?
Correct. Power draw is tested with RT off. Also the "regular" game tests are with RT off, so they can be compared to other cards. The RT on tests are on the "Ray Tracing Performance" page.

It doesn't matter anyway. All Ampere cards run in their power limits during gaming, so RT on doesn't increase power draw. You can test this easily, just look at GPU-Z power numbers while playing with the settings
 
Correct. Power draw is tested with RT off. Also the "regular" game tests are with RT off, so they can be compared to other cards. The RT on tests are on the "Ray Tracing Performance" page.

It doesn't matter anyway. All Ampere cards run in their power limits during gaming, so RT on doesn't increase power draw. You can test this easily, just look at GPU-Z power numbers while playing with the settings

That explains the performance hit of RT then: if RT requires power, other processes must be getting slowed down to meet the power envelope?
 
It doesn't matter anyway. All Ampere cards run in their power limits during gaming, so RT on doesn't increase power draw. You can test this easily, just look at GPU-Z power numbers while playing with the settings
Ok, so are you saying that a game (any game) running at 75% GPU utilization with RTRT off is not going to increase power usage when RTRT is then enabled? Additionally, with many AIB cards having variable power profiles, are you saying they do not boost the power limits when also boosting the clocks?
 
That explains the performance hit of RT then: if RT requires power, other processes must be getting slowed down to meet the power envelope?
Correct, that can happen, but it's not the reason for the performance hit from RT. Also the perf hit from RT lowers FPS, which frees up some power budget, too.

Ok, so are you saying that a game (any game) running at 75% GPU utilization with RTRT off is not going to increase power usage when RTRT is then enabled?
Not sure about the 75% case (= CPU limited). Test it.

Additionally, with many AIB cards having variable power profiles, are you saying they do not boost the power limits when also boosting the clocks?
Some do, some do not
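If anyone wants to actually run that test without watching GPU-Z by hand, here is a minimal logging sketch using NVML via the nvidia-ml-py package. This is just a suggested alternative monitoring tool, not part of TPU's methodology:

```python
# Log GPU board power and utilization once per second while toggling RT in-game.
# Requires the NVIDIA driver and the nvidia-ml-py package (pip install nvidia-ml-py).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # NVML reports milliwatts
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        print(f"power: {power_w:6.1f} W   GPU util: {util.gpu:3d}%")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Run it in the background, toggle RT at a fixed spot in the game, and compare: at the power limit the readings should stay pinned, while in a CPU-limited (~75% utilization) scenario you would expect them to move.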
 
Well, regardless of how anyone views this incoming generation, I think we can all agree that the TDP of these GPUs is becoming a bit concerning. Two years from now, are we going to be staring at a 750 W RTX 7060? It sure as hell looks that way right now. And I highly doubt that there will be significant improvements to power grids around the world to handle such power draw. Also, there sure as hell won't be a decrease in our monthly bills, regardless!

As much as I'm all for competition between competitive companies, NVIDIA, AMD, and Intel need to come to a sort of 'gentleman's agreement' (no slight to Lisa Su... for those that really care that much) to draw a line which shouldn't be crossed for power usage. It's all too easy to just say 'If you can't afford the bill, it wasn't made for you!', excusing the idea that there is such a thing as a trickle-down effect with GPU performance. This is a matter of efficiency, or the lack of practical application thereof. Lovelace is currently rumored to make Turing seem like a watt-sipping, lettuce-nibbling puritan. Something has to change soon...

Every subsequent generation seems to be increasing the TDP, all in the name of greater performance numbers, and not necessarily for what we see in front of us on the monitor. This is not far off from what we saw between Pascal and Turing, denoting NVIDIA's struggle to overcome the 1080 Ti. Of course, this is just one person's thoughts on the matter, not gospel for all.
 
NVIDIA fanboys bashing Samsung's 8 nm? I was under the impression that they claim it's the best thing since sliced bread, because how could they say anything otherwise? The fact is that this node was never meant for high-power, complex chips like Ampere. It was meant for low-power smartphone SoCs like the one I have in my S10e, which is based on this same 8 nm process. Samsung did scale it up, but it was never going to beat TSMC's 7 nm in terms of efficiency.

Also, part of what makes Ampere look bad is the absurd power consumption of Micron's G6X memory. Had they used regular G6 even on the top-end models, the power consumption would have been more favorable.
It's not that bad a chip/process in terms of efficiency; look at the 3070. With good undervolting it's actually really efficient. As you said, GDDR6X is the main culprit here. On my 3080 Ti, half of the consumption is just memory... usually 120-130 W; that's nonsense. With GDDR6, a 512-bit bus, and less aggressive voltage, these GA102 cards could be 200-240 W and the most efficient cards on the market... It was just a dumb decision to go with GDDR6X.
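A quick back-of-the-envelope split using the figures in that post plus the 3080 Ti's 350 W board power spec; the ~60 W guess for an equivalent plain-GDDR6 setup is my own assumption:

```python
# Back-of-the-envelope split of 3080 Ti board power, using the poster's memory figures.
BOARD_POWER_W = 350          # RTX 3080 Ti reference board power
G6X_MEMORY_W = (120, 130)    # memory-rail draw reported in the post above
G6_MEMORY_W_GUESS = 60       # assumed draw for a plain-GDDR6 setup of similar capacity

for mem_w in G6X_MEMORY_W:
    print(f"G6X memory {mem_w} W  ->  core + rest of board: {BOARD_POWER_W - mem_w} W")

# Keep the same core power but swap in the assumed GDDR6 figure:
avg_core_w = BOARD_POWER_W - sum(G6X_MEMORY_W) / len(G6X_MEMORY_W)
print(f"Hypothetical GDDR6 board at stock clocks/voltage: ~{avg_core_w + G6_MEMORY_W_GUESS:.0f} W")
```

Swapping the memory alone only gets you to roughly 285 W; the rest of the 200-240 W estimate would have to come from the 'less aggressive voltage' part, i.e. undervolting the GA102 core.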
 

It's not that bad a chip/process in terms of efficiency; look at the 3070. With good undervolting it's actually really efficient. As you said, GDDR6X is the main culprit here. On my 3080 Ti, half of the consumption is just memory... usually 120-130 W; that's nonsense. With GDDR6, a 512-bit bus, and less aggressive voltage, these GA102 cards could be 200-240 W and the most efficient cards on the market... It was just a dumb decision to go with GDDR6X.
GDDR6+ by Samsung will embarrass GDDR6X lol
 
Nice, but knowing NVIDIA, it is possible it will also approach the price of a 3090 Ti.
 
I like the idea, I might try it out when I build an HTPC.
(Now with supplies getting better, there could be opportunities to get a good card at a discount.)

But I do wonder though, how does this affect the frame rate consistency?


Undervolting is a little more "scary" though. Is it really worth the risk of crashing during gameplay or movies?
I would think that with 25% of the TDP shaved off, the cooler should be easily capable of cooling the card fairly silently, even if it were a higher-TDP card than this.
"But I do wonder though, how does this affect the frame rate consistency?"
It's better with UV, because you usually don't hit the power and temperature limits that can cause frequency drops and hence hurt frame rate consistency.

Undervolting is the best thing you can do now, with cards on the edge of their stability because of auto OC. There is no risk of crashing during movies because you are only changing the 3D (load) frequencies. Usually you are able to find a stable UV within the first hour and that's it. Now you have a stable card with the same performance that is colder, quieter, has lower consumption, and lasts longer.
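True undervolting on GeForce cards is usually done through a GUI curve editor such as MSI Afterburner, so there isn't much to script there. As a rough, scriptable stand-in, here is a minimal sketch that instead caps the board power limit through NVML (the "25% of the TDP shaved off" idea above); it needs admin/root rights and the nvidia-ml-py package, and it is only my suggestion, not anything from this thread:

```python
# Cap the GPU's board power limit via NVML (power limiting, not a true undervolt).
# Requires administrator/root privileges and the nvidia-ml-py package.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

target_mw = int(default_mw * 0.75)               # shave ~25% off the stock limit
target_mw = max(min_mw, min(target_mw, max_mw))  # stay inside the card's allowed range

pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
print(f"Power limit set to {target_mw / 1000:.0f} W (default {default_mw / 1000:.0f} W)")
pynvml.nvmlShutdown()
```

Power limiting trades a few percent of clock speed for the watt savings; a proper undervolt keeps clocks up at a lower voltage, which is why the posts above prefer it, but it has to be dialed in and stability-tested per card.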
 