
MSI GeForce RTX 2070 Super Gaming Z Trio

Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Have you not read any of the AMD AIB reviews? In every single review it is stated in the cons:

"Large increase in power consumption, power efficiency lost"

W1zzard is merely stating the fact that overclocking AMD cards causes their power efficiency to go to s**t, whereas NVIDIA cards lose only a slight bit of efficiency when OC'd.

But that has nothing to do with how far the card *can* be overclocked, which is what I'm talking about. A card like this one, with monster cooling and a stupid number of VRM phases, exists to be overclocked - so why hobble its potential?

Dude, the card gets 69 °C at full load and is quieter than your average case fan. What's the point?! That's the point! 29 dB is pretty damn awesome. Power limit or no, you can crank up the core voltage and PT to whatever it gives and still keep all, or most, of your boost bins too. And still not hear it. I'll take that over the 5% that *might* be in the tank with a higher power limit any day of the week, tbh...

This is gonna blow your mind: it's possible to have a really quiet card that also has a higher power limit.

And RTX doesn't make the 20xx series future-proof? Far from it. When RTX becomes "standard", the 20xx RTX cards will be obsolete unless you own a 2080S/2080 Ti. With RTX 30xx you will probably get 2080 Ti RTX performance in the 3060. So how future-proof are the 2060/2070 gonna be in 1-2 years, when they are already struggling to do 1080p/60 fps in the 3 games that already support RTX?

Back in the days of the original GeForce, hardware transform and lighting was just as contentious a topic as RTRT is now, and for the same reasons. Today, you can't find a modern graphics card without hardware T&L. Can the original GeForce run any of today's games? No, but it was the first consumer product to successfully prove the concept, just as Turing has successfully proved that RTRT is possible.
 
Joined
Sep 17, 2014
Messages
22,417 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
W1zzard is merely stating the fact that overclocking AMD cards causes their power efficiency to go to s**t, whereas NVIDIA cards lose only a slight bit of efficiency when OC'd.

But that has nothing to do with how far the card *can* be overclocked, which is what I'm talking about. A card like this one, with monster cooling and a stupid number of VRM phases, exists to be overclocked - so why hobble its potential?



This is gonna blow your mind: it's possible to have a really quiet card that also has a higher power limit.



Back in the days of the original GeForce, hardware transform and lighting was just as contentious a topic as RTRT is now, and for the same reasons. Today, you can't find a modern graphics card without hardware T&L. Can the original GeForce run any of today's games? No, but it was the first consumer product to successfully prove the concept, just as Turing has successfully proved that RTRT is possible.

If you actually want higher clocks than the MSI, you're also hearing 33 dB instead of 29. Under the quiet BIOS the card boosts to within 30 MHz (2-3 bins) of the MSI.

So yeah. Apples & apples, and the net result is the same.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,798 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Joined
Oct 18, 2019
Messages
413 (0.22/day)
Location
NYC, NY
And how many of these games have been released yet? I couldn't care less if you showed me a list with 10,000 games that are gonna support RTX down the line.
If you announce a new feature and promote it like the best thing since the invention of the wheel, and after 13 months you still only have 3 games, how is that a pro, when you are paying extra for a feature that is not really supported yet?

And RTX doesn't make the 20xx series future-proof? Far from it. When RTX becomes "standard", the 20xx RTX cards will be obsolete unless you own a 2080S/2080 Ti. With RTX 30xx you will probably get 2080 Ti RTX performance in the 3060. So how future-proof are the 2060/2070 gonna be in 1-2 years, when they are already struggling to do 1080p/60 fps in the 3 games that already support RTX?


Well - the 2060 and 2070 barely outperform the 1080 Ti in standard gaming use when pushed to 4K Ultra settings, or even 1440p.

The 2060 is so weak on Ray Tracing that turning it on depresses framerate severely.

The 2070 is only marginally better.

You don't get solid 4K performance until you step up to a 2080 or 2080 Ti, and even then the performance difference is narrow while the price difference is $400 or more.

I am DISAPPOINTED by the RTX rollout.

The 2060 should be more powerful than the 1080Ti. Everything after that should be sequentially better.

I dropped $1500 on a 2080Ti FTW3 and recently dropped $1100 on a 2080Ti Black.

I will SKIP the next generation of cards and wait for the "4080Ti".
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
And how many of these games have been released yet? I couldn't care less if you showed me a list with 10,000 games that are gonna support RTX down the line.
I couldn't care less if you care or not. :)
We would have no RTRT games if Nvidia hadn't provided a GPU that can run them. The hardware had to come first.
And the list, while not very long, is quite impressive.

As for number of games: who said there would be thousands? Why would that ever happen?
We only need the popular AAA titles to convert.
And RTX doesn't make the 20xx series future-proof? Far from it. When RTX becomes "standard", the 20xx RTX cards will be obsolete unless you own a 2080S/2080 Ti. With RTX 30xx you will probably get 2080 Ti RTX performance in the 3060. So how future-proof are the 2060/2070 gonna be in 1-2 years, when they are already struggling to do 1080p/60 fps in the 3 games that already support RTX?
As I said earlier: hardware had to come first to get this idea rolling.
If you don't like it, don't buy it.
Many people will, simply because they're curious, they want the best, they want to taste the future... or they expect to spend 90% of their time playing Battlefield or Cyberpunk.
 
Joined
Sep 9, 2015
Messages
287 (0.09/day)
Power usage increases quadratically with voltage. Nvidia tunes their cards to stay in the most efficient voltage/power window. AMD has been forced to eat into that efficiency window to achieve higher clocks for a few generations now.

When an AIB card comes with higher voltage out of the box, this is to allow users a higher OC. Users buying AIB cards, especially the pricey ones, expect a healthy OC. AMD allows an unlocked power limit, and therefore AIB cards can come with higher voltage.

Nvidia, on the other hand, doesn't, as even this Gaming Z review shows. MSI could have increased the voltage, but that would just eat into the power limit - they are forced to stick to the prescribed voltage/power limits so the cards don't OC too much and eat into the higher-tier model.

Hence you can't have your cake and eat it too.
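The quadratic relationship being argued about here is the standard dynamic-power relation P ≈ C·V²·f. A minimal sketch, with made-up capacitance, voltage, and clock numbers (not measurements of any real card), of how a voltage bump taken for a small clock gain erodes performance per watt:

```python
# Dynamic switching power of CMOS logic: P = C * V^2 * f.
# All numbers below are hypothetical, chosen only to illustrate the scaling.

def dynamic_power(c_eff, volts, freq_hz):
    """Dynamic power (W) for effective switched capacitance c_eff (F),
    core voltage volts (V), and clock freq_hz (Hz)."""
    return c_eff * volts ** 2 * freq_hz

C_EFF = 1.2e-7  # hypothetical effective capacitance

stock = dynamic_power(C_EFF, 1.0, 1.9e9)  # stock: 1.00 V @ 1900 MHz
oc    = dynamic_power(C_EFF, 1.1, 2.0e9)  # OC: +10% voltage for ~5% clock

# Treat clock as a stand-in for performance to get perf/W
eff_stock = 1.9e9 / stock
eff_oc    = 2.0e9 / oc

print(f"stock {stock:.0f} W, OC {oc:.0f} W")
print(f"perf/W drop: {(1 - eff_oc / eff_stock):.1%}")
```

Because the clock term cancels out of perf/W, the efficiency loss is purely the 1/V² factor: a 10% voltage bump costs about 17% efficiency regardless of the clocks gained. That is the "efficiency window" both vendors are tuning around.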
 

omarabbas

New Member
Joined
Oct 25, 2019
Messages
3 (0.00/day)
Would you rather purchase a 2080, or wait for this Gaming Z Trio 2070 Super with higher-speed RAM, considering the prices will be much closer?
 
Joined
Sep 9, 2015
Messages
287 (0.09/day)
W1zzard is merely stating the fact that overclocking AMD cards causes their power efficiency to go to s**t, whereas NVIDIA cards lose only a slight bit of efficiency when OC'd.

Power scales quadratically with voltage - Nvidia loses efficiency just as much when you apply any extra volts. There is no magical Nvidia process that is better - it's all TSMC. The fact that they only give you 1 or 2 additional tiny voltage steps is exactly for that reason. Seems you don't understand this.
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
Power scales quadratically with voltage - Nvidia loses efficiency just as much when you apply any extra volts. There is no magical Nvidia process that is better - it's all TSMC. The fact that they only give you 1 or 2 additional tiny voltage steps is exactly for that reason. Seems you don't understand this.
Man, you got really excited lately. Nvidia hurt you in any way or what?

First of all: the power scales quadratically with voltage for an isolated, ideal CMOS transistor. There will be a variance in actual applications, because transistors are not the only things graphics cards are made of.

Second: he said "overclocking". And overclocking means working on performance vs frequency.
So there are multiple variables involved, most importantly:
- GPUs may need different voltage bumps to achieve the same gains in clocks
- GPUs may get different performance gains from the extra frequency

And actually AFAIR Nvidia and AMD have never shared the same TSMC node. The last common node design was Samsung 14nm - and even that one ended up different because AMD chips were made by GF and they somehow managed to make it worse.
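The two bullets above can be made concrete. A toy comparison (all numbers invented) of two hypothetical GPUs that pay different voltage prices for the same +5% clock bump, and convert extra clocks into performance differently:

```python
# Why "power scales quadratically with voltage" alone doesn't settle the
# overclocking-efficiency argument: the voltage bump required and the
# performance realized per extra MHz both vary per GPU design.
# Every number here is invented for illustration.

def perf_per_watt(perf, volts, freq):
    # Dynamic power ~ V^2 * f; the common constant cancels out of ratios.
    return perf / (volts ** 2 * freq)

def oc_efficiency_loss(scaling, v_bump, f_gain):
    """Fractional perf/W lost going from stock (normalized 1.0 V, 1.0 f)
    to an OC. scaling: fraction of clock gain realized as performance."""
    base = perf_per_watt(1.0, 1.0, 1.0)
    perf = 1.0 + scaling * f_gain
    oc = perf_per_watt(perf, 1.0 + v_bump, 1.0 + f_gain)
    return 1 - oc / base

# GPU A needs +8% voltage for +5% clock and scales at 70%;
# GPU B needs only +3% voltage and scales at 90%.
loss_a = oc_efficiency_loss(0.7, 0.08, 0.05)
loss_b = oc_efficiency_loss(0.9, 0.03, 0.05)

print(f"A loses {loss_a:.1%} perf/W, B loses {loss_b:.1%}")
```

Same quadratic law in both cases, very different efficiency hit - which is the point about the multiple variables involved.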
 
Joined
Sep 9, 2015
Messages
287 (0.09/day)
Man, you got really excited lately. Nvidia hurt you in any way or what?

First of all: the power scales quadratically with voltage for an isolated, ideal CMOS transistor. There will be a variance in actual applications, because transistors are not the only things graphics cards are made of.

Second: he said "overclocking". And overclocking means working on performance vs frequency.
So there are multiple variables involved, most importantly:
- GPUs may need different voltage bumps to achieve the same gains in clocks
- GPUs may get different performance gains from the extra frequency

And actually AFAIR Nvidia and AMD have never shared the same TSMC node. The last common node design was Samsung 14nm - and even that one ended up different because AMD chips were made by GF and they somehow managed to make it worse.

What? Clearly what I said flew over your head; you managed to miss every single point, and now you are trying to give lessons with copy-pasta definitions you googled. Bravo and good luck.
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
What? Clearly what I said flew over your head; you managed to miss every single point, and now you are trying to give lessons with copy-pasta definitions you googled. Bravo and good luck.
What definitions? :eek:
 
Joined
Mar 10, 2014
Messages
1,793 (0.46/day)
Power scales quadratically with voltage - Nvidia loses efficiency just as much when you apply any extra volts. There is no magical Nvidia process that is better - it's all TSMC. The fact that they only give you 1 or 2 additional tiny voltage steps is exactly for that reason. Seems you don't understand this.

Heh, apply volts with Nvidia - like you could even do that anymore. Well, one could do a shunt mod, but even with that, cooling needed to be sub-zero to benefit from higher voltages.

German tech site ComputerBase did an undervolting test with 2070S and Navi chips; I find their findings quite surprising.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,798 (3.71/day)
Joined
Nov 11, 2016
Messages
3,399 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11

y0y0

New Member
Joined
Jan 26, 2019
Messages
16 (0.01/day)
Why did you multiply Surge 2 fps on Nvidia cards by 2?

This guy never even reached 100 fps, let alone a 134 average.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,798 (3.71/day)
Different test scene, I'm testing in a relatively light area
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,798 (3.71/day)
Then how is the new 1660 Super 2x slower than the 1660?
Driver bug, it seems - mentioned in the text on the Surge 2 pages in all 1660S reviews.
 

y0y0

New Member
Joined
Jan 26, 2019
Messages
16 (0.01/day)
Or there is no driver bug; it's just old numbers that are fake - benchmarking a "light scene" @ 140 fps when in reality people will get 1/2 of that.

Different scenes, but still can't see 140 fps.

Maybe here? Nah, 70-100 fps instead of 170. Yeah, it's a 6700K, but you won't get DOUBLE the fps with a 9900K @ 1440p.

2060 + 8700 - eh, still only seeing 1/2 of your fps.

OK, OK, maybe here - 2080S + 8700K at 4K? Nope, again only 1/2.

Idk what scene you used, if any, but your scene is obviously misleading.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,798 (3.71/day)
Interesting - you seem to be correct that there is something wrong with those benchmarks; I'll investigate.
 

stmal

New Member
Joined
Oct 30, 2019
Messages
2 (0.00/day)
Hi, I'm new here. I registered to highlight my concern with the review. Is there an error in the performance-per-dollar chart? The 2080 and the 2080 Super have similar prices, but the chart shows that the 2080 has better performance per dollar than the 2080 Super.
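For what it's worth, that chart's metric is just relative performance divided by price, so a small price gap can outweigh a small performance gap. A toy calculation (prices and performance figures invented, not taken from the review) showing how a slightly slower card can still rank higher:

```python
# Performance per dollar = relative performance / price.
# Hypothetical numbers, only to show how the metric can rank a slower
# card above a faster one.

def perf_per_dollar(rel_perf, price_usd):
    return rel_perf / price_usd

card_2080  = perf_per_dollar(100, 699)  # slower but cheaper
card_2080s = perf_per_dollar(104, 749)  # ~4% faster, ~7% pricier

print(card_2080 > card_2080s)  # the cheaper card wins perf/$
```

So similar prices alone don't decide the ranking; the exact perf and price ratios do.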
 