
MSI GeForce GTX 1660 Ti Gaming X 6 GB

Joined
Feb 23, 2008
Messages
1,064 (0.17/day)
Location
Montreal
System Name Aryzen / Sairikiki / Tesseract
Processor 5800x / i7 920@3.73 / 5800x
Motherboard Steel Legend B450M / GB EX58-UDP4 / Steel Legend B550M
Cooling Mugen 5 / Pure Rock / Glacier One 240
Memory Corsair Something 16 / Corsair Something 12 / G.Skill 32
Video Card(s) AMD 6800XT / AMD 6750XT / Sapphire 7800XT
Storage Way too many drives...
Display(s) LG 332GP850-B / Sony w800b / Sony X90J
Case EVOLV X / Carbide 540 / Carbide 280x
Audio Device(s) SB ZxR + GSP 500 / board / Denon X1700h + ELAC Uni-Fi 2 + Senn 6XX
Power Supply Seasonic PRIME GX-750 / Corsair HX750 / Seasonic Focus PX-650
Mouse G700 / none / G602
Keyboard G910
Software w11 64
Benchmark Scores I don't play benchmarks...
Wow, somehow I missed that Civ6 supports DX12. Was that added in a later patch? Will use DX12 starting next rebench
Hey W1zzard, would it be feasible to have a separate performance graph for DX12 games only? Might I also suggest some kind of per-game indication (a small icon or something) to show whether a given title is AMD or nVidia sponsored?
 
Joined
Jun 16, 2016
Messages
409 (0.13/day)
System Name Baxter
Processor Intel i7-5775C @ 4.2 GHz 1.35 V
Motherboard ASRock Z97-E ITX/AC
Cooling Scythe Big Shuriken 3 with Noctua NF-A12 fan
Memory 16 GB 2400 MHz CL11 HyperX Savage DDR3
Video Card(s) EVGA RTX 2070 Super Black @ 1950 MHz
Storage 1 TB Sabrent Rocket 2242 NVMe SSD (boot), 500 GB Samsung 850 EVO, and 4TB Toshiba X300 7200 RPM HDD
Display(s) Vizio P65-F1 4KTV (4k60 with HDR or 1080p120)
Case Raijintek Ophion
Audio Device(s) HDMI PCM 5.1, Vizio 5.1 surround sound
Power Supply Corsair SF600 Platinum 600 W SFX PSU
Mouse Logitech MX Master 2S
Keyboard Logitech G613 and Microsoft Media Keyboard
This card is incredible, we are now getting 980 Ti performance at 960 prices. I can't believe how quickly the mid range cards became capable at 1440p and even 4k. You could easily use this card to match an Xbox One X.

Hey W1zzard, would it be feasible to have a separate performance graph for DX12 games only? Might I also suggest some kind of per-game indication (a small icon or something) to show whether a given title is AMD or nVidia sponsored?

What difference does it make if a game is Nvidia or AMD sponsored? It's not like people are going to pick what games to play based on whose logo shows up in the splash screen. And no amount of fudging the numbers will save the Vega 56 or the 590; this card is better in every way.

EDIT: Yes, I meant Xbox One X.
 
Last edited:
Joined
Dec 22, 2011
Messages
3,890 (0.83/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
This card is incredible, we are now getting 980 Ti performance at 960 prices. I can't believe how quickly the mid range cards became capable at 1440p and even 4k. You could easily use this card to match an Xbox One S.

What difference does it make if a game is Nvidia or AMD sponsored? It's not like people are going to pick what games to play based on whose logo shows up in the splash screen. And no amount of fudging the numbers will save the Vega 56 or the 590; this card is better in every way.

Yeah it's pretty brutal, 2.5 times more efficient than the RX 590, whilst being way faster. If you'd just landed from another planet you'd think the 590 was released three years ago... not 3 months ago. Ouch.
 
Joined
Jun 28, 2016
Messages
3,595 (1.18/day)
People might call me a troll but lack of RTX and DLSS is hardly a negative at this current juncture of all junctures.

The real negative is when you turn RTX/DLSS on and it eats more than half your frames while looking almost identical to ultra quality settings at 1440p. All that extra money spent on an RTX card when all you needed was to run at 1440p or 4k.
You do understand RTX is not meant to make the picture look better, right? It makes the picture look more realistic.
It does seem like people don't get what RTRT is doing after all these reviews and articles we got since RTX launch.
 
Joined
Sep 17, 2014
Messages
22,282 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I'm not sure why people are so excited.

This is a 1070 with 2GB of VRAM removed, over 2 years after its release, at a minor price cut. Also, Turing is showing itself to be very inconsistent compared to Pascal. Even the 2060 is jumping all over the place, and this 1660 Ti lands anywhere between a 1060 and a 1070 Ti... I'd grab a 1070 over this any day of the week...

The 1070 launched at about $350... not sure why we are all excited here. I see 1660 Tis at the very same price even today.

You do understand RTX is not meant to make the picture look better, right? It makes the picture look more realistic.
It does seem like people don't get what RTRT is doing after all these reviews and articles we got since RTX launch.

People don't get it because the implementation is different in every game. BFV did reflections, Metro mostly just does a global illumination pass. And that is pretty much all she wrote up until today.
 
Last edited:
Joined
Mar 18, 2015
Messages
2,963 (0.84/day)
Location
Long Island
People might call me a troll, but lack of RTX and DLSS is hardly a negative at this current juncture of all junctures.

The real negative is when you turn RTX/DLSS on and it eats more than half your frames while looking almost identical to ultra quality settings at 1440p. All that extra money spent on an RTX card when all you needed was to run at 1440p or 4k.

Let's look at this objectively and without exaggerating the performance hit (30-50%, with the biggest hit at the highest resolutions).

At 4k, we are talking about 1.5% of the market, 95% of which is at 60 Hz. While I can understand the position "I'd rather not lose 30-40% of my performance to RT", half the games in TPU's test suite are doing 90 fps or better at 4k... So say two brothers have the same system except for the graphics card.

The nVidia 2080 Ti user will have 90% of his games capped at 60 fps due to monitor limitations, but for 10 of those games he can turn on RT with little or no penalty, still staying above his monitor's limit. The AMD Radeon VII user will have somewhat fewer of his games capped at 60 fps, because the Ti is 40% faster. So the point is... we shouldn't base what card to buy on someone who can take advantage of things many can't.

At 1440p, adding up all the fps for the games in TPU's test suite, the $700 RTX 2080 is 13.3% faster than the $700 Radeon VII. With both cards overclocked, that grows to 21.8%. So what's the downside of RT? Let's look at the options here... $700 Radeon VII OC'd versus $700 RTX 2080 OC'd.

Now, at this point in time there aren't a lot of games that support it, so this is purely conjectural: we must assume that at some point a percentage of games (say 20-30%, for the sake of argument) will add support. From the Metro article's conclusions, Wiz puts the expected hit at 30-40% once it's out a bit and tweaked; I'll use 35%. So let's assume, for example, that some developers release updated versions of their games, and I'll pick 25% of the games on the list... numbers are 1440p with both cards overclocked.

Divinity OS2 could be played at 119.0 fps on an OC'd VII, 153.1 on a 2080 or 99.5 on a 2080 w/ RT enabled.
F1 could be played at 133.3 fps on an OC'd VII, 157.9 on a 2080 or 102.7 on a 2080 w/ RT enabled.
GTAV could be played at 150.9 fps on an OC'd VII, 184.3 on a 2080 or 119.8 on a 2080 w/ RT enabled.
Witcher 3 could be played at 103.2 fps on an OC'd VII, 127.0 on a 2080 or 82.5 on a 2080 w/ RT enabled.
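To make that arithmetic checkable, here's a quick sketch; the flat 35% RT hit is the working assumption stated above, not a measured per-game figure:

```python
# Estimate RT-on frame rates from the OC'd RTX 2080 numbers above,
# assuming a flat 35% performance hit when RT is enabled.
RT_HIT = 0.35

oc_2080_fps = {
    "Divinity OS2": 153.1,
    "F1": 157.9,
    "GTAV": 184.3,
    "Witcher 3": 127.0,
}

for game, fps in oc_2080_fps.items():
    rt_fps = fps * (1 - RT_HIT)
    print(f"{game}: {fps:.1f} fps OC'd -> ~{rt_fps:.1f} fps with RT")
```

The results land within a tenth of a frame of the RT-enabled numbers listed above.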

Now think of that from the perspective of folks playing on 65, 100, or 120 Hz monitors. It's an option and it doesn't cost you a dime. Now if ya built ya computer so you can brag about how many fps you get, fine. But if you are looking at it from the perspective of the gaming experience, frankly I don't think I'd care whether it was on or off for 3 of those games. I'd take the extra 20-30 fps and enjoy ULMB. However, on Witcher 3, if it came down to playing on a Radeon VII at 103.2 versus having the choice to play at 127.0 with ULMB or 82.5 with RT and ULMB, I'd like to experience the latter.

So again, let's look at the options here .... $700 Radeon VII OC'd versus $700 RTX 2080 OC'd

1. Of the 21 games in the test suite, only 1 game is under 80 fps with the 2080 (3 for the Radeon VII), which means turning any kind of -Sync off and using Motion Blur Reduction is an option ONLY on the 2080.
2. Of the 21 games in the test suite, with both cards overclocked, the 2080 is faster in 19 of them, the Radeon VII in 2 of them.
3. Overall, the 2080 is 22% faster with both cards OC'd.
4. So far... is RT even a factor in the decision here?
5. No one is mandating you to use it... what is the downside?

It's like going down to buy a new SUV .. and the salesman says, hey ya know what ... "I can sell ya the 2WD model you came here for ... but for the same price I can give ya the RT model and it comes with 4WD, larger more efficient engine means it accelerates faster and uses less gas, comes standard with AC, and runs cooler" and turning it down cause the carpeting in the trunk is red instead of green.

It was not so long ago that I was saying "the 780 OC'd is faster than AMD's offering OC'd, but below that, weigh your options." Then it was, "Well, from the 970 price point on up, nVidia has the edge, but below that look at both cards in each price niche...". And then it was, "Well, from the xx60 price point on up...". Saying it isn't so doesn't change the numbers.

In short, at the $700 price point, like at the $300 price point, AMD doesn't really have a horse in the race. Not having RT at this price point is not a deal killer, because no one else has it either. But it does have ULMB, which is certainly a significant incentive. And at the upper price tiers, providing the option is an incentive; not as much as ULMB, but any tech that gives the user different options to enhance the user experience is a good thing.
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.10/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
You could easily use this card to match an Xbox One S.

You mean Xbox One X?
The Xbox One S is so damn weak, a crappy GTX 750 Ti can match it.
 

GloryToYou

New Member
Joined
Jan 4, 2019
Messages
18 (0.01/day)
Ugh. yeap, ofc, great value!

P.S. It's just another poorly priced card at over $300 - I think people here forget about IRL pricing, just as always. The 2080 Ti should've been $999 as well....
The 1660 Ti is already $450 here in local stores.
The cards are on Newegg right now, and listed in stock. This isn't some paper launch. Most of them are at the $280 list price.
https://www.newegg.com/GraphicsCardsPromoStore/EventSaleStore/ID-506

Newegg is listing this gaming X at $10 over its $300 MSRP, but the rest are pretty much in line.
 
Last edited:
Joined
Jun 28, 2018
Messages
299 (0.13/day)
Yeah it's pretty brutal, 2.5 times more efficent than the RX 590, whilst being way faster. If you'd just landed from another planet you'd think the 590 was released three years ago... not 3 months ego. Ouch.

From Anandtech review:

Performance has gone up versus the GTX 1060 6GB, but card power consumption hasn't. Thanks to this, the GTX 1660 Ti is not just 36% faster, it's 36% more efficient as well.
The other Turing cards have seen their own efficiency gains as well, but with their TDPs all drifting up, this is the largest (and purest) efficiency gain we’ve seen to date, and probably the best metric thus far for evaluating Turing’s power efficiency against Pascal’s.

https://www.anandtech.com/show/13973/nvidia-gtx-1660-ti-review-feat-evga-xc-gaming/16

These are quite impressive numbers, considering that we are talking about GPUs built on basically the same node.
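The quoted logic is just the perf-per-watt identity: if performance rises 36% at unchanged power, efficiency rises 36% too. A trivial sketch, with the 1.36 performance ratio taken from the quote and equal power draw as the stated assumption:

```python
def efficiency_gain(perf_ratio: float, power_ratio: float) -> float:
    """Relative perf/W gain; 0.36 means +36% efficiency."""
    return perf_ratio / power_ratio - 1.0

# GTX 1660 Ti vs GTX 1060 6GB per the quote: +36% performance, same power.
gain = efficiency_gain(1.36, 1.00)
print(f"{gain:.0%}")  # prints 36%
```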
 
Joined
Mar 18, 2015
Messages
2,963 (0.84/day)
Location
Long Island
Except VRAM.

Anyone who buys this card to play at 2160p, where more than 6GB is needed, is making a mistake.

https://www.techpowerup.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/26.html

The 1060 6 GB is faster than the 3 GB because it has 11% more shaders. If VRAM were in play here in any conceivable way, as you imply, we should see a massive hit on performance going from 1080p to 1440p. Instead, what do we see?

The 1060 6GB is 6% faster than the 1060 3GB at 1080p. Obviously, if your presumption were correct, this should be what?... 10%, 12% at 1440p? Nope, just the same 6% as at 1080p. The extra 3 GB had no impact on performance.

What else do we see? The 3 GB 1060 is as fast as the 8 GB 480.
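The argument reduces to a one-line check: if VRAM were the bottleneck, the 6 GB card's lead would widen at the higher resolution. A sketch (the 1% noise allowance is my own arbitrary choice):

```python
def lead_widens(gap_lo_res: float, gap_hi_res: float, noise: float = 0.01) -> bool:
    """True if the faster card's lead grows at the higher resolution,
    which is what a VRAM bottleneck on the smaller card would produce."""
    return gap_hi_res > gap_lo_res + noise

# Figures from the linked review: +6% at 1080p and still +6% at 1440p.
print(lead_widens(0.06, 0.06))  # prints False -> no sign of a VRAM bottleneck
```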


Im not sure why people are so excited. The 1070 launched at about $350... not sure why we are all excited here. I see 1660ti's at the very same price even today.

The MSI Gaming X 1070 was $449 MSRP at launch here in the US... it took a while for prices to settle down to that level though. Bought one for my son; the receipt says $429 (5 months after release). The 970 was about $350. The MSI 1660 Ti is $310 today, the Ventus is $279. So $449 to $309 is significant... and that $309 includes tariff and 1st-day pricing. The MSI Gaming X cards are usually the 1st ones out of stock...

The 2080 Gaming X has been running about $100 over the least expensive cards; the 2080 Ti Gaming X's are $250 or more higher than competitive offerings.
 
Joined
Mar 18, 2008
Messages
5,717 (0.94/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Hey mighty @W1zzard, does this GPU also support the new Nvidia OC Scanner overclocking?
 
Joined
Jan 19, 2018
Messages
184 (0.07/day)
Processor AMD 5800X
Motherboard MSI X570 Tomahawk
Memory G.Skill 32GB
Software Windows 10
Looking comparatively.

Performance (1440p): Vega 56 has a 3% edge in the "out of the box" performance test, but the 1660 Ti gains 9.6% overclocked; Vega doesn't do well here. Edge => 1660 Ti... at 1080p even more so.
Power Usage: 141 W under peak gaming for the Ti, 237 W for Vega 56. Edge (59%) => 1660 Ti
Noise @ Idle: 0 dBA for the 1660 Ti, 25 for the Vega 56. Edge (Infinity) => 1660 Ti
Noise @ Load: 32 dBA for the 1660 Ti, 42 for the Vega 56, making it twice as loud. Edge (50%) => 1660 Ti
Temps @ Load: 68°C for the 1660 Ti, 75 for the Vega 56. Edge (91%) => 1660 Ti
Price (Newegg MSI Gaming X): $310 for the 1660 Ti, $400 for the Vega 56. Edge (77%) => 1660 Ti

The Vega 56 just ceased to be relevant in any way. To get sold, it needs a 35% price drop, to $259.
No. It didn't. For whatever reason, TPU seems to insist on using reference cards as the benchmark, which is asinine when doing a comparison like this. This isn't a reference card, so why would you compare it to other reference cards? Your conclusion, and TPU's conclusion, about Vega 56 running louder and hotter is, quite frankly, incorrect when doing a more apples-to-apples comparison. Noise at idle for my Red Dragon Vega 56 is also 0. Noise at load is extremely quiet (inaudible from where I sit). Temps at load are virtually identical to the 1660 Ti's. As for performance, you can see that in GN's review (starting with the F1 benchmarks, as Steve also uses a Red Dragon).

The only real advantage of the 1660Ti over Vega 56 is power usage. But honestly, and this has been said many times over, most of us aren't gaming 24 hours a day. As far as I'm concerned, for real-world usage, the difference is inconsequential. Of course, that's assuming Vega 56 does indeed drop in price across the board. For the same price, I'd opt for my card over any 1660Ti all day, every day. But if Vega 56 stays $400+ then there's no way to recommend it (I got mine for just over $300 about 6 months ago). I certainly wouldn't recommend a reference Vega 56. Ever.
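As an aside on the quoted list: the "Edge" percentages there are simply the 1660 Ti figure divided by the Vega 56 figure. A quick sketch reproducing two of them, with the numbers copied from the quote:

```python
def edge_pct(ti_value: float, vega_value: float) -> str:
    """Express the 1660 Ti figure as a percentage of the Vega 56 figure."""
    return f"{100 * ti_value / vega_value:.0f}%"

print(edge_pct(141, 237))  # power draw in W  -> 59%
print(edge_pct(68, 75))    # load temp in °C  -> 91%
```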
 
Joined
Dec 27, 2013
Messages
887 (0.22/day)
Location
somewhere
Anyone who buys this card to play at 2160p, where more than 6GB is needed, is making a mistake.

https://www.techpowerup.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/26.html

The 1060 6 GB is faster than the 3 GB because it has 11% more shaders. If VRAM were in play here in any conceivable way, as you imply, we should see a massive hit on performance going from 1080p to 1440p. Instead, what do we see?

The 1060 6GB is 6% faster than the 1060 3GB at 1080p. Obviously, if your presumption were correct, this should be what?... 10%, 12% at 1440p? Nope, just the same 6% as at 1080p. The extra 3 GB had no impact on performance.

What else do we see? The 3 GB 1060 is as fast as the 8 GB 480.

High-res texture packs. My 570 has no trouble running them without stutter, something the 1660 Ti will struggle with even now. I'm so sick of people lapping up getting less memory, whatever. I ain't buying a card with less than 8GB of VRAM because I'm using almost all of that VRAM right now.
 
Joined
Mar 18, 2008
Messages
5,717 (0.94/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
High-res texture packs. My 570 has no trouble running them without stutter, something the 1660 Ti will struggle with even now. I'm so sick of people lapping up getting less memory, whatever. I ain't buying a card with less than 8GB of VRAM because I'm using almost all of that VRAM right now.

Nvidia traditionally has better VRAM compression technology. I would say that the 2GB difference is negligible if it's just HD textures at 1080p.
 
Joined
Dec 22, 2011
Messages
3,890 (0.83/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
The only real advantage of the 1660Ti over Vega 56 is power usage. But honestly, and this has been said many times over, most of us aren't gaming 24 hours a day. As far as I'm concerned, for real-world usage, the difference is inconsequential. Of course, that's assuming Vega 56 does indeed drop in price across the board. For the same price, I'd opt for my card over any 1660Ti all day, every day. But if Vega 56 stays $400+ then there's no way to recommend it (I got mine for just over $300 about 6 months ago). I certainly wouldn't recommend a reference Vega 56. Ever.

To think, there was a time when AMD would actually waste time and money mocking Nvidia when that was their "only real advantage". Oh, how times have changed.

 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,972 (2.35/day)
Location
Louisiana
Processor Core i9-9900k
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax ETS-T50 Black CPU cooler
Memory 32GB (2x16) Mushkin Redline DDR-4 3200
Video Card(s) ASUS RTX 4070 Ti Super OC 16GB
Storage 1x 1TB MX500 (OS); 2x 6TB WD Black; 1x 2TB MX500; 1x 1TB BX500 SSD; 1x 6TB WD Blue storage (eSATA)
Display(s) Infievo 27" 165Hz @ 2560 x 1440
Case Fractal Design Define R4 Black -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic Focus GX-1000 Gold
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
For whatever reason, TPU seems to insist on using reference cards as the benchmark, which is asinine when doing a comparison like this.
Let me school you, since your "asinine" comment about the testing here shows you have not been frequenting TPU reviews very long.

As with most reviewers, most cards are not W1zzard’s to keep. Since he has one of the most extensive testing suites in the industry, including both games and multiple previously released cards, he keeps a baseline for comparisons in the future using reference cards.

It’s up to you to use a little common sense and review knowledge of where cards lie on the stack to extrapolate where a particular card you are interested in would lie.

Please be aware, W1zzard already retests every card he keeps in the stable, on each and every game in the testing suite, every time he tests a new card for review. He already gives literally months of his life to the testing lab each year.

For you to think he should keep a high count of additional cards in stock and test those as well is rude, self-centered, and quite frankly, to use your term, asinine.
 
Joined
Jun 1, 2011
Messages
4,562 (0.93/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
I'm going to get one to see if it can play Crysis....
 
Joined
Jan 19, 2018
Messages
184 (0.07/day)
Processor AMD 5800X
Motherboard MSI X570 Tomahawk
Memory G.Skill 32GB
Software Windows 10
Let me school you, since your "asinine" comment about the testing here shows you have not been frequenting TPU reviews very long.

As with most reviewers, most cards are not W1zzard’s to keep. Since he has one of the most extensive testing suites in the industry, including both games and multiple previously released cards, he keeps a baseline for comparisons in the future using reference cards.

It’s up to you to use a little common sense and review knowledge of where cards lie on the stack to extrapolate where a particular card you are interested in would lie.

Please be aware, W1zzard already retests every card he keeps in the stable, on each and every game in the testing suite, every time he tests a new card for review. He already gives literally months of his life to the testing lab each year.

For you to think he should keep a high count of additional cards in stock and test those as well is rude, self-centered, and quite frankly, to use your term, asinine.
Actually I've been reading TPU for years, I simply started commenting recently.

I know how cards are tested. The problem is, if you're comparing a reference card to non-reference models, then you're going to come up with, at best, inaccurate or incomplete conclusions (such as "Vega 56 runs much hotter and noisier than the GTX 1660 Ti", which is both true and untrue depending on what you're comparing). I have nothing against Wizz including reference cards in the benchmark suite (I think it is quite helpful, actually), but there should also be a non-reference card included for comparison if you're going to include those observations in the conclusion. Without it, you're basing your conclusion on incomplete data, which does a disservice to the reader. I know that this has been discussed here before and that, most likely, nothing is going to change, so I am going to stand by my "asinine" assertion.
 
Joined
Mar 18, 2008
Messages
5,717 (0.94/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Joined
Dec 22, 2011
Messages
3,890 (0.83/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Get whatever makes you happy. Be it Vega or Polaris or this.

Indeed, I mean according to this very review the MSI GTX 1660 Ti is only 42% faster than an RX 580 8GB @ 1080p whilst being twice as efficient too, I guess I can see the appeal of an 8GB Polaris card over it...
 
Joined
Mar 10, 2015
Messages
3,984 (1.13/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
The power efficiency of this card is tremendous. Just imagine what Turing on 7nm could be.

I would want them to crank the clocks up.

If you'd just landed from another planet you'd think the 590 was released three years ago.

In all honesty, it was probably released longer ago than that. When was the 480 released?
 
Joined
Mar 18, 2008
Messages
5,717 (0.94/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Indeed, I mean according to this very review the MSI GTX 1660 Ti is only 42% faster than an RX 580 8GB @ 1080p whilst being twice as efficient too, I guess I can see the appeal of an 8GB Polaris card over it...

Dude, you know all too well some people buy GPUs with their hearts and emotions, right? For some folks it is a statement of "no" to "evil business practices". Or out of true love for a specific brand. We gotta respect that, man!

I would want them to crank the clocks up.



In all honesty, it was probably released longer ago than that. When was the 480 released?

And they probably will crank the clocks even higher.
 