
MSI GeForce GTX 1660 Ti Gaming X 6 GB

Joined
Dec 22, 2011
Messages
3,890 (0.83/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600MHz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Joined
Mar 10, 2015
Messages
3,984 (1.13/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600MHz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
Joined
Dec 22, 2011
Messages
3,890 (0.83/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600MHz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Third time's a charm?

Heh, maybe. Polaris needs 7nm to get even remotely close in terms of efficiency. Maybe it's best they take both Polaris and Vega out back and shoot them dead.

Hence Navi, sometime later this year... who knows when.
 
Joined
Mar 18, 2008
Messages
5,717 (0.94/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Heh, maybe. Polaris needs 7nm to get even remotely close in terms of efficiency. Maybe it's best they take both Polaris and Vega out back and shoot them dead.

Hence Navi, sometime later this year... who knows when.


What if Navi is just Polaris on 7nm with minor tweaks?
 
Joined
May 7, 2014
Messages
59 (0.02/day)
Who said that AMD is a charity? Doesn't compete with the RTX 2080? Have you checked reviews? Techspot measured a 7% difference between them based on a 33-game average. And it has twice the amount of expensive HBM2 compared to the RTX 2080's 8 GB of GDDR6. Nearly half of the cost of the Radeon VII comes from the VRAM. At 7:32:

The Radeon VII isn't a bad card, but it's a bit overpriced, power-hungry, noisy, and will possibly continue having driver issues in these first months after release. There are simply better options out there.
For pure gaming the RTX 2080 is a better card overall.
 
Joined
Jul 24, 2009
Messages
1,002 (0.18/day)
The Radeon VII isn't a bad card, but it's a bit overpriced, power-hungry, noisy, and will possibly continue having driver issues in these first months after release. There are simply better options out there.
For pure gaming the RTX 2080 is a better card overall.

The RVII has the small advantage of being the best compute card for the money. The only thing that beats it (in FP64) is the Titan V, which is considerably more expensive.
Sadly that doesn't translate into gaming somehow (dunno why, actually; on paper it's great). The Titan V is also a compute card (it's just a pro card turned gaming, much like the RVII), but it's also a great gaming card.
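
For anyone curious where the "great on paper" claim comes from, here is a back-of-the-envelope peak-throughput sketch. The boost clocks are approximate and the FP64:FP32 ratios are the commonly quoted ones for these chips, so treat the output as ballpark figures only:

```python
# Rough peak-throughput math behind the "great on paper" FP64 claim.
# Boost clocks are approximate and the FP64:FP32 ratios are the commonly
# quoted ones for these chips; treat the output as ballpark figures only.

def peak_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak = shaders x clock x 2 FLOPs per clock (fused multiply-add)."""
    return shaders * clock_ghz * 2 / 1000.0

cards = {
    #              shaders, ~boost GHz, FP64 rate vs FP32
    "Radeon VII": (3840, 1.75, 1 / 4),
    "Titan V":    (5120, 1.46, 1 / 2),
    "RTX 2080":   (2944, 1.71, 1 / 32),
}

for name, (shaders, clock, fp64_ratio) in cards.items():
    fp32 = peak_tflops(shaders, clock)
    print(f"{name:10}  FP32 ~{fp32:4.1f} TFLOPS   FP64 ~{fp32 * fp64_ratio:4.2f} TFLOPS")
```

Peak numbers like these only say what the shaders could do in an ideal loop; they say nothing about how a game actually feeds them, which is part of why the compute advantage doesn't show up in frame rates.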
 
Joined
Mar 18, 2008
Messages
5,717 (0.94/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
The RVII has the small advantage of being the best compute card for the money. The only thing that beats it (in FP64) is the Titan V, which is considerably more expensive.
Sadly that doesn't translate into gaming somehow (dunno why, actually; on paper it's great). The Titan V is also a compute card (it's just a pro card turned gaming, much like the RVII), but it's also a great gaming card.
That compute part means nothing except to miners. In scientific computing, CUDA and TensorFlow dominate OpenCL, and the Radeon VII has poor support for both.
 
Joined
Dec 10, 2015
Messages
545 (0.17/day)
Location
Here
System Name Skypas
Processor Intel Core i7-6700
Motherboard Asus H170 Pro Gaming
Cooling Cooler Master Hyper 212X Turbo
Memory Corsair Vengeance LPX 16GB
Video Card(s) MSI GTX 1060 Gaming X 6GB
Storage Corsair Neutron GTX 120GB + WD Blue 1TB
Display(s) LG 22EA63V
Case Corsair Carbide 400Q
Power Supply Seasonic SS-460FL2 w/ Deepcool XFan 120
Mouse Logitech B100
Keyboard Corsair Vengeance K70
Software Windows 10 Pro (to be replaced by 2025)
I like this card. Now I just need Nvidia to overstock this GPU and wait until market demand slows down so I can get one at a lower price :laugh:
 
Joined
Jun 28, 2016
Messages
3,595 (1.18/day)
I'm not sure why people are so excited.

This is a 1070 with 2GB of VRAM removed, over two years past its release, at a minor price cut. Also, Turing is not showing itself to be very consistent compared to Pascal. Even the 2060 is jumping all over the place, and this 1660 Ti lands anywhere between a 1060 and a 1070 Ti... I'd grab a 1070 over this any day of the week...

The 1070 launched at about $350... not sure why we are all excited here. I see 1660 Tis at that very same price even today.
Simple: because Nvidia managed to improve efficiency even further.
At the same node it's slightly more powerful and slightly less power-hungry, so the emitted heat had to drop significantly. And this is the most important gain here.
This means the 1660 Ti is a chip that can be paired with a small cooler. In fact, most companies that announced their lineup have a compact ~18 cm variant, and the resulting cards are cool and quiet.
Some companies tried their luck with compact 1070 cards, but with mixed success (and none was as quiet as this one).
Think about what this implies for mobile variants. :)

As for the price: you can't expect Nvidia to adjust their pricing to the deals we get from stores on the older product. It can't work that way.
The 1660 Ti launched at $280, so based on MSRP it's actually way closer to the 1060 6GB ($250) than to the 1070 ($380). A theoretical non-Ti 1660 would cost as much as the 1060 did. So yes, the 1660 is replacing the 1060. And yes, it's faster and more efficient than the 1070 - a card from a higher segment.
People don't get it because the implementation is different in every game. BFV did reflections; Metro mostly just does a global illumination pass. And that is pretty much all she wrote up until today.
As I said: the current RTX implementation and utilization are far from what is possible. But that's not a reason to criticize the technology. We'll slowly get there, but we need time and chip performance.

Remember, RTRT is not a gimmick. It's not something Nvidia created as just another feature. It's not something we could backtrack from because we don't like the current results.
Ray tracing is how photorealistic renders are made, and ever since we got into 3D gaming we've known this is how games are going to be rendered in the future.
Now, we could stay on the curve of GPGPU potential and get gaming cards able to do RTRT in 10 years. Or we can use purpose-built RT ASICs and get this tech now.

And once again: RTRT doesn't mean games will look more pleasing. The exact opposite is more likely.
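
To make "purpose-built RT ASIC" a bit more concrete: the operation that RT hardware accelerates is the ray/geometry intersection test. Here is a toy sketch of that math - a single ray against a single sphere, purely illustrative and nothing like Nvidia's actual BVH/triangle hardware:

```python
# Toy illustration of the kind of work RTRT hardware accelerates: deciding
# whether a ray hits a piece of geometry. Real GPUs test rays against triangles
# and bounding-volume hierarchies, not lone spheres; this just shows the idea.
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return the distance to the nearest hit, or None if the ray misses."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c          # discriminant of the quadratic
    if disc < 0.0:
        return None                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0.0 else None        # a hit behind the camera doesn't count

# One camera ray, one sphere 5 units ahead of the camera:
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```

A game shoots millions of rays per frame and tests each one against huge triangle hierarchies, which is why waiting for general-purpose shaders to brute-force this would take years of extra GPGPU scaling.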
 

FreedomEclipse

~Technological Technocrat~
Joined
Apr 20, 2007
Messages
23,987 (3.74/day)
Location
London,UK
System Name DarnGosh Edition
Processor AMD 7800X3D
Motherboard MSI X670E GAMING PLUS
Cooling Thermalright AM5 Contact Frame + Phantom Spirit 120SE
Memory G.Skill Trident Z5 NEO DDR5 6000 CL32-38-38-96
Video Card(s) Asus Dual Radeon™ RX 6700 XT OC Edition
Storage WD SN770 1TB (Boot)| 2x 2TB WD SN770 (Gaming)| 2x 2TB Crucial BX500| 2x 3TB Toshiba DT01ACA300
Display(s) LG GP850-B
Case Corsair 760T (White) {1xCorsair ML120 Pro|5xML140 Pro}
Audio Device(s) Yamaha RX-V573|Speakers: JBL Control One|Auna 300-CN|Wharfedale Diamond SW150
Power Supply Seasonic Focus GX-850 80+ GOLD
Mouse Logitech G502 X
Keyboard Duckyshine Dead LED(s) III
Software Windows 11 Home
Benchmark Scores ლ(ಠ益ಠ)ლ
Hmmm, this card and the Ventus are almost identical. That makes the Gaming X look a little over-engineered, but then again the much higher power limit probably results in better overclocking. On the Maximum Overclock Comparison chart, though, the differences were small... The cheaper Ventus does look to be the better buy here. I'm not too fussed about the fan not stopping at idle anyway, as I always run a custom fan curve and set the fans to 30%, where it's inaudible anyway. I did this with my 1070 Gaming X and my current 1080 Ti.

I think the main deal breaker here is whether one prefers the plastic or metal backplate. Lacking RTX/DLSS is also a thing, but then again, why the hell are you looking to buy a 1660 Ti if you want RTX/DLSS support?

Set a custom fan curve on the Ventus and save yourself $20.
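
For anyone who hasn't set one up: a custom fan curve is just a temperature-to-fan-speed mapping with interpolation between the points you drag around in a tool like Afterburner. A toy sketch below - the points are made up for illustration, pick your own:

```python
# A custom fan curve is a temperature -> fan-% mapping with linear
# interpolation between points. The points below are illustrative only.

CURVE = [(30, 30), (50, 30), (60, 45), (75, 70), (85, 100)]  # (temp °C, fan %)

def fan_percent(temp_c: float) -> float:
    """Piecewise-linear lookup of fan speed for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

for t in (35, 55, 65, 80, 90):
    print(f"{t} °C -> {fan_percent(t):.0f}% fan")
```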
 
Joined
Sep 17, 2014
Messages
22,282 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Simple: because Nvidia managed to improve efficiency even further.
At the same node it's slightly more powerful and slightly less power-hungry, so the emitted heat had to drop significantly. And this is the most important gain here.
This means the 1660 Ti is a chip that can be paired with a small cooler. In fact, most companies that announced their lineup have a compact ~18 cm variant, and the resulting cards are cool and quiet.
Some companies tried their luck with compact 1070 cards, but with mixed success (and none was as quiet as this one).
Think about what this implies for mobile variants. :)

As for the price: you can't expect Nvidia to adjust their pricing to the deals we get from stores on the older product. It can't work that way.
The 1660 Ti launched at $280, so based on MSRP it's actually way closer to the 1060 6GB ($250) than to the 1070 ($380). A theoretical non-Ti 1660 would cost as much as the 1060 did. So yes, the 1660 is replacing the 1060. And yes, it's faster and more efficient than the 1070 - a card from a higher segment.

As I said: the current RTX implementation and utilization are far from what is possible. But that's not a reason to criticize the technology. We'll slowly get there, but we need time and chip performance.

Remember, RTRT is not a gimmick. It's not something Nvidia created as just another feature. It's not something we could backtrack from because we don't like the current results.
Ray tracing is how photorealistic renders are made, and ever since we got into 3D gaming we've known this is how games are going to be rendered in the future.
Now, we could stay on the curve of GPGPU potential and get gaming cards able to do RTRT in 10 years. Or we can use purpose-built RT ASICs and get this tech now.

And once again: RTRT doesn't mean games will look more pleasing. The exact opposite is more likely.

Great points, thanks for opening my eyes there. As a desktop user my focus is always price/perf, but these are real pros for this card indeed.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.11/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
High-res texture packs. My 570 has no trouble running them without stutter, something the 1660 Ti will struggle with even now. I'm so sick of people lapping up getting less memory, whatever. I ain't buying a card with less than 8GB of VRAM because I'm using almost all of that VRAM right now.

I believe GamersNexus did a good video on the fallacy of VRAM usage, explaining how we really have no tool to measure how much VRAM a game is actually using. All the tools today simply show how much VRAM is allocated to the game - how much the game is requesting - but not how much it is actually using. That is why some games will use 10GB+ of VRAM if you have it, but still run just fine on cards with 6GB. They request all that VRAM, but don't actually use it.
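
To illustrate the point: even asking the driver directly only gets you allocations. A minimal sketch using NVML through the pynvml package (assuming an Nvidia card and that pynvml is installed) shows what monitoring tools actually have to work with:

```python
# Minimal sketch of what GPU monitoring tools can see, using NVML via pynvml
# (assumes an Nvidia GPU and pynvml installed). NVML reports memory that is
# *allocated* on the device - it has no idea how much of that allocation the
# game actively touches each frame.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM total {mem.total / 2**30:.1f} GiB, allocated {mem.used / 2**30:.1f} GiB")

# Per-process breakdown: again, these are allocations, not a working set.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = proc.usedGpuMemory
    if used is not None:
        print(f"pid {proc.pid}: {used / 2**20:.0f} MiB allocated")
    else:
        print(f"pid {proc.pid}: allocation size not reported")

pynvml.nvmlShutdown()
```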
 
Joined
Jun 28, 2016
Messages
3,595 (1.18/day)
Hmmm, this card and the Ventus are almost identical. That makes the Gaming X look a little over-engineered, but then again the much higher power limit probably results in better overclocking. On the Maximum Overclock Comparison chart, though, the differences were small... The cheaper Ventus does look to be the better buy here. I'm not too fussed about the fan not stopping at idle anyway, as I always run a custom fan curve and set the fans to 30%, where it's inaudible anyway. I did this with my 1070 Gaming X and my current 1080 Ti.

I think the main deal breaker here is whether one prefers the plastic or metal backplate. Lacking RTX/DLSS is also a thing, but then again, why the hell are you looking to buy a 1660 Ti if you want RTX/DLSS support?

Set a custom fan curve on the Ventus and save yourself $20.
I can't agree with 30% fans being inaudible. I mean, it's acceptable when you're using the PC (you accept some noise coming from the fans, the keyboard, etc.).
But there is always a hum, and you start to notice it when you're not using the PC: studying, reading a book, sleeping nearby, watching movies, etc.
My PC is in my bedroom. It is on almost all the time, but I play for maybe 2 hours a week. Idle GPU cooling was a must-have.

And I'm sure the Gaming X cooler is the perfect companion for this chip. It emits even less noise than the Ventus, and the fans stay idle under much higher load.
It's also the version of choice for people who may want to OC.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,972 (2.35/day)
Location
Louisiana
Processor Core i9-9900k
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax ETS-T50 Black CPU cooler
Memory 32GB (2x16) Mushkin Redline DDR-4 3200
Video Card(s) ASUS RTX 4070 Ti Super OC 16GB
Storage 1x 1TB MX500 (OS); 2x 6TB WD Black; 1x 2TB MX500; 1x 1TB BX500 SSD; 1x 6TB WD Blue storage (eSATA)
Display(s) Infievo 27" 165Hz @ 2560 x 1440
Case Fractal Design Define R4 Black -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic Focus GX-1000 Gold
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
The cheaper Ventus does look to be the better buy here. I'm not too fussed about the fan not stopping at idle anyway, as I always run a custom fan curve and set the fans to 30%, where it's inaudible anyway. I did this with my 1070 Gaming X and my current 1080 Ti.
I totally agree. I have mine on a custom curve starting at 28% and don't hear it when idling. I'd rather that than let it get up to 50 before any fans start running. Heck, I can't hear it unless I concentrate when it climbs to 55%.
 

FreedomEclipse

~Technological Technocrat~
Joined
Apr 20, 2007
Messages
23,987 (3.74/day)
Location
London,UK
System Name DarnGosh Edition
Processor AMD 7800X3D
Motherboard MSI X670E GAMING PLUS
Cooling Thermalright AM5 Contact Frame + Phantom Spirit 120SE
Memory G.Skill Trident Z5 NEO DDR5 6000 CL32-38-38-96
Video Card(s) Asus Dual Radeon™ RX 6700 XT OC Edition
Storage WD SN770 1TB (Boot)| 2x 2TB WD SN770 (Gaming)| 2x 2TB Crucial BX500| 2x 3TB Toshiba DT01ACA300
Display(s) LG GP850-B
Case Corsair 760T (White) {1xCorsair ML120 Pro|5xML140 Pro}
Audio Device(s) Yamaha RX-V573|Speakers: JBL Control One|Auna 300-CN|Wharfedale Diamond SW150
Power Supply Seasonic Focus GX-850 80+ GOLD
Mouse Logitech G502 X
Keyboard Duckyshine Dead LED(s) III
Software Windows 11 Home
Benchmark Scores ლ(ಠ益ಠ)ლ
I can't agree with 30% fans being inaudible. I mean, it's acceptable when you're using the PC (you accept some noise coming from the fans, the keyboard, etc.).
But there is always a hum, and you start to notice it when you're not using the PC: studying, reading a book, sleeping nearby, watching movies, etc.
My PC is in my bedroom. It is on almost all the time, but I play for maybe 2 hours a week. Idle GPU cooling was a must-have.

And I'm sure the Gaming X cooler is the perfect companion for this chip. It emits even less noise than the Ventus, and the fans stay idle under much higher load.
It's also the version of choice for people who may want to OC.

Each to their own. Sound in general, or how loud something is, is purely subjective, so something that is inaudible to me is probably a jet engine to you.

I don't constantly have my ear glued to my PC, and my PC is also in my bedroom, so I'm happy with how it sounds. It could be dead silent, but then again I'm not running a passive machine.

In any case, I use an aftermarket cooler with my 1080 Ti and it's inaudible till I start gaming.


My 30% was just an example. You can set it as low as 5% if that's what you wish.

You're free to disagree all you like, but it won't change people's hearing.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,548 (2.89/day)
Location
Jyväskylä, Finland
System Name 4K-gaming
Processor AMD Ryzen 7 5800X @ PBO +200 -20CO
Motherboard Asus ROG Crosshair VII Hero
Cooling Arctic Freezer 50, EKWB Vector TUF
Memory 32GB Kingston HyperX Fury DDR4-3200
Video Card(s) Asus GeForce RTX 3080 TUF OC 10GB
Storage A pack of SSDs totaling 3.2TB + 3TB HDDs
Display(s) 27" 4K120 IPS + 32" 4K60 IPS + 24" 1080p60
Case Corsair 4000D Airflow White
Audio Device(s) Asus TUF H3 Wireless / Corsair HS35
Power Supply EVGA Supernova G2 750W
Mouse Logitech MX518 + Asus TUF P1 mousepad
Keyboard Roccat Vulcan 121 AIMO
VR HMD Oculus Rift CV1
Software Windows 11 Pro
Benchmark Scores It runs Crysis
https://www.techpowerup.com/reviews/AMD/RX_480/

Three years this June, one rebrand, one die shrink and it's already in a world of pain.
More like a "die shrink", since the die size and transistor count are still the same. I call the RX 590 just an RX 480 rev3.

I believe GamersNexus did a good video on the fallacy of VRAM usage, explaining how we really have no tool to measure how much VRAM a game is actually using. All the tools today simply show how much VRAM is allocated to the game - how much the game is requesting - but not how much it is actually using. That is why some games will use 10GB+ of VRAM if you have it, but still run just fine on cards with 6GB. They request all that VRAM, but don't actually use it.
IIRC Mirror's Edge Catalyst "needs" 8GB of VRAM on Hyper settings, but I had no problems running it with a 3GB (or "3.5GB") or 4GB card (780, 780 Ti, 970 SLI or 980) at a smooth 60 fps at all times.
 

FreedomEclipse

~Technological Technocrat~
Joined
Apr 20, 2007
Messages
23,987 (3.74/day)
Location
London,UK
System Name DarnGosh Edition
Processor AMD 7800X3D
Motherboard MSI X670E GAMING PLUS
Cooling Thermalright AM5 Contact Frame + Phantom Spirit 120SE
Memory G.Skill Trident Z5 NEO DDR5 6000 CL32-38-38-96
Video Card(s) Asus Dual Radeon™ RX 6700 XT OC Edition
Storage WD SN770 1TB (Boot)| 2x 2TB WD SN770 (Gaming)| 2x 2TB Crucial BX500| 2x 3TB Toshiba DT01ACA300
Display(s) LG GP850-B
Case Corsair 760T (White) {1xCorsair ML120 Pro|5xML140 Pro}
Audio Device(s) Yamaha RX-V573|Speakers: JBL Control One|Auna 300-CN|Wharfedale Diamond SW150
Power Supply Seasonic Focus GX-850 80+ GOLD
Mouse Logitech G502 X
Keyboard Duckyshine Dead LED(s) III
Software Windows 11 Home
Benchmark Scores ლ(ಠ益ಠ)ლ
I totally agree. I have mine on a custom curve starting at 28% and don't hear it when idling. I'd rather that than let it get up to 50 before any fans start running. Heck, I can't hear it unless I concentrate when it climbs to 55%.

At 50°C my GPU fans are already hitting 45%, where they stay until 60°C. I could probably run them at a slower 40%, but I want the card to hit its maximum boost clocks when gaming. I don't think I've seen it come anywhere close to 60°C or above so far, so maybe my fan profile is a little aggressive. I just don't want the VRMs to overheat and blow up, as I don't quite trust the heatsinks I glued on there with thermal adhesive.
 
Joined
Mar 18, 2015
Messages
2,963 (0.84/day)
Location
Long Island
No, it didn't. For whatever reason, TPU seems to insist on using reference cards as the benchmark, which is asinine when doing a comparison like this. This isn't a reference card, so why would you compare it to other reference cards? Your conclusion, and TPU's conclusion, about Vega 56 running louder and hotter is, quite frankly, incorrect when doing a more apples-to-apples comparison. Noise at idle for my Red Dragon Vega 56 is also 0. Noise at load is extremely quiet (inaudible from where I sit). Temps at load are virtually identical to the 1660 Ti. As for performance, you see that in GN's review (starting with the F1 benchmarks, as Steve also uses a Red Dragon).

The only real advantage of the 1660 Ti over Vega 56 is power usage. But honestly, and this has been said many times over, most of us aren't gaming 24 hours a day. As far as I'm concerned, for real-world usage the difference is inconsequential. Of course, that's assuming Vega 56 does indeed drop in price across the board. For the same price, I'd opt for my card over any 1660 Ti all day, every day. But if Vega 56 stays at $400+ then there's no way to recommend it (I got mine for just over $300 about 6 months ago). I certainly wouldn't recommend a reference Vega 56. Ever.

I know how cards are tested. The problem is, if you're comparing a reference card to non-reference models, then you're going to come up with, at best, inaccurate or incomplete conclusions (such as "Vega 56 runs much hotter and noisier than the GTX 1660 Ti", which is both true and untrue depending on what you're comparing). I have nothing against Wizz including reference cards in the benchmark suite (I think it's quite helpful, actually), but there should also be a non-reference card included for comparison if you're going to include those observations in the conclusion. Without it, you're basing your conclusion on incomplete data, which does a disservice to the reader. I know this has been discussed here before as well and that, most likely, nothing is going to change, so I am going to stand by my "asinine" assertion.

1. TPU uses what they are sent.
2. The Red Dragon is $190 more than the MSI Gaming X 1660 Ti - what was it you said about apples and apples? $310 versus $500?
3. I'd be happy to read a review of the Red Dragon on a reputable site... a category that doesn't include Jay2Cents, GamersNexus and most other YouTubers. Unless done by the same reviewer, you can't compare.
4. Power is not the only thing in and of itself; add the cost of a 100-watt bigger PSU and an extra case fan, and that $190 cost difference just went to $225 or so.

So let's look at other AIB Vega 56 cards...

https://www.hardocp.com/article/2018/03/05/asus_rog_strix_rx_vega_56_o8g_gaming_review/15
+127 watts over the reference card when overclocked, +96 watts out of the box
+4°C over reference, 80°C out-of-the-box temps

https://www.tomshardware.com/reviews/gigabyte-radeon-rx-vega-56-gaming-oc-8g-review,5413-5.html
THG also got 76°C with the Gigabyte AIB Vega 56.
THG also noted that this model was heavily tuned for noise reduction and still banged out 40.8 dB(A).

In short, without a published review from a reputable source it's hard to accept subjective observations, and other reviews of AIB Vega 56s contradict what you are observing. PowerColor is going to have to cut their price on the Red Dragon in half to be relevant in today's market.

If you'd like, I'm sure that if you sent your $500 card to Wiz, he'd test it for you... he can only test what the postman brings him, so I don't see how he's to blame here. Also, look around and ask yourself this question: why are AIB Vega reviews so hard to find? The ones we can find do not support your conclusions - they actually contradict them. So you can stand by your "conclusions" if you want, but the available data in no way supports the position you have taken.

You're probably right, maybe I should just get a 1660 Ti.

If you look at TPU's review of the 3GB and 6GB 1060s, you will see that VRAM has no impact on performance at 1080p. The 6GB model has more shaders (11% extra), so it does have a performance edge of about 6%... but if VRAM were in play in any conceivable way whatsoever, then that gap would have to widen at 1440p... it does not. I have seen a game or two with lower performance that might be related to VRAM, but not conclusively as yet, where other factors could be ruled out.

1080p = 3 GB Minimum / 4 GB recommended
1440p = 5 GB Minimum / 7 GB recommended
2160p = 12 GB Minimum / 16 GB recommended
2880p = 20 GB Minimum / 29 GB recommended
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.10/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
1080p = 3 GB Minimum / 4 GB recommended
1440p = 5 GB Minimum / 7 GB recommended
2160p = 12 GB Minimum / 16 GB recommended

It doesn't work like that.
Having 6GB of VRAM for 1440p gaming is actually better than having 4GB for 1080p gaming.
It doesn't scale like that. Going from 1080p to 1440p adds around 400-700MB to the VRAM usage, and going from 1440p to 4K adds another 800MB-1GB, which means 6GB at 4K is roughly as good as 4GB at 1080p, maybe a little better.
 
Joined
Mar 18, 2015
Messages
2,963 (0.84/day)
Location
Long Island
Dude, you know all too well some people buy GPUs with hearts and emotions, right? For some folks it is a statement of "no" to "evil business practices". Or out of true love for a specific brand. We gotta respect that, man!

As to the respect, I have no respect for myself... I've long been a "hardware whore". I'm like the girl who goes to see her BF race, and after he loses his car, I go home with the guy who beat him :)

As to the evil business practices, welcome to capitalism. It's not as if any corporation would do differently in their position. Winning has its advantages.


The Radeon VII isn't a bad card, but it's a bit overpriced, power-hungry, noisy, and will possibly continue having driver issues in these first months after release. There are simply better options out there. For pure gaming the RTX 2080 is a better card overall.

Well said - there are other cards that have similar performance; it's just priced well above them. Pricing the VII a bit less than the upcoming 2070 Ti might work, but I think to sell it will need to be closer to $550.

Unfortunately AMD's market position is a ball and chain... with their small market share, and burdened by console market commitments, it's hard to catch up. The smartest thing Nvidia ever did was leave that market. It's getting to the point where even if they were head to head on price/performance, it wouldn't matter. Power, temps and noise affect buying choices these days, so even if they can deliver the same performance per dollar, they will have to sell cheaper to offset those.



It doesn't work like that.
Having 6GB of VRAM for 1440p gaming is actually better than having 4GB for 1080p gaming.
It doesn't scale like that. Going from 1080p to 1440p adds around 400-700MB to the VRAM usage, and going from 1440p to 4K adds another 800MB-1GB, which means 6GB at 4K is roughly as good as 4GB at 1080p, maybe a little better.

I have not seen it. Looking at hundreds of sites and comparisons with 5xx, 6xx, 7xx and 9xx cards, outside of a poor console port or other anomaly, I have never seen any instance where 3 GB was not enough for 1080p. Like the old "more than 1.50 volts for DDR3 or 1.35 for DDR4 will void your warranty" claim, the supposition persists despite Intel's published statements to the contrary. No matter how many times the claim gets repeated, the published test results never support the need for more than 3 GB of VRAM at 1080p.

Video Card Performance: 2GB vs 4GB Memory - Puget Custom Computers
Is 4GB of VRAM enough? AMD’s Fury X faces off with Nvidia’s GTX 980 Ti, Titan X | ExtremeTech
GTX 770 2GB vs 4GB

Back in the day, IIRC, the formula was resolution x color depth / 8. Nowadays it doesn't scale directly; however, the level of conservatism should go up in proportion to the investment.

The simple fact is that TPU's test results here clearly show no difference at all in performance between the 3GB and 6GB 1060 at 1080p or 1440p. We see no change in the gap between them as resolution increases until we get to 2160p. So if VRAM is not a factor at 1440p, then it's certainly not a factor at 1080p.
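
For reference, here is that old formula worked out (a quick sketch, assuming 32-bit color); it shows why it stopped being a useful guide long ago - a single frame buffer is tiny next to the textures that actually fill VRAM today:

```python
# The old rule of thumb worked out: resolution x color depth / 8 gives the size
# of a single frame buffer in bytes. Even several such buffers are small next
# to modern texture and geometry data, which is why the formula no longer
# predicts total VRAM need.

def framebuffer_mib(width: int, height: int, bits_per_pixel: int = 32) -> float:
    return width * height * bits_per_pixel / 8 / 2**20

for name, (w, h) in [("1080p", (1920, 1080)), ("1440p", (2560, 1440)), ("2160p", (3840, 2160))]:
    print(f"{name}: {framebuffer_mib(w, h):.1f} MiB per 32-bit frame buffer")
```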
 
Joined
Jun 28, 2016
Messages
3,595 (1.18/day)
Each to their own. Sound in general, or how loud something is, is purely subjective, so something that is inaudible to me is probably a jet engine to you.
(...)
You're free to disagree all you like, but it won't change people's hearing.
Exactly!
That's why fans turning off is such a terrific feature on graphics cards. They become objectively inaudible.
Some people won't hear silent fans, some will. You never know. And your perception may also change with time.

Not so long ago, when custom PCs were louder in general, there were major PC sites devoted to building silent (passively cooled, if possible) computers. It was both a popular hobby and a necessity.

Since then, components have certainly gotten quieter: fans became larger, and watercooling stopped being something extreme that you build with gardening materials while praying for no leaks.
On the other hand, we never really got to the passive or almost-passive quality so many hoped for. Obviously, a lot of people moved to notebooks, but surely not everyone.

The situation today is that unless one spends a fortune on cooling, a custom PC will be louder than a typical OEM office PC or Mac. That wasn't true in the 90s, and IMO it shouldn't be now. I think we've settled for less than we should have...
 
Joined
Jun 28, 2016
Messages
3,595 (1.18/day)
All this VRAM talk made me wonder if there's a way to restrict a card's VRAM size with a registry trick or some other method. At least on Linux and Wine there was the old wined3d registry key for VRAM size. The only thing I could find was this Nvidia devtalk thread about limiting VRAM through CUDA:

https://devtalk.nvidia.com/default/topic/726765/need-a-little-tool-to-adjust-the-vram-size/
You can simply run a program on the GPU and allocate memory. The game process won't kill the other program and will use only what is left.
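
A minimal sketch of that idea, using PyTorch only because it's a convenient way to grab CUDA memory from Python (assumes torch with CUDA support is installed; any CUDA allocator, or a few lines of C with cudaMalloc, would do the same):

```python
# "VRAM hog" sketch: reserve a chunk of video memory and hold it while the game
# runs, so the game can only use what is left over. Uses PyTorch purely as an
# easy CUDA allocator from Python; this is a rough testing trick, not a hard cap.
import time
import torch

GIB_TO_RESERVE = 3  # e.g. turn an 8 GB card into a ~5 GB card for testing

assert torch.cuda.is_available(), "needs an Nvidia GPU with a working CUDA setup"

# One uint8 element = one byte, so this reserves roughly GIB_TO_RESERVE GiB.
hog = torch.empty(GIB_TO_RESERVE * 2**30, dtype=torch.uint8, device="cuda")
hog.fill_(0)  # touch the allocation so the driver actually commits it

print(f"Holding ~{GIB_TO_RESERVE} GiB of VRAM. Start the game, Ctrl+C to release.")
try:
    while True:
        time.sleep(60)
except KeyboardInterrupt:
    pass  # exiting the process frees the allocation
```

It's a blunt instrument - the driver can still shuffle allocations around - but it's good enough for rough "how does this game behave with X GB" testing.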
 