# EVGA GTX 1070 Ti FTW2 iCX 8 GB



## W1zzard (Nov 24, 2017)

The EVGA GTX 1070 Ti FTW2 features the company's iCX technology, which provides a total of nine thermal sensors placed at key positions so you always have an overview of your card's temperatures. EVGA also upgraded the VRM circuitry to 10+2 phases with two 8-pin power inputs.



----------



## bug (Nov 24, 2017)

How on Earth is this worth a 9.3 when it has worse power draw than a reference card with nothing to show for it? I don't remember the last time EVGA dropped the ball so hard.
Also, would you be willing to look into what this card can do with a single 8-pin? Does it boot/run Windows/finish a 3DMark test?


----------



## Nihilus (Nov 25, 2017)

It draws a whopping 15 W over stock. I would say it is still a good value considering the good temps, low noise, and great overclocking.

As for the 1070 Ti, it clearly makes more sense to purchase than the 1080. OC vs. OC, the 1070 Ti seems to match the 1080 and costs a little less. It will also have better resale value, considering its better mining performance.


----------



## The Quim Reaper (Nov 25, 2017)

This, like all 1070 Tis, is not worth a penny over $400 / £380 / €420.


----------



## jabbadap (Nov 25, 2017)

bug said:


> How on Earth is this worth a 9.3 when it has worse power draw than a reference card with nothing to show for it? I don't remember the last time EVGA dropped the ball so hard.
> Also, would you be willing to look into what this card can do with a single 8-pin? Does it boot/run Windows/finish a 3DMark test?



Well, it has a superior cooler: lower temps, and its load noise is the same as the FE's idle noise. In the end, power draw is a fairly meaningless factor between different GTX 1070 Tis; they are all forced to keep reference clocks, so how much power a card draws is only a matter of binning and the voltage it is given. This EVGA could most probably hold the same clocks while undervolted, and thus draw less power than the FE.
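As a rough illustration of why voltage binning dominates here (the baseline wattage, voltage, and clock below are assumptions for the sake of the example, not measured values for this card): dynamic power in CMOS logic scales roughly with V² × f, so a chip holding the same reference clock at lower voltage draws noticeably less power.

```python
# Illustrative sketch only: the 180 W / 1.05 V / 1683 MHz baseline is assumed,
# not taken from the review. Dynamic power scales roughly with V^2 * f.
def dynamic_power(voltage, clock_mhz,
                  base_voltage=1.05, base_clock=1683, base_power=180):
    """Scale an assumed 180 W baseline by (V/V0)^2 * (f/f0)."""
    return base_power * (voltage / base_voltage) ** 2 * (clock_mhz / base_clock)

# Two hypothetical voltage bins holding the same reference boost clock:
print(round(dynamic_power(1.05, 1683)))  # baseline bin
print(round(dynamic_power(0.95, 1683)))  # undervolted bin, roughly 18% less power
```

The point of the sketch is only that at identical clocks, the voltage term is squared, so small binning differences between 1070 Tis swamp everything else in the power-draw comparison.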


----------



## pat-roner (Nov 25, 2017)

With all 1070 Tis having the exact same clocks and stock performance, I think it's a big waste to include stock results. I think OC results should be included in all benchmarks.


----------



## Tsukiyomi91 (Nov 29, 2017)

Out of all the gaming benches, this card only beat the Vega 64 in a handful of games. I would pick the 1080 and be done with it, since its price range is quite close to the 1070 Ti's.


----------



## bug (Nov 29, 2017)

Tsukiyomi91 said:


> Out of all the gaming benches, this card only beat the Vega 64 in a handful of games. I would pick the 1080 and be done with it, since its price range is quite close to the 1070 Ti's.


Well, the 1070 Ti is an answer to the Vega 56. So that's what it does.


----------



## coonbro (Apr 6, 2018)

> How on Earth is this worth a 9.3 when it has worse power draw than a reference card with nothing to show for it? I don't remember the last time EVGA dropped the ball so hard.

Look at all that sales hype.

> With all 1070 Tis having the exact same clocks and stock performance, I think it's a big waste to include stock results.

You notice this with the 10 series. Take EVGA: it used to be that you got something for going with a Classified over an SC. Now you really don't; maybe a Classified has 17 MHz over the SC, lol, when it used to be that a Classified gave 90 MHz over the SC line of the same card.

Today it's all about cheap thrills that make strong selling points: crazy logos, LED lighting, and big coolers on these so-called more efficient cards that shouldn't be getting that hot anyway if they're not burning bigger power to run, etc.

If you get away from the hype and look at the facts, this stuff is a joke, to the point that I never "upgrade" unless something totally fails, and I hate to do it even then. I've never had less interest in doing a build. There's a lot of useless junk out there that really doesn't support anything less than Win 10 and the latest stuff. My hardware runs XP, Vista, 7, 8, Linux, CRT monitors natively, whatever; this great new stuff can't, for more cost.


----------



## newtekie1 (Apr 6, 2018)

It isn't just hype; the biggest thing is the TDP limit increase over the stock card, which allows the card to overclock higher. And before you say "but the overclock results are the same as the stock card," read the review: the overclocking results are done with the stock power, thermal, and fan limits. So, yes, overclocking will be roughly the same as on the stock card in that case, because power limits the clock speed with these cards.

But because you can raise the power limit on this card further than on the stock card, it should in theory reach a higher clock speed once you do that. W1z just does not test overclocks in that situation.
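To make the arithmetic concrete (the 180 W baseline and the slider percentages below are hypothetical placeholders, not EVGA's published limits): raising the power-limit slider simply scales the board power target that the boost algorithm is allowed to spend before it pulls clocks back.

```python
# Hypothetical numbers: the baseline TDP and slider range are placeholders,
# not EVGA's actual specs. A raised limit widens the boost algorithm's budget.
def power_budget(tdp_watts, limit_percent):
    """Board power target in watts for a given power-limit slider setting."""
    return tdp_watts * limit_percent / 100

print(power_budget(180, 100))  # stock target for an assumed 180 W TDP
print(power_budget(180, 120))  # the same card with a +20% power-limit slider
```

With the extra headroom in the second case, a power-limited card can sustain higher boost bins, which is exactly the advantage a stock-limits overclocking test never shows.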


----------



## coonbro (Apr 29, 2018)

I think this is the only review site that points this out under thumbs-down:

"No analog (VGA) support"

It's so much easier for you to now support them instead of them supporting you and your needs (at the same prices as, or more than, the cards that did, lol).
How hard is it to keep a DVI-I instead of adding that DVI-D? Then you're not looking to ditch your great zero-lag gaming CRT monitor for a new monitor the card supports (one you didn't need, but now you do, lol). Let's see: that $400+ card is now a $600 investment, or you're buying an extra crap adapter that "may" work.

What's next, no support for anything under 1080p @ 144 Hz? Yeah, you now build to fit their needs, not yours anymore, lol. This custom-build stuff has become more of a joke and a sham/scam.


----------



## jabbadap (Apr 29, 2018)

Well, yeah, Pascal's video output hardware has no RAMDAC, so no analog signals are possible with it. (So there is no DVI-I on any current-gen discrete graphics card from either side; AMD dropped it even before Nvidia.) If one wants to use VGA monitors or VGA projectors with a new discrete graphics card, one must buy an active adapter with a DAC. And yeah, that adapter will add lag.

As for it being a con: well, it has been talked to death on TPU, so I can't really be bothered to carry on with that anymore. In short, it is wise to mention there's no VGA support, but if I remember correctly, W1zzard has said it has no effect on the scoring.


----------



## bug (Apr 29, 2018)

By that logic, it may be worth mentioning it doesn't have composite, S-Video, or SCART outputs either.


----------



## coonbro (Apr 29, 2018)

Well, it gets to the point that upgrading hurts and costs more than it helps. You've got all that good working hardware and software that's now rendered useless over a crap GPU that costs more than the older card that did it all. Then I guess they figure fools and their money are soon parted, because WOW, it's got that RGB LED lighting, lol, which is now considered a strong selling point. That's when you know it's all about sales hype and gimmicks, not function or support. You buy this stuff today and you can't depend on anything you have from any time in the past fully working, if it works at all.

Heck, that's why I don't do Microsoft. Primarily, you BUY Windows software at a high price that may not even work under the latest Windows, and then you're back to rebuying. Funny, nothing was wrong with the software I had before; it did all the tasks needed, but Microsoft fixed that with each release / Win 10, just by adding or removing something. At least Linux works great, and it's just about all free, so you're not out and feeling bad when things like this occur; your money is still in your bank, not their bank with them laughing.

See, I see it as this: they've figured out how to keep the money rolling in to them. You now have to keep buying the newest/latest for anything to work, no longer keeping your solid, hard-working build up and going or hanging on to it. Also notice how older parts, like the Nvidia 900 series cards, are 100% off the market (maybe you find old stock left over, at best). That's how it's going to be. Same with Intel: try to get a Haswell chip or Z97 board; they 100% pull them and they disappear, and you're forced into the latest everything just over a board or chip replacement.
Anyway, counting on long-term use and future upgrades today? They're doing away with that faster than ever now, any way possible: pulled from market, stopped support, etc.

I'll guarantee you Microsoft is chomping at the bit to ditch everything under 10 and run it as a cloud service, not a "true" OS. It will be just so easy for them to do. I used to not mind spending my money on this stuff, but now I just don't want to.

This stuff seems to be trending toward the newer Xbox babysitter generation, guys.


----------



## jabbadap (Apr 30, 2018)

bug said:


> By that logic, it may be worth mentioning it doesn't have composite, S-Video, or SCART outputs either.



The thing is, for multi-monitor users there might be that n-th monitor which has to be plugged in over VGA, and VGA was the common PC monitor connector in the past. I don't remember seeing any of those telly connectors on any PC monitor, but most of them have VGA. Well, maybe some really old CRTs with RGB coaxial plugs.

But yeah, VGA should have died a painful death long ago.


----------



## Kissamies (Jul 10, 2018)

coonbro said:


> I think this is the only review site that points this out under thumbs-down:
> 
> "No analog (VGA) support"
> 
> ...


Removing the analog signal should be a pro instead of a con, and should have been since the Radeon R9 290 series. Yeah, yeah, no lag etc., but who the hell keeps those fishbowls on their desks anymore? If someone has money to buy a fast graphics card, then he/she probably has money to buy a suitable monitor.

I play games at 1080p @ 74 Hz and have no problems here.


----------



## newtekie1 (Jul 10, 2018)

Chloe Price said:


> Removing the analog signal should be a pro instead of a con, and should have been since the Radeon R9 290 series. Yeah, yeah, no lag etc., but who the hell keeps those fishbowls on their desks anymore? If someone has money to buy a fast graphics card, then he/she probably has money to buy a suitable monitor.



While VGA isn't usually used for the primary display, there are still a lot of people (myself included) who use it on a cheap secondary flatscreen. Why throw out a good LCD just because it is VGA-only? Hook it up as a second monitor; it's really convenient.


----------



## bug (Jul 10, 2018)

newtekie1 said:


> While VGA isn't usually used for the primary display, there are still a lot of people (myself included) who use it on a cheap secondary flatscreen. Why throw out a good LCD just because it is VGA-only? Hook it up as a second monitor; it's really convenient.


At the same time, why burden modern devices with legacy hardware when those who want to keep using old monitors can buy an adapter themselves?
The VGA output means one more SKU for the manufacturer, and at least an associated round of tests specifically for it.


----------



## Kissamies (Jul 10, 2018)

newtekie1 said:


> While VGA isn't usually used for the primary display, there are still a lot of people (myself included) who use it on a cheap secondary flatscreen. Why throw out a good LCD just because it is VGA-only? Hook it up as a second monitor; it's really convenient.


The image quality with VGA on a TFT monitor is just horrible compared to a digital output. And if it's some old monitor with VGA only, I'd just trash it or store it in a closet or something.

What I mean is: why should new hardware support legacy technology year after year? Motherboards have dropped PCI slots over the years, and that's also a good thing.


----------



## coonbro (Jul 10, 2018)

Why not just make any new part/hardware incompatible with anything old? Screw upgrading any one part; just force you into a whole new build each and every time.

What "modern" monitor beats a CRT in gaming even today? Eye candy is about it. The thing with a CRT is that it's dead nuts on with no lag; your latest LCD can't say that.

How hard, and at what super extra cost, is a supported DVI-I for analog support, versus putting a near-useless DVI-D on a card? Then consider that my old 900 series supports VGA/analog, XP, Vista, 7, and Linux; that new 10 series can't, and it costs more. Go figure. I figure, if they're dropping support, why am I paying more for a card that can't do as much as the older card I run now?

See how they figure out ways to limit your upgrading? Like a lot of people found, "upgrading" to the latest card means buying more stuff to support it. So now a guy buys a $200-400 card, finds he's not supported in some way, like the OS or his older, working-just-fine monitor, and he's now spending another $100-200 on an OS or a new monitor to support his new $200-400 card.

So now he had to ditch a great working monitor to support his new card, because it doesn't support all his needs.

See, today they figure it's a lot easier and cheaper on them for you to support them, instead of them supporting you and your needs.

Computer building has become a joke and a sham, all about how to get you to spend more, faster. Given the state of the rest of it today, I don't see doing it anymore after 16 years of building nice gaming rigs. It's nearly a joke now, and then add that LED lighting on everything to boot. [opinion]


----------



## newtekie1 (Jul 10, 2018)

bug said:


> At the same time, why burden modern devices with legacy hardware when those who want to keep using old monitors can buy an adapter themselves?
> The VGA output means one more SKU for the manufacturer, and at least an associated round of tests specifically for it.



Oh, I agree; I'm not saying the VGA output should stick around, especially when you can get an HDMI-to-VGA adapter for under $15. I have a couple of those on my tech bench just so I can keep using my VGA KVM with new computers that have no VGA output (have you priced DVI/HDMI KVMs?!).

My point was that not including a VGA output is in fact a negative. It is a minor negative, and I believe W1z has said several times that it does not affect the score; he just has to include it to let people know the card won't output a VGA signal.



Chloe Price said:


> The image quality with VGA on a TFT monitor is just horrible compared to a digital output. And if it's some old monitor with VGA only, I'd just trash it or store it in a closet or something.



The quality is not bad. Really, people need to get over the idea that VGA means bad quality; it just isn't true. I've got three 1080p monitors connected to my work PC right now: one with HDMI, one with DVI, and one with VGA, and you would not be able to pick out the VGA one going only by image quality.

And why waste a monitor by just putting it in storage? I'd much rather be using it. Even if it is just used as a secondary monitor, I'd rather it be used than sit wasted.



Chloe Price said:


> What I mean is: why should new hardware support legacy technology year after year? Motherboards have dropped PCI slots over the years, and that's also a good thing.



It is always a trade-off that the manufacturers have to figure out. In the case of PCI, that slot takes up space on a motherboard that other, more modern slots could be using, so removing it was a benefit for the consumer. However, VGA is different: it doesn't take up any output space on a card that already has DVI, because VGA can be integrated into the DVI port. So removing VGA output can only be a negative for the consumer, and these reviews are written for the consumer. So no VGA output is listed as a con. It is a minor con, but a con nonetheless.


----------



## Kissamies (Jul 11, 2018)

newtekie1 said:


> The quality is not bad. Really, people need to get over the idea that VGA means bad quality; it just isn't true. I've got three 1080p monitors connected to my work PC right now: one with HDMI, one with DVI, and one with VGA, and you would not be able to pick out the VGA one going only by image quality.
> 
> And why waste a monitor by just putting it in storage? I'd much rather be using it. Even if it is just used as a secondary monitor, I'd rather it be used than sit wasted.


Weird; the monitor I now have as my secondary came bundled with only a VGA cable, and when used at 1080p it had some noise in the picture, vibrance in blacks, etc. DVI fixed all that.

Still, my opinion is that a connector over 30 years old is simply obsolete and should just be let die peacefully instead of being kept on life support. If there's a need to use some old monitor without a digital input, why not just use it with the integrated GPU,* or get a cheap graphics card to get an output for it?

*If the motherboard has an analog output, or if an iGPU even exists. My X99 platform, for example, doesn't have one.


----------



## newtekie1 (Jul 11, 2018)

Chloe Price said:


> Weird; the monitor I now have as my secondary came bundled with only a VGA cable, and when used at 1080p it had some noise in the picture, vibrance in blacks, etc. DVI fixed all that.



Get a better VGA cable.



Chloe Price said:


> Still, my opinion is that a connector over 30 years old is simply obsolete and should just be let die peacefully instead of being kept on life support. If there's a need to use some old monitor without a digital input, why not just use it with the integrated GPU,* or get a cheap graphics card to get an output for it?
> 
> *If the motherboard has an analog output, or if an iGPU even exists. My X99 platform, for example, doesn't have one.



Yep, I agree, but it's still a con that needs to be listed for the people who still use VGA.


----------



## bug (Jul 11, 2018)

newtekie1 said:


> Yep, I agree, but it's still a con that needs to be listed for the people who still use VGA.


In 2018, it should simply be a default.


----------



## coonbro (Jul 11, 2018)

TechPowerUp has that under thumbs-down (or at least the reviews of the 10 series I looked at here did).

It's in this card's review as well, under thumbs-down:

"No analog (VGA) support"

The SAD thing is most folks only want to look at misleading FPS numbers and how much LED lighting a card has. Then they buy, try, and maybe cry, then shell out more money to support their new hardware because it doesn't support them in some fashion or way. Boo hoo.


----------



## newtekie1 (Jul 11, 2018)

bug said:


> In 2018, it should simply be a default.



Removing an option from the consumer that doesn't benefit the consumer in any way is never anything other than a con.


----------



## bug (Jul 11, 2018)

newtekie1 said:


> Removing an option from the consumer that doesn't benefit the consumer in any way is never anything other than a con.


Maybe, but removing the need for analog circuitry does benefit the customer: it makes for a slightly cheaper board, and instead of a connector that maybe 5% of users need, you get an HDMI or maybe two Mini DisplayPort connectors. They could also leave that space empty for better airflow, but I haven't seen anyone do that.


----------

