
Zotac GeForce GTX 275 Amp! Edition

W1zzard

Today NVIDIA released their GeForce GTX 275 Series. While there are no architectural improvements, the new cards offer substantially improved performance at competitive prices. Zotac's Amp! Edition offers higher clocks out of the box - is that enough to combat ATI's new offerings?

First GTX 275 review I've seen on the net! :toast:
 
And another great review by the W1zzard!
I like the overclocking potential of this card :laugh:
 
Yep, great review Wizz. And amazing OC!

There's one thing, though. Power consumption must be wrong on this sample card. There's no way the GTX 275 consumes much more than the GTX 285, having fewer ROPs and running a tad slower. And it even consumes more than the GTX 295... NO WAY!

I guess the vanilla numbers were taken with the Zotac card too, underclocked to stock clocks, right? This one sample Zotac card must have something wrong with it, IMHO. The high temperatures might also be related to those abnormal power consumption figures, I guess? They are too high anyway.

Performance-wise it seems Nvidia won this round in any case, in overclocking too (where's the easy-to-reach 1 GHz on RV790??). I wonder how they are going to justify the GTX 285 in the future, though. They released this card to compete with the HD 4890 and not have to lower GTX 285 prices, and they did win against it, but they'll have to lower the price a lot nonetheless to "compete" with their own card. Not that I didn't know it would perform almost like the GTX 285. If it weren't for the clock difference... :laugh:
 
I did say this would be like the 8800GT of the last series, and I was right.
 
Looks awesome, but why does it outdo the GTX 285?? I know it's an overclock, but what the hell.
 
> Looks awesome, but why does it outdo the GTX 285?? I know it's an overclock, but what the hell.

Most lower models, especially NVidia's, will overclock beyond a stock higher model's performance. For example, a GTX 260, whether 192 or 216 SP, will overclock past a stock-clocked GTX 280's performance.

Fantastic review. I found another for those interested in a direct comparison with the HD 4890 reference and overclocked versions. It's not nearly as thorough as Wiz's review, but I thought I would throw it in:

http://www.hexus.net/content/item.php?item=17863&page=1
 
Still, the GTX 285 ought to OC more, which would have it running higher than the 275. The fact that Nvidia has a card that outperforms its own higher-end card is simply awesome.
 
They didn't know what to expect from the 4890, so they released the fastest, cheapest card they could.
 
Hey Wiz, I thought the ROPs were tied to the memory bus width on this architecture. Wouldn't that put the GTX 275 at 28 ROPs based on the 448-bit memory? Your chart has it at 32 ROPs, the same as the GTX 285, which has a 512-bit memory interface.
 
> Hey Wiz, I thought the ROPs were tied to the memory bus width on this architecture. Wouldn't that put the GTX 275 at 28 ROPs based on the 448-bit memory? Your chart has it at 32 ROPs, the same as the GTX 285, which has a 512-bit memory interface.

Yes, you're correct... they are 28 ROPs :D
 
thanks, fixed
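For anyone curious where the 28 comes from: on GT200-based cards each 64-bit memory controller partition carries 4 ROPs, so the ROP count falls straight out of the memory bus width. A quick illustrative sketch (my own, not from the review):

```python
# Illustrative sketch, not vendor code: on GT200 each 64-bit memory
# controller partition carries 4 ROPs, so ROP count follows from the
# memory bus width.
PARTITION_WIDTH_BITS = 64
ROPS_PER_PARTITION = 4

def rops_for_bus(bus_width_bits: int) -> int:
    """ROP count implied by a GT200 memory interface width."""
    partitions = bus_width_bits // PARTITION_WIDTH_BITS
    return partitions * ROPS_PER_PARTITION

print(rops_for_bus(512))  # GTX 285 (512-bit): 32 ROPs
print(rops_for_bus(448))  # GTX 275 / GTX 260 (448-bit): 28 ROPs
```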
 
If I can get a refund for my GTX260 I'm getting one of these.
 
I believe in the table the listing for the reference 275 has 32 ROPs as well, which needs correction.

Yep, those are in the HD 4890 tables too.

I knew it would beat my GTX 280, but getting so close to the GTX 285 was a surprise. IMO they shot themselves in the foot, killing pretty much all of the GTX 285's sales. A 55nm version of the GTX 280 would have been spot on against the HD 4890's performance, but of course it would have had massive OC ability too, so dunno.

One notch slower shader clock, 1350 instead of 1404, would have worked too.
 
Wow. Noisy. Like a hairdryer. Shame.

Wiz... PLEASE keep an "old skool" card in there with your reviews. Remember, people who read reviews are often (and for advertising reasons) the ones who want to upgrade NOW and are looking at their options. So the question is: what are they upgrading FROM? Just one "3"-series ATI and one "7"-series nV card would help ground the data for these people. (INCLUDING ME :) )
 
> Wow. Noisy. Like a hairdryer. Shame.

Yeah, I thought the same, but that's apparently only this sample. I looked at many other reviews and that's not the case with any other 275s around the net. It's a shame, but I think Wizz got a broken card because of the hurry. :(
 
These cards are kind of cool. I'm sad that they skimped on the voltage control :( The 295s have voltage control, come on nV!
 
Well, if they allowed voltage control the GTX 275 would probably best the GTX 285 even more easily. I agree Nvidia definitely shot GTX 285 sales with this release. Were they really that unsure of what the HD 4890 would bring? lol, I can't see how.
 
So I noticed that on load, the fan noise is louder than any other card's. Was this set to 100% or something? What were your impressions of the noise? I mean, is it really obnoxious?

When I got my 8800GT, the fan didn't kick in and make noise until it got to 100°C or so. Is it possible they lowered the "kick in" temperature, which is why load results in higher noise than the other cards?
 
My next card.....just need to find one that performs as well but quieter :)
 
> Well, if they allowed voltage control the GTX 275 would probably best the GTX 285 even more easily. I agree Nvidia definitely shot GTX 285 sales with this release. Were they really that unsure of what the HD 4890 would bring? lol, I can't see how.

The problem is that they couldn't do anything else, or at least nothing better. They knew the HD 4890 was going to be faster than the GTX 260 and slower than the GTX 285, and that's a problem by itself, because most people look for the second-fastest thing <- because of how prices work.

A redesign to fit into the same performance gap is a waste of money IMO, so they just took the GTX 260 240-SP approach. It's cheap because at the moment the yields for those chips are probably as high as the ones for the 216-SP parts, i.e. they get as many 28-ROP + 240-SP chips as 28-ROP + 216-SP ones, so it makes sense. There's no point in using those chips in the lesser GTX 260 (disabling one working TPC, 24 SPs), and GTX 295 demand isn't all that high compared to mid-range/performance cards. For them, both are almost the same thing, a GTX 280/285 with some broken parts, so the more units they can exploit, the better.
 
> So I noticed that on load, the fan noise is louder than any other card's. Was this set to 100% or something? What were your impressions of the noise? I mean, is it really obnoxious?
>
> When I got my 8800GT, the fan didn't kick in and make noise until it got to 100°C or so. Is it possible they lowered the "kick in" temperature, which is why load results in higher noise than the other cards?

The cards try to stay at 80°C; the cooler sits at pretty much 40% (the minimum) below that temperature, then ramps up as high as it goes if the temperature doesn't come back down.

Those 55nm cards have a skimped-down cooler compared to the 65nm GTX 260/280: fewer heatpipes and fewer fins. The 55nm GTX 260 already runs hotter than its 65nm counterpart, and with this card having more shaders (maybe more voltage) and a massive OC, the cooler just can't cope with the heat at lower RPM.

I'd think the fan is still the exact same as on my stock cooler, and at 100% it's something you don't really want to listen to. It's a much lower, gentler whoosh than a 9800GTX+ @ 100%, however.

Get one of these and an Accelero Xtreme and you've got yourself one sweet card :)
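The fan behavior described above boils down to a simple curve: hold a floor until the target temperature, then ramp toward full speed. Here's a hypothetical sketch. The 40% floor and ~80°C target come from the post; the 15°C ramp span is my own guess, not NVIDIA's actual fan table:

```python
# Hypothetical fan-curve sketch based on the description above.
# The 40% floor and ~80 C target come from the post; the 15 C ramp
# span is an assumption, not NVIDIA's real fan table.
MIN_DUTY = 40.0     # % fan speed floor below the target temperature
MAX_DUTY = 100.0    # % fan speed ceiling
TARGET_C = 80.0     # temperature the card tries to hold
RAMP_SPAN_C = 15.0  # assumed overshoot (C) before the fan hits 100%

def fan_duty(temp_c: float) -> float:
    """Fan duty cycle (%) for a given GPU temperature."""
    if temp_c <= TARGET_C:
        return MIN_DUTY                       # hold the quiet floor
    overshoot = min(temp_c - TARGET_C, RAMP_SPAN_C)
    return MIN_DUTY + (MAX_DUTY - MIN_DUTY) * overshoot / RAMP_SPAN_C

print(fan_duty(70.0))   # 40.0  -> idle floor
print(fan_duty(95.0))   # 100.0 -> pegged at full speed
```

This also shows why the massive factory OC matters: more heat means more time spent past the 80°C knee, which on this cooler means the loud end of the ramp.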
 
Wow, you were right. nVidia wanted to disrupt the 4890 launch and they absolutely did. If it weren't so loud (meaning I'd have to buy an aftermarket cooler), I'd be tempted to pick one up in place of my GTX 260 :D
 