Thursday, February 17th 2011

NVIDIA Readying GeForce GTX 550 Ti for March

NVIDIA is reportedly working on a new mainstream GPU for a mid-March launch. The graphics giant is planning to name it GeForce GTX 550 Ti; that's right, the "GTX" prefix and "Ti" suffix monikers are being extended to the core mainstream, in place of the typical "GTS". The new GPU will be based on the 40-nanometer GF116 silicon. Its exact specifications are not known, except that the GPU will use a 128-bit wide GDDR5 memory interface and will be pin-compatible with GF106, on which the GeForce GTS 450 is based. Its performance is estimated to be up to 35% higher than that of the ATI Radeon HD 5770. Due to an electrical redesign over GF106, the GTX 550 Ti is expected to have a TDP of just 110 W, nearly the same as that of the GTS 450, while offering higher performance. The NVIDIA GeForce GTX 550 Ti is slated for release on March 15, 2011.
Source: VR-Zone

50 Comments on NVIDIA Readying GeForce GTX 550 Ti for March

#26
RONX GT
btarunr: The graphics giant is planning to name it GeForce GTX 560 Ti
Isn't it GTX 550 Ti? ;)

I do hope NV will launch this product at a reasonable price. If it matches the GTX 460 it'll be awesome. Till then, waiting for a review :)
Posted on Reply
#27
KashunatoR
Who cares about this one? We need info about the GTX 590 :D
Posted on Reply
#28
RadeonProVega
Alright, how come people don't want to mix things up a bit? Meaning, how come this 550 isn't a 256-bit video card packing 2GB of GDDR5 memory? 1GB is fine, but they don't have to keep using 128-bit; use 256-bit.

This card sounds like an updated 450, that's all.

Hold up, it is to be on the same level as a 460?
Guess I will be buying this card; it's supposed to have one 6-pin, right?
Posted on Reply
#29
Red_Machine
_JP_: *Wonders when nVidia will bring back other monikers... like GeForce GTX 590 Ultra or GeForce GTS 540 MX*
I expect the GTX 590 to have the Ultra suffix, considering the way nVidia's going. Though I would welcome a return of the MX moniker, I would still like it to be on some sort of separate level from the rest of the series, as the MXes were in the past. Perhaps an overhauled G92 on 28nm or something...
Posted on Reply
#30
slyfox2151
u2konline: Alright, how come people don't want to mix things up a bit? Meaning, how come this 550 isn't a 256-bit video card packing 2GB of GDDR5 memory? 1GB is fine, but they don't have to keep using 128-bit; use 256-bit.

This card sounds like an updated 450, that's all.

Hold up, it is to be on the same level as a 460?
Guess I will be buying this card; it's supposed to have one 6-pin, right?
LOL........ WTF is this card going to use 2GB of RAM on exactly? Hell, IMO it could get away with less than 896MB of VRAM.



P.S. Glad to see you using cards with power adaptors :D You have come a long way from PCI cards lol.
Posted on Reply
#31
cdawall
where the hell are my stars
u2konline: Alright, how come people don't want to mix things up a bit? Meaning, how come this 550 isn't a 256-bit video card packing 2GB of GDDR5 memory? 1GB is fine, but they don't have to keep using 128-bit; use 256-bit.

This card sounds like an updated 450, that's all.

Hold up, it is to be on the same level as a 460?
Guess I will be buying this card; it's supposed to have one 6-pin, right?
128-bit GDDR5 has the same bandwidth as 256-bit GDDR3; how much more bandwidth does a midrange card need?
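For what it's worth, a quick back-of-the-envelope check of that equivalence; the clocks below are illustrative round numbers, not confirmed GTX 550 Ti memory specs:

```python
# Rough peak-bandwidth comparison: 128-bit GDDR5 vs 256-bit GDDR3.
# The 1000 MHz command clock is an illustrative round number, not a real spec.

def bandwidth_gb_s(bus_width_bits, transfers_mt_s):
    """Peak bandwidth in GB/s = bus width in bytes * effective transfer rate."""
    return (bus_width_bits / 8) * transfers_mt_s / 1000

# GDDR5 transfers 4 bits per pin per command clock, GDDR3 transfers 2,
# so at the same command clock GDDR5 has twice the per-pin data rate.
gddr5_128 = bandwidth_gb_s(128, 1000 * 4)   # 1000 MHz -> 4000 MT/s
gddr3_256 = bandwidth_gb_s(256, 1000 * 2)   # 1000 MHz -> 2000 MT/s

print(f"128-bit GDDR5: {gddr5_128:.0f} GB/s")   # ~64 GB/s
print(f"256-bit GDDR3: {gddr3_256:.0f} GB/s")   # ~64 GB/s
```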
Posted on Reply
#32
GSquadron
Sinzia: I wonder if this will ever come out in a single-slot version, it may be great for a folding rig!
They made the GTX 460, so there is going to be a single-slot version for sure!
And I think about the bandwidth like this:
Low-end cards = 128-bit
Mid = 256
High = 320 or more
Posted on Reply
#33
Benetanegia
cdawall: 128-bit GDDR5 has the same bandwidth as 256-bit GDDR3; how much more bandwidth does a midrange card need?
It really depends, don't you think? I agree with the general idea of your comment, but this card is supposed to perform on par with the GTX 460 or so, which is 256-bit, and not much higher in the stack we have the HD 6850, which desperately needed a 256-bit interface too. The 192-bit 460 is not much slower than the 256-bit one, but it IS slower, and if this card is supposed to perform in the neighborhood of these cards (that's their claim) it could definitely use more bandwidth than what a 128-bit interface will offer. They will probably use much higher clocked memory though.

Now, I'm always of the kind that thinks bandwidth is not as important, and thus I think that 128-bit for this kind of card is a good compromise, but it's going to be a limiting factor anyway. It all comes down to the fact that we (enthusiasts) tend to see mid-range cards as "just good enough" cards, and as such "good enough" specs are due, especially bandwidth. People who usually buy mid-range think differently though: they want the best they can get and want to see as few corners cut as possible. Of course, avoiding cutting corners is far more difficult in lower segments if you want to keep them cheap, but... it's human nature to want more!

So while I don't completely agree with their logic, I can't refute it either. That's the bottom line of this long (and probably worthless) post.
Posted on Reply
#34
cdawall
where the hell are my stars
Aleksander Dishnica: They made the GTX 460, so there is going to be a single-slot version for sure!
And I think about the bandwidth like this:
Low-end cards = 128-bit
Mid = 256
High = 320 or more
Well, that's not the same anymore; low-end cards often carry 128-bit GDDR3, which is half the bandwidth of 128-bit GDDR5. That's the reason the 5770/GTS 450/GTX 550 Ti and other cards have been using 128-bit memory.
Benetanegia: It really depends, don't you think? I agree with the general idea of your comment, but this card is supposed to perform on par with the GTX 460 or so, which is 256-bit, and not much higher in the stack we have the HD 6850, which desperately needed a 256-bit interface too. The 192-bit 460 is not much slower than the 256-bit one, but it IS slower, and if this card is supposed to perform in the neighborhood of these cards (that's their claim) it could definitely use more bandwidth than what a 128-bit interface will offer. They will probably use much higher clocked memory though.

Now, I'm always of the kind that thinks bandwidth is not as important, and thus I think that 128-bit for this kind of card is a good compromise, but it's going to be a limiting factor anyway. It all comes down to the fact that we (enthusiasts) tend to see mid-range cards as "just good enough" cards, and as such "good enough" specs are due, especially bandwidth. People who usually buy mid-range think differently though: they want the best they can get and want to see as few corners cut as possible. Of course, avoiding cutting corners is far more difficult in lower segments if you want to keep them cheap, but... it's human nature to want more!

So while I don't completely agree with their logic, I can't refute it either. That's the bottom line of this long (and probably worthless) post.
I get what you're saying. I have used midrange before, but that stopped because they do cut corners. For this particular card, 128-bit is doubtful to limit it; more likely the lack of shaders will.
Posted on Reply
#35
HalfAHertz
And you also have to consider that not only is GDDR5 quad-pumped, it's also running at a faster frequency than GDDR3, so in fact 128-bit GDDR5 > 256-bit GDDR3, bandwidth-wise.
Posted on Reply
#36
wolf
Better Than Native
Still want to see a GF116 core use 192-bit and 24 ROPs; heck, NVIDIA have proven they can take GF1xx chips and secret-sauce them up for clock speeds and yields, so I don't see an issue with yet an even smaller chip.
Posted on Reply
#37
Benetanegia
cdawall: I get what you're saying. I have used midrange before, but that stopped because they do cut corners. For this particular card, 128-bit is doubtful to limit it; more likely the lack of shaders will.
I'm just assuming the card will have a higher shader count. There's absolutely no other way to make it 35% faster than the HD 5770, which also means 40-50% faster than the GTS 450. No clock increase alone will do that, unless it's 1200+ MHz on the core... (hmmm, no). So that's why I'm assuming the card is going to have 256 SP (+33%) and a little bump to the clocks. That's why I think it will be limited by the 128-bit MC; a 192 SP part wouldn't be. BTW, I'm saying limited and not bottlenecked, take that into account.
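A rough sketch of that scaling argument; the 256 SP figure is the assumption from the post above, not a confirmed GTX 550 Ti spec, and the baseline uses the GTS 450's launch shader configuration:

```python
# Sketch of the shader-throughput scaling argument above.
# Baseline: GTS 450 launch config (192 SP @ 1566 MHz shader clock).
# 256 SP is the assumption from the discussion, not a confirmed spec.

gts450_sp, gts450_shader_mhz = 192, 1566
target_gain = 1.45   # ~40-50% faster than the GTS 450 (midpoint of the claim)

# Option A: keep 192 SP and rely on clocks alone
clock_only_mhz = gts450_shader_mhz * target_gain   # ~2270 MHz shader (~1135 MHz core)

# Option B: assume 256 SP (+33%) and make up the rest with a small clock bump
needed_mhz = gts450_shader_mhz * target_gain * gts450_sp / 256   # ~1700 MHz shader

print(f"Clocks alone: ~{clock_only_mhz:.0f} MHz shader clock needed (unrealistic)")
print(f"With 256 SP:  ~{needed_mhz:.0f} MHz shader clock needed")
```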
Posted on Reply
#38
cheesy999
Benetanegia: I'm just assuming the card will have a higher shader count. There's absolutely no other way to make it 35% faster than the HD 5770, which also means 40-50% faster than the GTS 450. No clock increase alone will do that, unless it's 1200+ MHz on the core... (hmmm, no). So that's why I'm assuming the card is going to have 256 SP (+33%) and a little bump to the clocks. That's why I think it will be limited by the 128-bit MC; a 192 SP part wouldn't be. BTW, I'm saying limited and not bottlenecked, take that into account.
Even if it was bottlenecked, to the average consumer it's just the total performance that matters. So if it did have 256 SP, wouldn't that just mean it's more cost-effective for NVIDIA to add 33% more SPs than it is for them to double the memory bandwidth?
Posted on Reply
#39
GSquadron
Anyway, is the GTX 550 going to be better than the GTX 460? And if yes, which version of the 460?
Posted on Reply
#40
cheesy999
Aleksander Dishnica: Anyway, is the GTX 550 going to be better than the GTX 460? And if yes, which version of the 460?
Well, the GTX 550 is supposed to be 35% better than a 5770, whereas a GTX 460 1GB comes out at around 25-45% faster depending on the game, so it might end up being 2-3% below the GTX 460 768MB.



Posted on Reply
#41
Benetanegia
cheesy999: Even if it was bottlenecked, to the average consumer it's just the total performance that matters. So if it did have 256 SP, wouldn't that just mean it's more cost-effective for NVIDIA to add 33% more SPs than it is for them to double the memory bandwidth?
Yes, of course. On average an SP increase would yield a higher performance increase without a doubt, but in some scenarios more ROPs/bandwidth have a bigger impact. For example, if you game at high resolution and you prefer a) medium in-game settings + 4xAA instead of b) high settings + 0/2xAA. I like b) better, but I'm just saying how some people might rather take the first option.

At any rate, I'm not saying the card should have more ROPs/bandwidth; in fact I am a huge supporter of increasing the SP-to-ROP ratio in most of NVIDIA's cards*. But at the same time I can only say that such a card (128-bit, 256 SP) would definitely benefit from a wider bus. Is that economical, or does it even make any sense? Probably not. Most definitely not, but that does not change the fact that for some scenarios and some people it would make more sense, and there's nothing wrong in them asking for what they want; asking is free after all.

*I even made a chart comparing all the specs of every Fermi card in percentages and comparing them to actual performance from Wizz's reviews, and it turns out that for most of the Fermi cards released to date the SP power is linearly proportional to the obtained performance with a variation of +/- 2%. The exception is the GTX 560, which performs some 5% lower than what I "predicted", and that suggests a small memory/ROP limitation.

EDIT: BTW, is there something wrong with me, or are the 2 charts in your last post identical?
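For anyone curious what that chart comparison might look like, here is a minimal sketch; the shader counts and clocks are approximate launch specs, and the measured-performance values are placeholders to be filled in from actual review data rather than real numbers:

```python
# Sketch of the "SP power vs. measured performance" comparison described above.
# Shader counts/clocks are approximate launch specs; measured_perf entries are
# placeholders (None) meant to be filled in from review benchmark summaries.

cards = {
    # name:           (shaders, shader clock MHz, measured perf vs GTS 450)
    "GTS 450":        (192, 1566, 1.00),
    "GTX 460 768MB":  (336, 1350, None),
    "GTX 560 Ti":     (384, 1645, None),
}

base_sp, base_clk, _ = cards["GTS 450"]
for name, (sp, clk, measured) in cards.items():
    predicted = (sp * clk) / (base_sp * base_clk)   # relative shader throughput
    shown = f"{measured:.2f}x" if measured is not None else "(fill in)"
    print(f"{name:14s} predicted {predicted:.2f}x   measured {shown}")
```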
Posted on Reply
#42
GSquadron
So that means it is going to be cheaper than the GTX 460 768MB!
That is good news, and I don't even need to say that the GTX 460 is going to be priced even lower :D
Posted on Reply
#43
cheesy999
Aleksander Dishnica: So that means it is going to be cheaper than the GTX 460 768MB!
That is good news, and I don't even need to say that the GTX 460 is going to be priced even lower
Well, the card it's replacing costs £90-100, so this new one will probably cost £95-£105 or something similar.
Posted on Reply
#44
RadeonProVega
So this card is only going to have one 6-pin connector and need a 450 W PSU?
slyfox2151: P.S. Glad to see you using cards with power adaptors :D You have come a long way from PCI cards lol.
Yeah, I am kinda with the times now. Plan on picking up a 6850 in about 2 weeks to go along with a new computer I am buying to replace my Intel Celeron E3330.
Posted on Reply
#45
cdawall
where the hell are my stars
u2konline: So this card is only going to have one 6-pin connector and need a 450 W PSU?

Yeah, I am kinda with the times now. Plan on picking up a 6850 in about 2 weeks to go along with a new computer I am buying to replace my Intel Celeron E3330.
I'm running a GTX 470 with a 450W PSU lol, these things are majorly overblown :laugh:
Posted on Reply
#46
RadeonProVega
Well, I guess my next 2 cards to pick up are a 6850 and a 550, can't wait :)
Posted on Reply
#47
HalfAHertz
The general idea is that you want to load your PSU at 50-60% because that's where it has the highest efficiency. So that means your GPU+CPU+HDDs+RAM+MB's average load should be about half of your PSU's maximum rating.
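As a concrete example of that rule of thumb, here is a small sketch; the component draw figures are hypothetical, just to show the arithmetic:

```python
# Illustration of the "load your PSU at ~50-60%" rule of thumb.
# The component draw figures below are hypothetical examples.

load_watts = {
    "GPU": 160,               # midrange card under game load
    "CPU": 95,
    "HDDs": 15,
    "RAM + motherboard": 40,
}

total_load = sum(load_watts.values())
suggested_psu = total_load / 0.55   # aim to sit at ~55% of the PSU's rating

print(f"Estimated average load: {total_load} W")
print(f"PSU rating for ~55% load: {suggested_psu:.0f} W")
```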
Posted on Reply
#48
Benetanegia
HalfAHertz: The general idea is that you want to load your PSU at 50-60% because that's where it has the highest efficiency. So that means your GPU+CPU+HDDs+RAM+MB's average load should be about half of your PSU's maximum rating.
I don't agree. It does have its highest efficiency in that range, but on modern quality PSUs the efficiency difference between, say, 50% load and 80% load hardly ever exceeds 2% (and it's often <1%). That 2%, for a 500 W average power consumption, which is a lot (I mean A LOT), means a 10 W difference, which translates to about 5 cents after 24 hours of full operation, assuming you pay 20 cents per kWh, which is again a lot. At the end of the year that would mean a massive (sarcasm) $5-10, depending on how much you use your PC. And that's assuming a 500 W average load and almost 24/7 operation. A more realistic figure for almost everyone is $2-3 in savings per year.

So, you might want to have your PSU loaded at 50% for many reasons (reliability, noise, heat...), but efficiency really shouldn't be your reason; you will never recover the extra $$ you paid for the higher-wattage PSU. Never.
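A quick sanity check of the arithmetic above; the 500 W draw, 2% efficiency gap and 20 cents/kWh are the (deliberately generous) assumptions from the post, while the hours-per-day figure is my own illustrative one:

```python
# Sanity check of the efficiency-savings arithmetic above.
# 500 W average draw, 2% efficiency gap and $0.20/kWh come from the post;
# hours_per_day is an illustrative figure for fairly heavy use.

avg_draw_w = 500
efficiency_gap = 0.02
price_per_kwh = 0.20

wasted_w = avg_draw_w * efficiency_gap                    # ~10 W extra at the wall
cost_per_day = wasted_w / 1000 * 24 * price_per_kwh       # full load, 24 hours
hours_per_day = 8
cost_per_year = wasted_w / 1000 * hours_per_day * 365 * price_per_kwh

print(f"Extra draw: {wasted_w:.0f} W")
print(f"Cost for 24 h at full load: ${cost_per_day:.2f}")               # ~$0.05
print(f"Cost per year at {hours_per_day} h/day: ${cost_per_year:.2f}")  # ~$6
```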
Posted on Reply
#49
micropage7
Aleksander Dishnica: So that means it is going to be cheaper than the GTX 460 768MB!
That is good news, and I don't even need to say that the GTX 460 is going to be priced even lower :D
OMG, I've had my 460 for just about 3 months. Looks like I'd better be patient a little until the price gets friendly to my wallet.
Posted on Reply
#50
GSquadron
Man, if I had the money I would have bought it the day it launched, so be optimistic ;)
Posted on Reply