
NVIDIA GeForce 4XX Series Discussion

Did you realise that the Crossfire setup is consuming 25 W more than a single HD5870 in that graph? That's from the AC source, so that means the second card is consuming 25 W, despite what AMD says.

I suppose it keeps the fan spinning.:ohwell:
 
OK, so even if we assume the second card uses 0 W when idle (I doubt it will be that low, but let's assume).

The power savings are only $31 over the course of a year. Again, nothing that anyone buying these cards is going to be concerned with.
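For the curious, here's a rough sketch of how a figure like $31 a year falls out of a ~25 W idle delta; the electricity rate below is an assumption on my part, not something from a review:

```python
# Back-of-the-envelope idle-power cost: the ~25 W delta is the AC-side
# difference read off the graph above; the electricity rate is assumed.
extra_watts = 25
hours_per_year = 24 * 365          # assume the box idles around the clock
rate_usd_per_kwh = 0.14            # assumed average rate, not a quoted number

kwh_per_year = extra_watts * hours_per_year / 1000
annual_cost = kwh_per_year * rate_usd_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> about ${annual_cost:.0f}/year")
# 219 kWh/year -> about $31/year
```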

LOL Pays for toilet paper for a month! :laugh:
 
Did you realise that the Crossfire setup is consuming 25 W more than a single HD5870 in that graph? That's from the AC source, so that means the second card is consuming 25 W, despite what AMD says.

Hey, I didn't say it consumes 0 (zero) watts; I'm just saying the second card uses less power (like a hibernation state), and I just quoted what Tom's Hardware says. And still, the GTX 295 consumes 29 W more than CrossFire HD 5870.

And besides that, Tom's Hardware know what they are doing; they are professional benchers, after all, and know more about technical design than us.
 
Hey, I didn't say it consumes 0 (zero) watts; I'm just saying the second card uses less power (like a hibernation state), and I just quoted what Tom's Hardware says. And still, the GTX 295 consumes 29 W more than CrossFire HD 5870.

And besides that, Tom's Hardware know what they are doing; they are professional benchers, after all, and know more about technical design than us.


We are comparing GPUs from two different realms, though. Nvidia's offering hasn't shown up yet; it's a matter of days after ATi's release. Nvidia's current offering that is out is older tech.
 
But the fact is NVIDIA hasn't shown anything yet: no early benches, no performance graphs, not even a leaked photo from a random Chinese website.

So how do we compare the HD 5870? Are you saying we should just use our imagination to compare it with the rumored GT300 specs?
 
But the fact is NVIDIA hasn't shown anything yet: no early benches, no performance graphs, not even a leaked photo from a random Chinese website.

So how do we compare the HD 5870? Are you saying we should just use our imagination to compare it with the rumored GT300 specs?

The answer would be to have a bit more patience than under a week. :laugh:
 
I just found this, is it real or fake?
http://translate.google.com/translate?hl=en&sl=auto&tl=en&u=http://news.mydrivers.com/1/143/143284.htm&prev=hp&rurl=translate.google.com

Here's another Asian site with the same pic on it
http://cd.qq.com/a/20090831/000124.htm
04151351.jpg
 
Did you realise that the Crossfire setup is consuming 25 W more than a single HD5870 in that graph? That's from the AC source, so that means the second card is consuming 25 W, despite what AMD says.

Very nice catch, I didn't even notice that. If you consider 80% PSU efficiency, that's about 20 W...

So is AMD lying here? It seems the second card doesn't use any less power than the first.
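Here's the quick arithmetic behind that, assuming the 25 W delta is measured at the wall and the PSU is roughly 80% efficient at this load point:

```python
# Translate a wall (AC) power delta into an estimated DC-side delta.
# The 80% PSU efficiency is an assumption for this load point.
ac_delta_w = 25
psu_efficiency = 0.80

dc_delta_w = ac_delta_w * psu_efficiency
print(f"second card draws roughly {dc_delta_w:.0f} W")  # roughly 20 W
```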

But the fact is NVIDIA hasn't shown anything yet: no early benches, no performance graphs, not even a leaked photo from a random Chinese website.

So how do we compare the HD 5870? Are you saying we should just use our imagination to compare it with the rumored GT300 specs?

And how long before the HD5870 did we see anything more than rumor? When did the first pictures appear? A week before launch? When did we get solid specs that weren't rumors? A week before launch?

Comparing it to the current tech on the market is one thing, but assuming that nVidia won't release anything else and running out into the streets claiming an ATi victory is stupid and ignorant. Yet that is what most are doing with the HD5870.

It beat last-generation tech! Let's have a parade!

The problem is, IMO, beating the previous generation's tech isn't good enough. It needs to be able to compete with the current-generation tech, and this has been ATi's failure since RV600, and the reason the consumer has to pay outrageous prices for high-end tech.
 
Could we stop bashing/praising ATi or Nvidia? All this fanboy talk gets tiring.

So let's hear what this GT3xx could do.
 
Very nice catch, I didn't even notice that. If you consider 80% PSU efficiency, that's about 20 W...

So is AMD lying here? It seems the second card doesn't use any less power than the first.



And how long before the HD5870 did we see anything more than rumor? When did the first pictures appear? A week before launch? When did we get solid specs that weren't rumors? A week before launch?

Comparing it to the current tech on the market is one thing, but assuming that nVidia won't release anything else and running out into the streets claiming an ATi victory is stupid and ignorant. Yet that is what most are doing with the HD5870.

It beat last-generation tech! Let's have a parade!

The problem is, IMO, beating the previous generation's tech isn't good enough. It needs to be able to compete with the current-generation tech, and this has been ATi's failure since RV600, and the reason the consumer has to pay outrageous prices for high-end tech.



gingerbread-house-parade-float1.jpg
 
inresponse.jpg


I think the 5870 is currently the best offering for the money in a single GPU. But we do not know for how long.

Give the green team time to show their cards. Who knows what they have to offer. Both made solid offerings in the last generation, so assuming Nvidia won't this time would be silly. Competition saves us money and helps push technology forward.
 

That's very interesting and weird at the same time. We can take some info from that picture, and we could take more if it wasn't so confusing. Let me explain. According to this picture of GT200:

gt200b.jpg


We can relate some shapes to the actual units like this:

gt200a.jpg

gt300a.jpg


1- Shader Processor SIMD cluster. In G92 it had 16 SP (2x8), in GT200 it had 24 (8x3), and in GT300 it should have 32 (4x8). In GT300 we can see 16 such clusters, and that would amount to 32x16 = 512 SP. Everything seems OK (a quick sketch of this math follows after this post).

2- Texture Units. In G80, G92 and GT200 they are always next to the SPs, and each unit consists of 8 texture processors. In GT200, 8x10 = 80. In GT300 we see 8x16 = 128 (OK according to the leaked specs). But is that correct? We can see many other units with the same appearance all around (marked with dark green), in the middle and at the left end of the chip.

3- Raster Units. The yellow rectangle should contain 8 of them in GT200 (4x8 = 32), same for GT300, I'd suppose. But that way we would be talking about 10 clusters, or 80 ROPs. That is, unless some of those are not ROPs and are something else, because all of them have a unit similar to a texture cluster next to them.

4- Memory. Caches and registers. In GT300 there are many areas between the different units that could be caches too; GT300 is much less organized than GT200, probably to save space.

5- That must be the setup engine and thread dispatch processor. While we see two different recognizable shapes in GT200, in GT300 we can see 5.

So what do you guys think those units that I marked in blue and dark green are? IMO the units that look like what's circled in blue are the same thing, and they are very similar to the ROPs in GT200, but they are next to what look like texture units (dark green), and there are too many of them too. Share your thoughts.
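Here is the quick sketch promised above; it only restates the cluster arithmetic from the list, and the GT300 figures are read off a leaked die shot, so treat them as speculation:

```python
# Unit counts implied by the die-shot reading above.
# G92 and GT200 numbers are the known shipping configurations;
# the GT300 numbers are speculation from the leaked shot.
def total_units(clusters, units_per_cluster):
    return clusters * units_per_cluster

print("G92 SPs:   ", total_units(8, 16))    # 8 clusters x 16 SP  = 128
print("GT200 SPs: ", total_units(10, 24))   # 10 clusters x 24 SP = 240
print("GT300 SPs: ", total_units(16, 32))   # 16 clusters x 32 SP = 512 (if 32 per cluster)
print("GT200 TMUs:", total_units(10, 8))    # 10 x 8 texture units = 80
print("GT300 TMUs:", total_units(16, 8))    # 16 x 8 texture units = 128 (matches the leaked specs)
```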
 
Let's just take a look at the picture, and ignore all the other leaked info...

04151351141.jpg


There are 5 yellow boxes, 5 blue boxes, 3 white boxes, and 3 black boxes (in the centre; ignore the top and bottom black).

But if you look at the yellow lines that I drew, every yellow box has a set of white, blue, and black boxes.
I don't know much about technical stuff, but if you just look at the picture this way, doesn't it make sense?
 
Kid, you've got a good eye for patterns.
 
Codename: FERMI

Some more news snippets on the new chip:

The chip supports GDDR5 memory, has billions of transistors and it should be bigger and faster than Radeon HD 5870. The GX2 dual version of card is also in the pipes. It's well worth noting that this is the biggest architectural change since G80 times as Nvidia wanted to add a lot of instructions for better parallel computing.

Read the rest at Fudzilla.
 
Some more news snippets on the new chip:

The chip supports GDDR5 memory, has billions of transistors and it should be bigger and faster than Radeon HD 5870. The GX2 dual version of card is also in the pipes. It's well worth noting that this is the biggest architectural change since G80 times as Nvidia wanted to add a lot of instructions for better parallel computing.

Read the rest at Fudzilla.

I think it will most likely repeat what happened last generation: Nvidia creating a more powerful GPU at a higher price point. Nvidia likes to be king of the hill, but performance per dollar wins over most of us. Therefore, ATi will easily be more competitive on the price side.

The only reason I buy Nvidia products more is that I always have driver problems using ATi cards, like the flickering even after calibrating my monitor. It might just be my luck, or simply having a more sensitive optic nerve than most. I had those problems with 4-series ATi cards and not 3-series or before.
 
I think it will most likely repeat what happened last generation: Nvidia creating a more powerful GPU at a higher price point. Nvidia likes to be king of the hill, but performance per dollar wins over most of us. Therefore, ATi will easily be more competitive on the price side.

I'll +1 that. This is exactly the situation with the 4870 & 4890 cards.
 
The only reason I buy Nvidia products more is that I always have driver problems using ATi cards, like the flickering even after calibrating my monitor. It might just be my luck, or simply having a more sensitive optic nerve than most. I had those problems with 4-series ATi cards and not 3-series or before.

I don't understand the flickering. The amount you see depends on the refresh rate (on a CRT monitor only), your eyes and the lighting environment. 60 Hz will look horrible and 85 Hz is generally unnoticeable. 100 Hz is the daddy, but running that at decent resolutions with clarity is beyond most CRTs (those that are still around, anyway). The cards blur out too at higher analog bandwidths.
 
I don't understand the flickering. The amount you see depends on the refresh rate (on a CRT monitor only), your eyes and the lighting environment. 60 Hz will look horrible and 85 Hz is generally unnoticeable. 100 Hz is the daddy, but running that at decent resolutions with clarity is beyond most CRTs (those that are still around, anyway). The cards blur out too at higher analog bandwidths.

Monitor compatibility is not 100% with ATi, to my knowledge. It doesn't have a problem with my high-end graphics CRTs, but my BenQ 1080p LCD and my 65" Samsung DLP have slight flickering in 3D gaming with a couple of different 4850s I have had. The problem was apparent with all driver versions available at the time. My Samsung LCD did fine, too.

The same displays did fine with numerous Nvidia offerings, though. Like I said, it might have been bad luck, so I am not saying everybody else encounters it, because it obviously doesn't present itself with many displays.

BTW, they might have fixed it with the recent 5 series, because I remember them saying some sort of cycle was shortened to fix the problem I was experiencing and make it unnoticeable. I read it in the recent Tom's Hardware article on the 5870. I will link it when I have the time to find it again.
 
Monitor compatibility is not 100% with ATi, to my knowledge. It doesn't have a problem with my high-end graphics CRTs, but my BenQ 1080p LCD and my 65" Samsung DLP have slight flickering in 3D gaming with a couple of different 4850s I have had. The problem was apparent with all driver versions available at the time. My Samsung LCD did fine, too.

The same displays did fine with numerous Nvidia offerings, though. Like I said, it might have been bad luck, so I am not saying everybody else encounters it, because it obviously doesn't present itself with many displays.

BTW, they might have fixed it with the recent 5 series, because I remember them saying some sort of cycle was shortened to fix the problem I was experiencing and make it unnoticeable. I read it in the recent Tom's Hardware article on the 5870. I will link it when I have the time to find it again.

Sounds like driver bugs to me - I've seen this sort of thing at work a couple of times (desktop, no games of course). Turning the monitor off and on again cured it, I think; I can't quite remember now. If ATI don't fix them, they deserve to lose customers.

The 5-series thing is that the clock switching between 2D & 3D has been improved, to prevent a glitch artifact from being visible, or at least to make it less visible. The glitch was due to the use of GDDR5.
 
Some more news snippets on the new chip:

The chip supports GDDR5 memory, has billions of transistors and it should be bigger and faster than Radeon HD 5870. The GX2 dual version of card is also in the pipes. It's well worth noting that this is the biggest architectural change since G80 times as Nvidia wanted to add a lot of instructions for better parallel computing.

Read the rest at Fudzilla.

Hmmm, the chip supports GDDR5, who would have thought that? And billions of transistors :banghead:, nice indeed.

Bigger? (Card length and/or chip?) And faster, maybe, and pretty likely, since they always had the better shader architecture, or at least the more efficient one...

And are they telling me that the GX2 card is a completely new card and not simply 2x GT300? If that's the case, you will have to pay more than royally for a card like that.

I'll wait and see, since I like both teams in some way: nVidia for their performance and drivers, and ATi for their mostly nice price-performance ratio.
 
Some more news snippets on the new chip:

The chip supports GDDR5 memory, has billions of transistors and it should be bigger and faster than Radeon HD 5870. The GX2 dual version of card is also in the pipes. It's well worth noting that this is the biggest architectural change since G80 times as Nvidia wanted to add a lot of instructions for better parallel computing.

Read the rest at Fudzilla.

What the hell is it supposed to be? GT200 already has billions of transistors, and it is still bigger than the HD 5870, and GT300 must be faster than the HD 5870 :rolleyes:, if Nvidia still wants to compete with ATI.

@DaedalusHelios: I think the ATI HD 5870 uses error correction and a powerful memory controller, and that's why ATI can now lower their GDDR5 frequency and still not suffer from flickering or artifacts (though it makes OC a bit more difficult).

I hope Nvidia can bring the GT300 ASAP, so I can compare the performance and decide which card I will buy.
 
Last year's threads over the next-gen GPUs were so much more thought-provoking... a lot of people would say they were sticking with the company that they'd used forever, and others waited patiently and could make sense of the reports they were receiving. It seems like people need a translator. Usually whisper-down-the-lane through 4-5 people makes the information a bit haggard, but come on! Even directly from the news source, people don't quite get the picture.

GT300 will be DX11, it has a high transistor count (NDA, people), it uses GDDR5, and the architecture has changed significantly. The leaked photos of the chip seem to suggest something to that effect. There are also a number of sources saying nV is still on track for their Nov 27th release and may have some info leaked beforehand to generate hype. While it's speculation at best, they're still saying something, and it's as much as anyone can say without breaking NDA.
 
That's very interesting and weird at the same time. We can take some info from that picture, and we could take more if it wasn't so confusing. Let me explain. According to this picture of GT200:

http://img.techpowerup.org/090927/gt200b.jpg

We can relate some shapes to the actual units like this:

http://img.techpowerup.org/090927/gt200a.jpg
http://img.techpowerup.org/090927/gt300a.jpg

1- Shader Processor SIMD cluster. In G92 it had 16 SP (2x8), in GT200 it had 24 (8x3), and in GT300 it should have 32 (4x8). In GT300 we can see 16 such clusters, and that would amount to 32x16 = 512 SP. Everything seems OK.

Comparing those two die shots, the alleged GT300 die is holding 384 shaders, not 512. Look at those shader core blocks (the red rectangle you marked on the GT200). There are 10 of those on the GT200, so 240/10 = 24. They bear a strong resemblance to the ones on the alleged GT300 die shot. There are 16 of those on that die. 16 x 24 = 384.
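A minimal sketch of that alternative reading, assuming the blocks on the alleged GT300 die hold the same 24 shaders each as GT200's:

```python
# Alternative shader-count estimate: scale GT200's known per-cluster
# count to the 16 clusters counted on the alleged GT300 die shot.
gt200_total_sp = 240
gt200_clusters = 10
sp_per_cluster = gt200_total_sp // gt200_clusters    # 24

gt300_clusters = 16   # counted on the leaked shot
print(gt300_clusters * sp_per_cluster)               # 384, not 512
```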
 