
NVIDIA GeForce 4XX Series Discussion

Looks damn tasty on paper... can't wait to see it run, though. Also can't wait to see it OC :D.

WHERE ARE ALL THE LEAKED CHINESE BENCHES!?! *shakes fists at sky*
Not even the Chinese got a card yet :respect:
I won't be surprised if Q1 2010 means March...
Or it could be that nVidia is doing a good job keeping it secret...
 
Not even the Chinese got a card yet :respect:
I won't be surprised if Q1 2010 means March...
Or it could be that nVidia is doing a good job keeping it secret...

It probably does mean March. I know in the company I work for, March is the start of Q1; January is still Q4.
 
Some GF100s in a Maingear PC


http://www.guru3d.com/news/geforce-gtx-380-sli-photo/

That guy looks gigantic in relation to that mid-tower, lol.
 
With 3-D interconnects, it can vertically connect two much smaller die. Graphics performance depends in part on the bandwidth for uploading from a buffer to a DRAM. "If we could put the DRAM on top of the GPU, that would be wonderful," Chen said. "Instead of by-32 or by-64 bandwidth, we could increase the bandwidth to more than a thousand and load the buffer in one shot."

Based on any defect density model, yield is a strong function of die size for a complicated manufacturing process, Chen said. A larger die normally yields much worse than the combined yield of two die with each at one-half of the large die size. "Assuming a 3-D die stacking process can yield reasonably well, the net yield and the associated cost can be a significant advantage," he said. "This is particularly true in the case of hybrid integration of different chips such as DRAM and logic, which are manufactured by very different processes."

http://www.semiconductor.net/article/print/438968-Nvidia_s_Chen_Calls_for_Zero_Via_Defects.php
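Chen's yield point is easy to sanity-check with a toy model. Here's a minimal sketch, assuming the classic Poisson defect-density yield model (Y = e^(-D*A)) plus known-good-die testing (each half is tested before stacking, so bad halves get discarded instead of blindly paired); every number below is invented for illustration and none of it comes from the article:

```python
import math

def poisson_yield(defects_per_cm2: float, die_area_cm2: float) -> float:
    """Poisson defect-density yield model: Y = exp(-D * A)."""
    return math.exp(-defects_per_cm2 * die_area_cm2)

D = 0.5              # defects/cm^2 (made up; plausible for an immature node)
A_BIG = 5.0          # cm^2, one large monolithic die (made up)
A_HALF = A_BIG / 2   # cm^2, each die of a two-die stack
WAFER = 700.0        # cm^2 of usable silicon per 300mm wafer (rough)

# Monolithic: candidates per wafer times per-die yield
big_good = (WAFER / A_BIG) * poisson_yield(D, A_BIG)

# Stacked: test each half first, then pair only the good halves
half_good = (WAFER / A_HALF) * poisson_yield(D, A_HALF)
stacked_good = half_good / 2           # two good halves per finished product
                                       # (ignores stacking/assembly losses)

print(f"monolithic good products/wafer: {big_good:.0f}")     # ~11
print(f"stacked    good products/wafer: {stacked_good:.0f}")  # ~40
```

Note that if you paired halves blindly, pure Poisson would make the two approaches tie (e^(-D*A/2) squared equals e^(-D*A)); the advantage in this sketch comes from testing the halves before stacking, on top of the hybrid-process benefit Chen mentions.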

:eek::eek::eek:

Wow!!! Even John Chen would like some more serious bandwidth, upwards of (1024 x 8 = 8192-bit bus!!!) :rockout: At least it's gonna be ~50GB/s more than the GTX 285, which already has more bandwidth than the 5870! Bring on those vertically stacked dies!

Also, the GT300 looks like it will be the first card ever to break the 200GB/s mark. That's a new milestone, after the 8800 Ultra was the first to reach the 100GB/s mark. 113% more shaders and 67% more TMUs over a GTX 285 ain't too shabby, either!

What I'm a bit unsure about is whether those 512 shaders will actually be running at 1700MHz. I was expecting no more than 1600MHz, but wow... seems the third spin on 40nm is finally yielding better results.
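For anyone who wants to check the numbers being thrown around: memory bandwidth is just bus width times per-pin data rate, GB/s = (bus bits / 8) x Gbps per pin. A quick sketch with the known cards plus the rumored GTX 380 configuration (384-bit GDDR5 at 4.4 Gbps, from the leaked specs, so treat that entry as speculation):

```python
def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Memory bandwidth in GB/s: (bus width / 8 bits per byte) * data rate per pin."""
    return bus_width_bits / 8 * gbps_per_pin

cards = {
    "8800 Ultra (384-bit GDDR3 @ 2.16 Gbps)":     (384, 2.16),
    "GTX 285 (512-bit GDDR3 @ 2.484 Gbps)":       (512, 2.484),
    "HD 5870 (256-bit GDDR5 @ 4.8 Gbps)":         (256, 4.8),
    "GTX 380? (384-bit GDDR5 @ 4.4 Gbps, rumor)": (384, 4.4),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, rate):.1f} GB/s")
```

That gives ~103.7, ~159.0, ~153.6, and ~211.2 GB/s respectively, which lines up with the ~50GB/s-over-GTX-285 figure above.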
 
I'm ready to see NV's goods and all the prices relative to them. I'm ready for the GTXXX Triple X B.A.M.F. Edition 3xx.
 
Me want 3 GTX380s in tri-SLI so I can play Ultimate DOOM :D
 
Fermi looks very promising... can't wait to see what they have to offer... :)
 
I find it really dramatic how fast things can change, let alone market-wise! :shadedshu
No competition means... ATI/AMD will be the new Nvidia. :ohwell:
Doesn't seem too wonderful, even to me...
 
Uhmm, they look good.

The plastic cooler design is similar to the single-PCB 295.

I see a plastic cover, and a second metal plate for cooling the RAM.



Seems like Nvidia is really struggling with this!

Yes, the GTX 295, 285, and 280 have been out of stock for 3 months.
 
They have to order wafers 3-5 months in advance, so it's not really a surprise that there's such a shortage. They had a much better second half (Q3 and Q4) than they expected, so now they have no cards to sell. This goes for AMD too; it's just that AMD ordered more than Nvidia, because Nvidia thought they would not keep selling many GTX2xx cards (logically) and they have been selling more than ever, which is ironic TBH.

Those specs for the GTX380 and GTX360 are impressive, BTW. I've made an Excel sheet calculating how different past Nvidia cards (from the 9800GT to the GTX285) performed in relation to their specs, and they scale linearly with GFLOPS except for two cards (GTS250 and GTX275), which are ROP-limited. In any case, GTX 3xx cards have more of everything, and to a greater extent than the 9-series-to-200-series jump. Going by the specs, and assuming nothing goes extremely wrong, the GTX380 could easily end up being faster than the HD5970!!

Here is the Excel chart (performance numbers from http://www.techpowerup.com/reviews/ASUS/EAH5970/31.html):

[Image: Nvidia_comparison.jpg]
[Image: perfrel_1920.gif]


The chart on top is just informative; pay attention to the one below, which has been normalized to show performance scaling. Look how performance scales linearly with GFLOPS, except for those two cards, which scale with pixel fillrate (the case of the GTS250/9800GTX is very well known). That's why I assumed in the chart that the GTX380 will perform between 325% (fillrate-limited) and 518% (FLOPS-limited) of a 9800GT, which would translate to 95-150% in Wizzard's chart, and 80-100% in the case of the GTX360. The HD5870 is at 71%...

According to these specs, Fermi has to fail badly in order not to be much faster. Just WOW!!! We'll see.
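For the curious, here's roughly what that spec-scaling estimate looks like in code. It's a sketch of the rule of thumb above, not a benchmark; the GTX 380's ROP count and core clock (48 ROPs at 650MHz) are my assumptions, chosen because they reproduce the 325%/518% range quoted above:

```python
# Toy spec-scaling estimate in the spirit of the Excel sheet above.
# Rule of thumb: performance tracks shader GFLOPS unless the card is
# ROP-limited, in which case pixel fillrate caps it.
# Tuples: (shaders, shader clock MHz, ROPs, core clock MHz).
cards = {
    "9800GT":  (112, 1500, 16, 600),
    "GTX285":  (240, 1476, 32, 648),
    "GTX380?": (512, 1700, 48, 650),   # rumored/assumed, unverified
}

def gflops(shaders: int, shader_mhz: int) -> float:
    # 3 flops/clock (MADD+MUL) on G92/GT200; the constant cancels in ratios
    return shaders * 3 * shader_mhz / 1000

def fillrate(rops: int, core_mhz: int) -> float:
    return rops * core_mhz / 1000      # Gpixels/s

base_f = gflops(*cards["9800GT"][:2])
base_p = fillrate(*cards["9800GT"][2:])
for name, (s, sc, r, cc) in cards.items():
    lo = fillrate(r, cc) / base_p      # fillrate-bound estimate
    hi = gflops(s, sc) / base_f        # flops-bound estimate
    print(f"{name}: {lo:.2f}x .. {hi:.2f}x vs 9800GT")
```

The GTX380 row comes out at 3.25x-5.18x the 9800GT, i.e. the 325-518% window; whether real performance lands anywhere in that window is exactly what we'll see at launch.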
 
No high-end cards to sell.

The competitive situation just gets better for ATI every day.

That is sad... you'd think that Nvidia would have thought about this in advance. Think about it: if you are a higher-end gamer, you can't get the top of the line (aka the 5870/5970), and you need a new video card, then where's the next logical place to go? The next best thing, which would be last gen's top of the line. So of course they are selling out. Anyone with half a brain could have seen this coming. Sad for Nvidia's partners, though!

I find it really dramatic how fast things can change, let alone market-wise!
No competition means... ATI/AMD will be the new Nvidia.
Doesn't seem too wonderful, even to me...

God, I sure hope not. While I expect prices to be high, I am hoping that, with ATI having shown Nvidia how well their cards sold when they were priced right, there won't be huge prices like back in the day with the 8800s and 2900XTs.
I was expecting more of an even price battle between the two companies from now on. But if Nvidia doesn't get Fermi out soon... you are right, we might see ATI turn into a form of Nvidia's old self.
Now, that was the financial part. But with the release of this story and others like it... it seems that Fermi might be stumbling really badly. And only one real GPU maker at the moment is not good for anyone, because if Fermi fails there is no reason to increase performance either.

I'm still really hoping these Fermi cards come out soon!
 
I'm quite convinced Fermi is going to be a monster of a card. Now that we have the numbers, we can do the math and the sum = awesome! I'm just hoping they aren't too expensive, because if this thing performs like I think it will on paper I really, really want one. :)
 
I'm quite convinced Fermi is going to be a monster of a card. Now that we have the numbers, we can do the math and the sum = awesome! I'm just hoping they aren't too expensive, because if this thing performs like I think it will on paper I really, really want one.

That's what we were kind of describing when we were talking about the 2900XT, though. While these numbers are impressive, so were the 2900XT's. It could turn out to be a flop.
Although, just like you, I hope not. I hope it is a beast... but I also hope that it is not as one-sided as last gen, GTX 200 vs. the 4800 series. I am hoping that both sides are very comparable, because that would be the best for prices on both fronts.

But if those specs are true... I think it might be a repeat of last gen.

Wizzard's chart, and 80-100% in the case of the GTX360. The HD5870 is at 71%...

So with that being said... the thought here is that, in theory, a single GTX 360 might be as powerful as a 5970? God, for the sake of competition, I sure hope not! Because if that were the case there would be no reason for ATI to even make a card anymore. Then we would have problems again!
 
That's what we were kind of describing when we were talking about the 2900XT, though. While these numbers are impressive, so were the 2900XT's. It could turn out to be a flop.

Let's hope not. The 2900XT basically ruined ATi. I'm going to stay optimistic here.
 
That's what we were kind of describing when we were talking about the 2900XT, though. While these numbers are impressive, so were the 2900XT's. It could turn out to be a flop.
Although, just like you, I hope not. I hope it is a beast... but I also hope that it is not as one-sided as last gen, GTX 200 vs. the 4800 series. I am hoping that both sides are very comparable, because that would be the best for prices on both fronts.

But if those specs are true... I think it might be a repeat of last gen.

+1
 
After really looking at and thinking about the specs that were released with the pictures: while I do think these cards are going to be very powerful (and maybe will outpower the current 5800 series), I don't think that the Fermi GTX380 will beat a 5970 by 15%, or that the GTX360 will come in between the 5970 and the 5870, possibly even being as powerful as the 5970.
I just don't see that happening. After going back and reading the article... I noticed that while they released the possible numbers, they didn't really comment on the performance. When Fermi was first really talked about, right before the release of the 5870, Nvidia informants said that "they are quite confident that it will outperform ATI's newest offerings." While that still might be the case, I don't think it is to the degree that they originally thought.
Because if it did, why wouldn't they run around stating it? Not necessarily giving the numbers, but Nvidia would be bragging a lot more than they are now.
I do think that these are very impressive specs!!!! But I think the idea that the GTX 380 will beat a 5970 by 15% is just a great PR campaign.
I just can't see it happening... with or without that chart.
 
No new news on Fermi, I see :ohwell:
 
That's what we were kind of describing when we were talking about the 2900XT, though. While these numbers are impressive, so were the 2900XT's. It could turn out to be a flop.

Coincidences stop there, though. R600 was a completely new design that went from 48 powerful pixel shaders + 8 vertex shaders = 56 shader cores to 64 powerful shader cores (5-way). Fermi is moving from the well-known and familiar 240 powerful scalar cores to 512 of the same kind of scalar cores, with improved parallelism and efficiency (dual scheduler, concurrent kernel execution...). R600 was fast compared to the previous generation anyway; it's just that Nvidia went from 24 pixel shaders to 128 SPs running at 2x the speed, and in recent games G80/G92 is actually 3x-4x faster than the 7900GTX (the X1950XTX is almost twice as fast). So it's different; nothing suggests a change for the worse. It could be a flop, but there are no signs, whereas with R600 there were some signs for those who knew what to look for.

I just don't see that happening. After going back and reading the article... I noticed that while they released the possible numbers, they didn't really comment on the performance. When Fermi was first really talked about, right before the release of the 5870, Nvidia informants said that "they are quite confident that it will outperform ATI's newest offerings." While that still might be the case, I don't think it is to the degree that they originally thought.
Because if it did, why wouldn't they run around stating it? Not necessarily giving the numbers, but Nvidia would be bragging a lot more than they are now.
I do think that these are very impressive specs!!!! But I think the idea that the GTX 380 will beat a 5970 by 15% is just a great PR campaign.
I just can't see it happening... with or without that chart.

I think quite the opposite from that sentence. Until now it didn't make sense to me. With those specs, they were doubting they would be faster? And what's more, they would share their doubts? What a marketing move... IMO he was referring to the HD5970 when he said that (in fact, IIRC they did mention Crossfire scaling in the same statement, as in "we've seen their performance and scaling, I think we are faster"). Keep in mind that AMD has long followed the two "small" dies against one big die strategy; they don't make a differentiation between single and dual-GPU (at least that's their goal), so there has to come a time when Nvidia starts trying to make their one die faster than AMD's two dies. Not only in their labs, but also in their public statements; overall, that has to be their company goal.

Just my opinion. :toast:
 
I have no brand loyalty, so I will buy whatever is the best mid-range gear. It's only common sense to wait and see what Nvidia will bring; that way, even if it sucks or is too pricey, I can fall back on hopefully cheaper ATi cards.

Jesus, let's hope the Fermi core is not like the HD 2000 series, as that had good paper stats but sucked. I almost bought a 2900 Pro; then a month or so later the 8800 GT came out and I was like "yeeeesh," like M. Bison!
 
I think quite the opposite from that sentence. Until now it didn't make sense to me. With those specs, they were doubting they would be faster? And what's more, they would share their doubts? What a marketing move... IMO he was referring to the HD5970 when he said that (in fact, IIRC they did mention Crossfire scaling in the same statement, as in "we've seen their performance and scaling, I think we are faster"). Keep in mind that AMD has long followed the two "small" dies against one big die strategy; they don't make a differentiation between single and dual-GPU (at least that's their goal), so there has to come a time when Nvidia starts trying to make their one die faster than AMD's two dies. Not only in their labs, but also in their public statements; overall, that has to be their company goal.

Just my opinion.

I don't think I'm following you to a tee, but I will still try to respond. So if I don't get my answer correct, please clarify what you said.
So what you're telling me is that ATI/AMD doesn't differentiate their double-die cards from single-die ones. And you believe they said that sentence just to lure ATI into believing that they had Nvidia's Fermi on the run?
And you are saying that from Nvidia we will see a GTX380 (or variants of it) as their top card, and ATI's dual-GPU 5970 as theirs.
So you don't think that Nvidia will respond with a dual-GPU card themselves? Something like a GTX395?

If that is what you are saying... I could see why Nvidia would use that tactic against ATI, making ATI think Nvidia is worried. Also, if they used just one large die instead of two, Nvidia might be able to keep the price down.

But with that said, these new-gen cards are really powerful. IMO I still can't see it happening (being outpowered by a single GPU). If you remember just recently, when the 5800 series was about to release... the specs made it look like it should kill the 4870X2 and also the GTX 295. When the card eventually released, it came close, but it wasn't as great as everyone thought it would be.
That same thing seems to happen every time any video card comes out, going back from the GTX 200s to the 4800 series to the 9800 GTXs. The only one that happened to be right on the mark, IMO, was the 8800 series by Nvidia. The whole point is that every new card looks like a killer on paper, but there are a lot of things contributing to it... not only does it depend 50% on hardware, but the other 50% depends on drivers as well.

Now, I won't go any further with that because, again... it might just be me reading it wrong, or maybe you were just typing really fast, but I'm not following you completely and want you to explain further so I don't get it wrong :)

I will admit that some of my thoughts come from red-team brand loyalty. But to be honest, I am always for any company that takes the technology to the next level. Because of course that keeps pushing things forward!

Afterthought... you seem to be someone who knows a great deal about Nvidia cards and the company. What is your opinion on the prices they should bring these great cards out at? And since Nvidia, from what I heard, seems to be in financial trouble, what do you think would happen to them if they priced them too high?
 
Well, here's my 2 cents: Nvidia's Fermi cards sound awesome, we just have to wait and see. And I hope ATI releases an HD5990 4GB card, that would be awesome :D:D:D
 
I don't think I'm following you to a tee, but I will still try to respond. So if I don't get my answer correct, please clarify what you said.
So what you're telling me is that ATI/AMD doesn't differentiate their double-die cards from single-die ones. And you believe they said that sentence just to lure ATI into believing that they had Nvidia's Fermi on the run?
And you are saying that from Nvidia we will see a GTX380 (or variants of it) as their top card, and ATI's dual-GPU 5970 as theirs.
So you don't think that Nvidia will respond with a dual-GPU card themselves? Something like a GTX395?

If that is what you are saying... I could see why Nvidia would use that tactic against ATI, making ATI think Nvidia is worried. Also, if they used just one large die instead of two, Nvidia might be able to keep the price down.

But with that said, these new-gen cards are really powerful. IMO I still can't see it happening (being outpowered by a single GPU). If you remember just recently, when the 5800 series was about to release... the specs made it look like it should kill the 4870X2 and also the GTX 295. When the card eventually released, it came close, but it wasn't as great as everyone thought it would be.
That same thing seems to happen every time any video card comes out, going back from the GTX 200s to the 4800 series to the 9800 GTXs. The only one that happened to be right on the mark, IMO, was the 8800 series by Nvidia. The whole point is that every new card looks like a killer on paper, but there are a lot of things contributing to it... not only does it depend 50% on hardware, but the other 50% depends on drivers as well.

Now, I won't go any further with that because, again... it might just be me reading it wrong, or maybe you were just typing really fast, but I'm not following you completely and want you to explain further so I don't get it wrong :)

I will admit that some of my thoughts come from red-team brand loyalty. But to be honest, I am always for any company that takes the technology to the next level. Because of course that keeps pushing things forward!

Afterthought... you seem to be someone who knows a great deal about Nvidia cards and the company. What is your opinion on the prices they should bring these great cards out at? And since Nvidia, from what I heard, seems to be in financial trouble, what do you think would happen to them if they priced them too high?

What I was saying is that AMD shifted their strategy towards multi-GPU as the way to create the high-end parts. In theory, their desire is to create small dies to serve the mainstream and performance price points and then put them together to create the high end. When they presented this strategy, they said that soon they would be putting 4, 6, or 8 small dies together in order to create different performance levels. This is in absolute contrast to Nvidia's strategy of creating the biggest modular design they can and then cutting it down to create the mainstream products.

My comment was about that divergence in focus. At Nvidia, when they start designing their chip, they have to aim for the dual-GPU card to be on the safe side, even if they are going to create a dual-GPU card themselves, because when the project starts, 3-4 years before it reaches stores, they don't know what AMD will do. What if AMD puts 3 or 4 dies on a card, for example?

About pricing, it's really hard to say. We can make a guesstimate about how much it will cost Nvidia to create the cards, but we don't know how much they will charge; it will depend on the market, demand, performance, etc. About production costs, once 40nm yields improve, these will be cheaper to produce than GT200 cards (smaller die, 384-bit vs. 512-bit bus), so if needed, or if they simply want to, they can sell them at very similar prices to ATI cards* without sacrificing profits like they did with GT200.


*Reasons being:

- HD5xxx cards cost more to produce than HD4xxx cards: bigger die, the fastest GDDR5 memory.
- Nvidia will apparently use cheaper, slower GDDR5 memory, which will alleviate the price difference a bit.
- Nvidia will sell Fermi Tesla cards (technically the same thing) in the HPC market, and depending on how well they do there, they will be able to adapt their profit requirements in the GPU market and compete better. Profits per card are 10-20 times bigger in the HPC market, so in essence every Tesla card sold could remove the need to make a profit on 10-20 GeForce cards if really required (quick sketch below).
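To put that last bullet in numbers, here's a quick back-of-the-envelope; every dollar figure is invented for illustration, and only the 10-20x profit ratio comes from the reasoning above:

```python
# Back-of-the-envelope for the Tesla cross-subsidy argument.
geforce_profit = 20.0            # assumed profit per GeForce card, $ (made up)
hpc_multiplier = 15              # the post says 10-20x; take the middle
tesla_profit = geforce_profit * hpc_multiplier   # $300/card under these assumptions

tesla_sold = 10_000              # hypothetical Tesla volume
subsidy_pool = tesla_sold * tesla_profit

# Number of GeForce cards that could be sold at zero profit before the
# extra Tesla profit is used up:
offset_cards = subsidy_pool / geforce_profit
print(f"{tesla_sold:,} Tesla sales offset {offset_cards:,.0f} zero-profit GeForce cards")
```

In other words, the 10-20x ratio alone is what lets each Tesla sale cover the margin of 10-20 GeForce cards, whatever the absolute numbers turn out to be.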

One thing is sure: they will always be competitive, at least on cards that directly compete with AMD cards, and the faster ones will be forced to come down too or they will become worthless. This makes the slower cards adapt again, and the ball keeps rolling.

Nvidia having faster cards doesn't hurt competition as much as people think. The GTX260 did come down in price a lot (so did the 8800GT at the time) because it competed with the HD4870; only the GTX280/285 remained expensive. And if the prerequisite for competition is that Nvidia doesn't have a faster card, then the undeniable truth is that the GTX280/285 and that performance level would never have existed in the first place.

The feeling of a lack of competition is subjective and abstract. People look at the GTX285, want it, and think there's no competition at that price point because it's expensive and doesn't make sense on a performance/price basis, and hence they think it would have been better if Nvidia hadn't outperformed AMD. Well, if the GTX285 had never existed (or had performed like an HD4870), that performance level would never have existed in the first place, and the HD4870/GTX260 (or a GTX280 with the same performance as the HD4870) would probably have cost much more than they did; they would both have been priced as premium cards instead of "second in charge" cards. What I'm trying to say is that the prerequisite for competition is that Nvidia releases a card with performance similar to the HD5870; how many cards they have above that level is irrelevant. The ideal thing is not for Nvidia to fail to outperform the HD5870 now; the ideal situation would have been for the HD5870 to already be at the performance point where the GTX380 specs suggest it will land.

Sorry for the long response.
 