# NVIDIA GT300 Already Taped Out



## btarunr (May 17, 2009)

NVIDIA's upcoming next-generation graphics processor, codenamed GT300, is on course for launch later this year. Its development seems to have crossed an important milestone, with news emerging that the company has already taped out some of the first engineering samples of the GPU, under the A1 batch. The development is significant since this is the first high-end GPU to be designed on the 40 nm silicon process. Both NVIDIA and AMD, however, are facing issues with the 40 nm manufacturing node of TSMC, the principal foundry partner for the two. For this reason, the chip might be built by another foundry partner (yet to be named) that the two are reaching out to. UMC could be a possibility, as it has recently announced that its 40 nm node is ready for "real, high-performance" designs.

The GT300 comes in three basic forms, perhaps differentiated by chip binning: G300 (which makes it into consumer graphics, the GeForce series), GT300 (which makes it into high-performance computing products, the Tesla series), and G200GL (which makes it into professional/enterprise graphics, the Quadro series). From what we know so far, the core features 512 shader processors, a revamped data processing model in the form of MIMD, and a 512-bit wide GDDR5 memory interface that churns out around 256 GB/s of memory bandwidth. The GPU is compliant with DirectX 11, which makes its entry with Microsoft Windows 7 later this year and can already be found in release candidate versions of the OS.
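The quoted 256 GB/s figure follows directly from the bus width and GDDR5's effective per-pin data rate. A quick sanity check; the 4 Gbps effective rate assumed here is a plausible GDDR5 speed, not a confirmed GT300 memory clock:

```python
# Rough GDDR5 bandwidth check for the rumored GT300 memory interface.
# Assumption: 4 Gbps effective per pin (e.g. 1000 MHz base, quad data rate);
# the actual memory clock is not confirmed anywhere in this article.

bus_width_bits = 512           # rumored interface width
effective_gbps_per_pin = 4     # assumed GDDR5 effective data rate

bandwidth_gb_s = bus_width_bits / 8 * effective_gbps_per_pin
print(bandwidth_gb_s)  # 256.0, matching the quoted ~256 GB/s figure
```

Any other pin rate scales the result proportionally, so the quoted number implicitly pins down the memory clock being rumored.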

*View at TechPowerUp Main Site*


----------



## alexp999 (May 17, 2009)

One disadvantage of being a news poster and keeping up with the news is that I want stuff months before it even comes out 

I'm guessing these will be out around Christmas, along with the i5 and Windows 7.


----------



## DrPepper (May 17, 2009)

I wonder what ATI's comeback will be like. Also I wonder if nvidia will bring out a dual gpu card.


----------



## btarunr (May 17, 2009)

ATI has RV870, although nothing spectacular from its earliest specs. 



alexp999 said:


> One disadvantage of being a news poster and keeping up with the news is that I want stuff months before it even comes out



Another is that sometimes we're like the "Page 3" reporter that hangs out in the city's elite social circle, but only to report on who spilled his drink, or hung out with whom.


----------



## KainXS (May 17, 2009)

It looks like ATI might take nvidia's performance crown based on those specs


----------



## qwerty_lesh (May 17, 2009)

Whatever high end model this becomes, I want one, I want it made by Zotac and in my PC


----------



## DrPepper (May 17, 2009)

KainXS said:


> It looks like ATI might take nvidia's performance crown based on those specs



What are ATI's specs?


----------



## PlanetCyborg (May 17, 2009)

Oh, poor ATI!! I don't think the RV870 can beat that monster!


----------



## KainXS (May 17, 2009)

32 ROPs
around 2100 shaders
GDDR5

That's what the rumors say, but based on ATI's previous releases it's more than likely real.

If I had to guess, though, I would say at least 2400 shaders.


----------



## PlanetCyborg (May 17, 2009)

KainXS said:


> 32 ROPs
> around 2100 shaders
> GDDR5
> 
> ...



That is probably the 5870X2, not the 5870!


----------



## alexp999 (May 17, 2009)

Based on current architecture comparisons, ATI needs about 3.5-4 times as many shaders to match NVIDIA's performance. So around 2,000 SPs on the 5 series would seem plausible, assuming a similar architecture is used.
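That ratio can be turned into quick arithmetic against the rumored GT300 shader count. Note the 3.5-4x factor is a forum estimate from this thread, not an official figure:

```python
# Forum-estimate arithmetic: ATI's simpler shaders are reckoned (in this
# thread, not officially) to need a ~3.5-4x count advantage to match
# NVIDIA's per-shader throughput.

gt300_shaders = 512              # rumored GT300 shader count
ratio_low, ratio_high = 3.5, 4.0

equiv_low = gt300_shaders * ratio_low    # ATI-equivalent SPs, low estimate
equiv_high = gt300_shaders * ratio_high  # ATI-equivalent SPs, high estimate
print(equiv_low, equiv_high)  # 1792.0 2048.0
```

Which is why a ~2,000 SP RV870 rumor "sounds right" under this thread's assumptions.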


----------



## Howard (May 17, 2009)

Stop posting news already, just release the product!
My future rig is waiting for this beast!
eVGA X58 759 + Dominator GT 2000 MHz + this beast = invincible!!! (for at least 1-2 years) lol


----------



## happita (May 17, 2009)

PlanetCyborg said:


> That is probably the 5870X2, not the 5870!



I was gonna say the same, but I really do hope they rework the 5 series from the ground up. Haven't they had the same architecture since the 2K series? For example, 3K was just a die shrink among other things, and 4K was another speed increase; no major changes, with the exception of the 4890 being ridiculously high-binned at those 1 GHz speeds.

But nvidia seems to have another 8800-like series of video cards out for the next round... if that's the case, I'm not missing out this time; I may switch.


----------



## theorw (May 17, 2009)

Despite being an ATI user all along, I have to say that 256 GB/s plus 512 stream processors will give ATI A HAAAAAAAAAARD time!!!


----------



## Cheeseball (May 17, 2009)

3K wasn't just a die shrink. An OC'd HD 3850 or a standard HD 3870 can destroy an HD 2900 XT (the 2K-series flagship).

512-bit GDDR5 is sexy.


----------



## mtosev (May 17, 2009)

Possibly my next card, with the i7 system I'm getting soon.


----------



## LaidLawJones (May 17, 2009)

happita said:


> But nvidia seems to have another 8800-like series of video cards out for the next round... if that's the case, I'm not missing out this time; I may switch.




I concur. I am tired of looking at the number two spot the vast majority of the time. If ATI can't beat the green team this time, and by a serious margin, then I am going to stuff NV/Intel into my Spider case.


----------



## lemonadesoda (May 17, 2009)

It is getting increasingly difficult to understand how performance will scale by only counting the "number of shaders". As the architecture moves from SISD to SIMD to MIMD, predicting how this will impact CAD, or DX9, 10 or 11 rendering, is very hard, especially as you layer on effects like FSAA.
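The SIMD-vs-MIMD distinction can be sketched in a few lines. This is a conceptual contrast only, not a model of how either GPU's real pipeline works:

```python
# Conceptual illustration, not a model of any real GPU pipeline.
# SIMD: one instruction stream applied to many data elements in lockstep.
# MIMD: independent units may each run a different instruction stream.

data = [1.0, 2.0, 3.0, 4.0]

# SIMD-style: every lane executes the same operation on its own element.
simd_result = [x * 2.0 for x in data]

# MIMD-style: each "unit" runs its own program on its own data.
programs = [lambda x: x * 2.0, lambda x: x + 10.0,
            lambda x: x ** 2, lambda x: -x]
mimd_result = [f(x) for f, x in zip(programs, data)]

print(simd_result)  # [2.0, 4.0, 6.0, 8.0]
print(mimd_result)  # [2.0, 12.0, 9.0, -4.0]
```

The point of the post stands: once units can diverge like this, a raw shader count says much less about delivered performance than it did under pure SIMD.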

It may be that there is a bigger win at high resolutions, i.e. 2560x1600, or a bigger win in shader effects, i.e. 16xFSAA. Only benchmarking will tell.

I wonder whether having 3 versions of the GPU will impact CUDA abilities. If it does, i.e. different CUDA capabilities on each, then this will spell disaster for standardising CUDA (and PhysX on CUDA) enhancements.

Looking forward to more news...


----------



## toyo (May 17, 2009)

Let's hope ATI/AMD will be up to the task, or these cards will be $500+ (if not more; and that's not counting the Quadro/Tesla versions).


----------



## lemonadesoda (May 17, 2009)

nV has got to find some way to claw back all the money it lost in the last 18 months due to failed laptop GPUs... and problems with its insurance providers.

To claw that money back, expect the G300 to be WHOOPASS, but also expect a very high premium: a card at least as pricey as a GTX 295.


----------



## icon1 (May 17, 2009)

oh, this card will give ATI some headache..


----------



## crazy pyro (May 17, 2009)

As long as the midrange is around the hundred quid mark I don't mind, high end will always be horrifically highly priced.


----------



## a_ump (May 17, 2009)

I guess nvidia decided to step it up this time and kill ATI. They were surprised by the HD 4xxx series, so the way I see it, they're doing all they can to beat ATI back down instead of having ATI on NV's ass. I wonder, though: is it possible to have too many shaders? Could it be that all of them can't be utilized, kind of like how a quad core is barely utilized in games?


----------



## Animalpak (May 17, 2009)

No way for ATI to beat nvidia in performance, forget this!!

Nvidia has THE DRIVERS...

ATi has nothing compared to nvidia in terms of driver optimization.


----------



## a_ump (May 17, 2009)

Animalpak said:


> No way for ATI to beat nvidia in performance, forget this!!
> 
> Nvidia has THE DRIVERS...
> 
> ATi has nothing compared to nvidia in terms of driver optimization.



Dude, where have you been? ATI for the past year has been all over nvidia's ass; they had the performance crown for a while with the HD 4870 X2. And yeah, nvidia does have the most powerful single-GPU graphics card, but either way it still took them two GPUs on one card to get the crown back. And yeah, nvidia does tend to have better drivers at launch than ATI, but they also have a lot more access to games (TWIMTBP).

Drivers are important, but with this release I think we'll see the past repeat itself. Not like 2900 XT vs 8800 GTX, but more like 9800 GTX vs HD 3870. The GTX 285 is one hell of a card with 240 SPUs and 512-bit GDDR3. The GT300 is going to have more than twice as many SPUs, and twice the bandwidth by moving to GDDR5. I just can't see the chip not destroying ATI's RV870, which I estimate, going by the specs, will only perform 25-35% faster than the HD 4870.


----------



## R_1 (May 17, 2009)

The new HD 5870 will be like 2x HD 4770 in CrossFire on a single die, but equipped with a 256-bit memory bus and faster clocks. So the G300 must be pretty impressive to beat this little beast.


----------



## DrPepper (May 17, 2009)

KainXS said:


> 32 ROPs
> around 2100 shaders
> GDDR5
> 
> ...



I've only seen estimates of about 1200 for the RV870, so assuming an X2, an R800 at 2400 would be about right.


----------



## a_ump (May 17, 2009)

I hope the estimated 1200 shaders on the RV870 are fine-tuned and better than the ones on the RV770, because as for the 512 nvidia shaders, even though they're different from the 240 on GT200, I can only assume nvidia will improve their shaders rather than use more-but-less-efficient ones, and thus it's going to dominate. So I hope ATI has done one hell of a job on the RV870. I wonder if the sideport that was never activated on the HD 4870 X2 will be enabled on the HD 5870 X2 and provide a decent performance jump; it would have to, I'd think, as the latency from using the onboard CrossFire chip would then be negated. Damn, I wanna see some benchies, ugh.


----------



## djisas (May 17, 2009)

a_ump said:


> Dude, where have you been? ATI for the past year has been all over nvidia's ass; they had the performance crown for a while with the HD 4870 X2. And yeah, nvidia does have the most powerful single-GPU graphics card, but either way it still took them two GPUs on one card to get the crown back. And yeah, nvidia does tend to have better drivers at launch than ATI, but they also have a lot more access to games (TWIMTBP).
> 
> Drivers are important, but with this release I think we'll see the past repeat itself. Not like 2900 XT vs 8800 GTX, but more like 9800 GTX vs HD 3870. The GTX 285 is one hell of a card with 240 SPUs and 512-bit GDDR3. The GT300 is going to have more than twice as many SPUs, and twice the bandwidth by moving to GDDR5. I just can't see the chip not destroying ATI's RV870, which I estimate, going by the specs, will only perform 25-35% faster than the HD 4870.



The 1 GHz HD 4890 already proves capable of beating the 285 in several games; I think those are already 25-35% faster than the 4870...


----------



## cooler (May 17, 2009)

The GT300 specs look nice.

How big a percentage jump will this GPU bring compared to current GPUs?


----------



## a_ump (May 17, 2009)

GRID, STALKER, Fallout 3 and Red Alert 3 are the only games I saw in [xbitlabs' review](http://www.xbitlabs.com/articles/video/display/radeon-hd4890_7.html#sect0) where the HD 4890 outperformed the GTX 285, and its performance over the HD 4870 is more like 5-15%. But I see your point: as the HD 5870 (I think it'll be called that) will have 400 more shaders, twice the ROPs, and 8 more TMUs than the HD 4870, performance should be at least 40% more, plus the memory clock bump to 1100 from 900.


----------



## DonInKansas (May 17, 2009)

40nm?  That's hot.

Or not hot?  Hope temps are nice........


----------



## iStink (May 17, 2009)

djisas said:


> The 1 GHz HD 4890 already proves capable of beating the 285 in several games; I think those are already 25-35% faster than the 4870...


here let me fix that statement for you:



djisas said:


> The *OVERCLOCKED* 1 GHz HD 4890 already proves capable of beating the *STOCK* 285 in several games; I think those are already 25-35% faster than the 4870...


----------



## enaher (May 17, 2009)

I think it sounds impressive, but I want to see performance. Remember the people that got the GTX 260 192 for $350+, and then the 4870 came and whooped its ass for less money. Something I've learned from the last year: don't buy PR and specs. Remember Phenom, the 2900 XT, and to some extent the GT200 cards, at the price they had, getting their asses handed to them by the 9800 GX2. Don't get me wrong, I want an awesome product from Nvidia, but I'd rather have an awesome product from both ATI & Nvidia, to drive prices down and quality up.


----------



## happita (May 17, 2009)

a_ump said:


> GRID, STALKER, Fallout 3 and Red Alert 3 are the only games I saw in [xbitlabs' review](http://www.xbitlabs.com/articles/video/display/radeon-hd4890_7.html#sect0) where the HD 4890 outperformed the GTX 285, and its performance over the HD 4870 is more like 5-15%. But I see your point: as the HD 5870 (I think it'll be called that) will have 400 more shaders, twice the ROPs, and 8 more TMUs than the HD 4870, performance should be at least 40% more, plus the memory clock bump to 1100 from 900.



Is that clock confirmed? Because if not, it could be a new architecture that stays around 900 but with better performance, i.e. 4870 to 4890, with better-quality materials used and whatnot, but hopefully to a greater extent, where we will really see AA not even affect games at any resolution. That's what I hope.
Yeah, and let's not forget the OCability of the newer 40 nm process; seeing that ATI has a bit more experience with the 4770, I predict it will be more efficient, with minimal leakage.


----------



## a_ump (May 17, 2009)

I agree, and 40 nm does seem to allow great OCs. The HD 4770 is 750 MHz stock and most reach 850+ from what I've seen, and taking the memory from 600 to 900 is very common as well. I was under the impression the RV870 was a very finely tuned RV770. I hope more info on the RV870 is released soon; I couldn't care less what NV brings, because just hearing these specs makes me wary for ATI.


----------



## Steevo (May 17, 2009)

All these cards are bandwidth starved. All ATI really needs to do to open up its current cores is add a larger cache on die and shrink the die. A $99 4770 performs better than the equally priced "250, AKA 9800, AKA 8800...", overclocks better, uses less energy, and runs cooler.



Hrm... seems like Nvidiots are still floating the Nvidia boat.


As to drivers: yes, Nvidia makes you feel good by releasing one driver a week in multiple forms, so you can spend all your time uninstalling, reinstalling, and trying to see if it fixes things, and then end up using whatever prevents the issues you experience. With ATI, you don't get all the fun of a bunch of drivers to try, so you just use the one that works. Damn, using only one driver that works instead of trying three or four. Seems stupid to me too. Oh well, you keep trying while I go play games.


----------



## Steevo (May 17, 2009)

iStink said:


> here let me fix that statement for you:


*(performance-per-dollar graphs)*
I know words are hard, so here are pictures. The ones above are performance per dollar.

Do you understand this?

Below is the performance at STOCK clocks for those models.

*(relative performance graphs at stock clocks)*
So let me see, spend more money for less performance............ GTX285 FTW!!!!!!


----------



## Blacksniper87 (May 17, 2009)

Fanboi much? Geez mate, I don't think there is such a thing as a bigger fanboi loser comment. Here's an idea: if you think something about nvidia, put it in logical sentences and don't call people who use their cards Nvidiots.

Hey mate, this takes the cake: the majority of your graphs are performance per dollar, and then you have graphs of relative performance, all of which nvidia wins, so do you have a point at all? Lastly, I may be relatively new here, but from what I know, you're not allowed to double post.


----------



## roofsniper (May 17, 2009)

Blacksniper87 said:


> Fanboi much? Geez mate, I don't think there is such a thing as a bigger fanboi loser comment. Here's an idea: if you think something about nvidia, put it in logical sentences and don't call people who use their cards Nvidiots.
> 
> Hey mate, this takes the cake: the majority of your graphs are performance per dollar, and then you have graphs of relative performance, all of which nvidia wins, so do you have a point at all? Lastly, I may be relatively new here, but from what I know, you're not allowed to double post.



You actually have to read the graphs to understand them. The 4890 has 21% more performance per dollar than the GTX 285; that's a big difference. The GTX 285 is only 5% more powerful. You pay a lot more for that slight performance bonus.
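Those two percentages pin down the implied price gap. A quick check, taking the thread's 21% and 5% figures at face value (both read off review graphs, not measured here):

```python
# Since perf_per_dollar = performance / price, it follows that
#   price_ratio = perf_ratio * perf_per_dollar_ratio.
# Figures below are this thread's estimates, not official pricing.

perf_ratio = 1.05             # GTX 285 vs HD 4890 performance
perf_per_dollar_ratio = 1.21  # HD 4890 vs GTX 285 perf per dollar

price_ratio = perf_ratio * perf_per_dollar_ratio
print(f"{price_ratio:.2f}")  # 1.27: GTX 285 implied ~27% more expensive
```

So a 5% performance lead at a ~27% price premium is exactly the trade-off the post is describing.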


----------



## KainXS (May 17, 2009)

I highly doubt ATI would release the 58xx series with only 1200 SPs. I think they will take advantage of the die shrink and double up on SPs like they did with the 48xx series before.

I think ATI already predicted that the GT300 would have these specs, and an RV870 with only 1200 SPs would probably give nowhere near the performance of a GT300.


----------



## Blacksniper87 (May 17, 2009)

Yes, your point? How is this different from any other time in the history of the computer industry? You always pay significantly more for the highest-end product regardless of the slight performance difference. Also, I did read the graphs, and what I noticed was that there were no overclocks of anything apart from the 4890, so...


----------



## Darren (May 17, 2009)

Blacksniper87 said:


> yes your point? This is different to when in the history of the computer industry? You always pay significantly  more for the highest end product regardless of the slight performance difference. Also i did read the graphs and what i did notice was, there was no overclocks of anything else apart from the 4890 so ..............



Would you pay 21% more for a car that runs 5% faster? NO

So why would anyone with a brain do it for a GPU or a CPU?


----------



## DrPepper (May 17, 2009)

KainXS said:


> I highly doubt ATI would release the 58xx series with only 1200 SPs. I think they will take advantage of the die shrink and double up on SPs like they did with the 48xx series before.
> 
> I think ATI already predicted that the GT300 would have these specs, and an RV870 with only 1200 SPs would probably give nowhere near the performance of a GT300.



The number of SPs isn't half as important as the way the shaders handle instructions. For example, nvidia uses a different method to ATI, and that's why nvidia can use fewer, more powerful shaders than ATI. Who's to say ATI will continue with their current shader model and not change it so they can use 1200 SPs and get the same performance as 2400? Anyway, back to the point: ATI engineers prefer to find cheap methods of adding performance, compared to nvidia's tactic of simply adding more.


----------



## theorw (May 17, 2009)

a_ump said:


> Dude, where have you been? ATI for the past year has been all over nvidia's ass; they had the performance crown for a while with the HD 4870 X2. And yeah, nvidia does have the most powerful single-GPU graphics card, but either way it still took them two GPUs on one card to get the crown back. And yeah, nvidia does tend to have better drivers at launch than ATI, but they also have a lot more access to games (TWIMTBP).
> 
> Drivers are important, but with this release I think we'll see the past repeat itself. Not like 2900 XT vs 8800 GTX, but more like 9800 GTX vs HD 3870. The GTX 285 is one hell of a card with 240 SPUs and 512-bit GDDR3. The GT300 is going to have more than twice as many SPUs, and twice the bandwidth by moving to GDDR5. I just can't see the chip not destroying ATI's RV870, which I estimate, going by the specs, will only perform 25-35% faster than the HD 4870.



+1
Animalpak WAKE UP!!!


----------



## Nick89 (May 17, 2009)

$800


----------



## djisas (May 17, 2009)

Do you guys realize that a chip with HALF (50%) the GT200's die size and a few million fewer transistors is only 5% slower than such a beast??
The HD 5870 doesn't need to beat the GT300 at its own game, just do what the 4870 did: for half the price you get 70-80% of the performance, and if that's not enough, slap two of those together for the same price and get 140-160% of the performance...
But what if they surprise us and create something that really competes with it??


----------



## DrPepper (May 17, 2009)

I'm going to sit and watch this I haven't seen a fanboy fight in a while


----------



## djisas (May 17, 2009)

Even though I'm an ATI fanboy, I've already had 2 nvidia cards and 3 ATI cards...
When talking about drivers, having experienced both, I'd say I like ATI's drivers more...
And I'm not gonna spend more than 200 on a card again like I did with the 2900 XT. It's not like there were any cheaper decent cards back then, but spending over 300, I was out of my mind. Then I spent another 200 on an 8800 GTS G92, but sold it for 150 and bought the HD 4850 I have now for 180; I had finally done a good deal...
If the next HD 5850 is under 200, count me in for a custom one...
I don't think I'll side with the green team any more...


----------



## hat (May 18, 2009)

Sweet, another hungry-hungry hippo.


----------



## a_ump (May 18, 2009)

DrPepper said:


> The number of SPs isn't half as important as the way the shaders handle instructions. For example, nvidia uses a different method to ATI, and that's why nvidia can use fewer, more powerful shaders than ATI. Who's to say ATI will continue with their current shader model and not change it so they can use 1200 SPs and get the same performance as 2400? Anyway, back to the point: ATI engineers prefer to find cheap methods of adding performance, compared to nvidia's tactic of simply adding more.



Yep, and with the GT300, nvidia is doing an overhaul of how their shaders compute, so I wonder if doubling their shaders will equate to even +50% performance. Damn, I just wanna see some benchies; nvidia's chip has me all hyped.


----------



## KainXS (May 18, 2009)

lol, I messed up; I got my info from the wrong source. Oh well, it looks like it will be 1200 SPs and 32 ROPs. My mistake; that's what happens when you look at an old source and try to defend it.

I was looking for this one:
http://www.hexus.net/content/item.php?item=18240

But I have been hearing that ATI redeveloped their dual-GPU PCBs for better bandwidth utilization.


So... nvidia keeps their crown. Nice, I guess; good for price wars.


----------



## Blacksniper87 (May 18, 2009)

Darren said:


> Would you pay 21% more for a car that runs 5% faster? NO
> 
> So why would anyone with a brain do it for a GPU or a CPU?



I'm not saying you should buy it. If you read the post properly, you would realise I am saying that nvidia is at an unfair disadvantage, due to the fact that there is an overclocked 4890 in there and no overclocked nvidia cards. Also, you would notice that your analogy is severely flawed, because lots of people do buy the car that is 5% faster, and in some cases it costs more than double the slower one. The high-end stuff is what makes most computer companies their profit, or the money back for R&D, so stop bitching. It would probably also be a good idea for the fanbois above me not to post fanboi crap on an nvidia news story, because all it says is that you are worried your precious AMD/ATI will not win the next round as they have this one.

DISCLOSURE: I am not a fanboi; I will buy whatever the best card is in my price range, so please, I really don't need the flaming.

EDIT: Just in case, on goes the fireproof suit.


----------



## a_ump (May 18, 2009)

KainXS said:


> lol, I messed up; I got my info from the wrong source. Oh well, it looks like it will be 1200 SPs and 32 ROPs. My mistake; that's what happens when you look at an old source and try to defend it.
> 
> I was looking for this one:
> http://www.hexus.net/content/item.php?item=18240
> ...



Yep, I'm assuming that better bandwidth utilization means the sideport that is disabled on the HD 4870 X2 will be enabled on the HD 5870 X2: direct communication between the cores instead of having to go through the onboard CrossFire chip.


----------



## Darren (May 18, 2009)

Blacksniper87 said:


> I'm not saying you should buy it. If you read the post properly, you would realise I am saying that nvidia is at an unfair disadvantage, due to the fact that there is an overclocked 4890 in there and no overclocked nvidia cards. Also, you would notice that your analogy is severely flawed, because lots of people do buy the car that is 5% faster, and in some cases it costs more than double the slower one. The high-end stuff is what makes most computer companies their profit, or the money back for R&D, so stop bitching. It would probably also be a good idea for the fanbois above me not to post fanboi crap on an nvidia news story, because all it says is that you are worried your precious AMD/ATI will not win the next round as they have this one.
> 
> DISCLOSURE: I am not a fanboi; I will buy whatever the best card is in my price range, so please, I really don't need the flaming.
> 
> EDIT: Just in case, on goes the fireproof suit.



This is where you are going wrong: it's not just an overclocked product; there were modifications to the physical layout of the 4890.

Nvidia is not at an unfair disadvantage at all; by that logic you could say that ATI is at an unfair disadvantage in the midrange due to Nvidia's GTS 250 series being an overclocked 9800 GTX+. So it's OK for Nvidia to sell a GTS 250, which is an overclocked 9800 GTX+, but it's not OK for ATI to overclock their 4870 and call it a 4890?

One rule for Nvidia and a different rule for ATI, huh?


In fact, in both cases the 4890 and GTS 250 are more than mere overclocks, and hence it's very fair: the GTS 250 uses a 55 nm core as opposed to 65 nm, and the 4890 has a ring of 3 million more transistors than the 4870, so it's not JUST an overclock and therefore there is no disadvantage.


----------



## DrPepper (May 18, 2009)

KainXS said:


> lol, I messed up; I got my info from the wrong source. Oh well, it looks like it will be 1200 SPs and 32 ROPs. My mistake; that's what happens when you look at an old source and try to defend it.
> 
> I was looking for this one:
> http://www.hexus.net/content/item.php?item=18240
> ...



Not a big deal anyway; I expect sources to change all the time. I think 1200 SPs is more than enough. It adds performance without the need to completely rework the design of the GPU, and it keeps costs lower.


----------



## ownage (May 18, 2009)

Come on, guys. The GT300 is taped out when a reliable source says so. Until then, there's only a small chance it's taped out.


----------



## Blacksniper87 (May 18, 2009)

Darren said:


> This is where you are going wrong: it's not just an overclocked product; there were modifications to the physical layout of the 4890.
> 
> Nvidia is not at an unfair disadvantage at all; by that logic you could say that ATI is at an unfair disadvantage in the midrange due to Nvidia's GTS 250 series being an overclocked 9800 GTX+. So it's OK for Nvidia to sell a GTS 250, which is an overclocked 9800 GTX+, but it's not OK for ATI to overclock their 4870 and call it a 4890?
> 
> ...




What I am saying, if you can read, is that there is an overclocked version in the graph. So bloody read the graph properly and realise you are exhibiting a brilliant example of fanboyism. As I said before, your analogy was crap.

NOTE: I did not at any stage say that the 4890 was either an overclock of the 4870, a rebadge, or anything else; I simply observed the graphs and responded.


----------



## [I.R.A]_FBi (May 18, 2009)

*(image)*


----------



## Blacksniper87 (May 18, 2009)

I have to ask what this image is meant to be?


----------



## Valdez (May 18, 2009)

I hope it's clear that ATI doesn't want to make a big GPU to compete with nvidia. This is where ATI's and nvidia's strategies differ: nvidia wants one very powerful GPU, while ATI wants to go multi-GPU. So they make a small but very efficient GPU and release 1-, 2- or 4-GPU cards. One RV870 can't compete with the GT300, but it doesn't have to; that's the X2's task.
And we haven't even spoken about the X4 variant (if the gossip about an MCM is true).


----------



## newtekie1 (May 18, 2009)

@Darren:

1.) The overclocked HD 4890 can beat the stock GTX 285 in a few games, but overall the GTX 285 is still the better performer. The original comment, and this thread, has little to do with price or performance per dollar. If you haven't realized that price does not increase at the same rate as performance, and that higher-performing cards come at a price premium, you really have no business in a discussion about performance or price.

2.) You seem to really like to worry about performance per dollar. And while you were trying to sound all high and mighty and insult the other members, you seem to have completely failed to notice that the GTX 275 offers better performance per dollar than the stock HD 4890, and equals the performance per dollar of the overclocked card in the graphs you posted... Oddly enough, it also straight up outperformed the stock card, and matched the overclocked one...



a_ump said:


> yea nvidia does have the most powerful single core graphic card but either way it still took them 2 gpu's on a card to get it back



I always find this argument odd...

So your argument is that nVidia needed a card with 2 GPUs to take the performance crown from ATi, and this is somehow a negative for nVidia...

But ATi needed a card with 2 GPUs to take the performance crown from nVidia's single GPU. How is that better? Why is it bad for nVidia to need two GPUs to beat ATi's two GPUs, but not bad for ATi to need 2 GPUs to beat nVidia's 1?


----------



## iStink (May 18, 2009)

Steevo said:


> http://i6.techpowerup.com/reviews/Powercolor/HD_4890_PCS/images/perfdollar_1680.gif
> 
> 
> http://i6.techpowerup.com/reviews/Powercolor/HD_4890_PCS/images/perfdollar_1920.gif
> ...



I know not being a raging douche bag is hard, so let me avoid turning this into an ati vs nvidia flame war and remind you that the poster I was replying to never mentioned price, and was focused on performance only. 

Actually, my initial response reminded him to always mention price when trying to compare the two cards, but I felt that if I did, he'd feel I was somehow making fun of him since only a moron would forget such a thing.

Have a nice day, dick.


----------



## enaher (May 18, 2009)

newtekie1 said:


> I always find this argument odd...
> 
> So your argument is that nVidia needed a card with 2 GPUs to take the performance crown from ATi, and this is somehow a negative for nVidia...
> 
> But ATi needed a card with 2 GPUs to take the performance crown from nVidia's single GPU. How is that better? Why is it bad for nVidia to need two GPUs to beat ATi's two GPUs, but not bad for ATi to need 2 GPUs to beat nVidia's 1?



It's bad because of the monolithic strategy used by nvidia: big, expensive, hot and power hungry. They needed a shrink to make the GTX 295, which I think is bad for their margins, and that's why we are getting a single-PCB GTX 295 to increase their profit. Let's remember they also got cheap on the cooler for the 55 nm GTX 260, which actually runs a bit hotter; they got rid of the card's backplate, and they even made a simpler version of the PCB that in theory should OC a little worse...


----------



## Steevo (May 18, 2009)

Blacksniper87 said:


> What I am saying, if you can read, is that there is an overclocked version in the graph. So bloody read the graph properly and realise you are exhibiting a brilliant example of fanboyism. As I said before, your analogy was crap.
> 
> NOTE: I did not at any stage say that the 4890 was either an overclock of the 4870, a rebadge, or anything else; I simply observed the graphs and responded.



How is a 1 GHz or 950 MHz product from an AIB partner any different from an AIB partner for Nvidia releasing an overclocked product?


For the record, 1 GHz is within spec for these chips, not an "overclocked" part.


What I care about is performance at the end of the day. How many cried when GTA4 came out and they couldn't play it at high settings, when it just required more memory? So in actuality my lowly 4850 beat a 768 MB card from the green camp. I overclocked my card, spent less and played more. Same for a 4890, and hell, even the 4770 in my parents' build: $99 that kicks the shit out of a GTS 250.


The difference here is we are all talking performance per dollar, maximum performance, availability, and drivers.


In almost every segment of the market ATI has Nvidia beat on price and/or performance. For an actual mid-range card, the 4770/4850 crushes Nvidia; the 4890 crushes everything but the 295, and for the money of a 295 you could CrossFire two 4890s and still beat it; hell, a 4870 X2 is 14% more efficient per dollar.
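The "more efficient per dollar" comparison above is just a ratio. As a sketch, with made-up placeholder prices and performance indices (not real 2009 benchmark or retail data):

```python
# Performance-per-dollar sketch. The performance indices and prices below
# are illustrative placeholders only, not real benchmark or retail figures.
def perf_per_dollar(perf_index: float, price_usd: float) -> float:
    """Relative performance divided by price; higher means better value."""
    return perf_index / price_usd

cards = {
    "hypothetical card A": (100, 200),  # (perf index, USD)
    "hypothetical card B": (160, 500),
}

for name, (perf, price) in cards.items():
    print(f"{name}: {perf_per_dollar(perf, price):.2f} perf/$")
```

On these made-up numbers, the cheaper card wins on value even though the expensive one is 60% faster, which is the shape of the argument being made in this thread.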


----------



## newtekie1 (May 18, 2009)

enaher said:


> It's bad because of the monolithic strategy used by Nvidia: big, expensive, hot and power hungry. They needed a die shrink to make the GTX 295, which I think is bad for their pockets, and that's why we are getting a single-PCB GTX 295 to increase their profit. Let's remember they also got cheap on the cooler for the 55 nm GTX 260, which actually runs a bit hotter; they got rid of the backplate of the card, and they even made a simpler version of the PCB that in theory should OC a little worse...



The requirement of a die shrink has nothing to do with it.  The argument is that it is somehow acceptable for ATi to use 2 GPUs to best 1 GPU, but somehow not as acceptable for nVidia to use 2 GPUs to best 2 GPUs.  I just find that very odd.

And as for the 55nm GT200 cards, they got rid of the backplate because the memory was all moved to the front of the card, so the backplate was not needed to cool the RAM anymore.


----------



## Steevo (May 18, 2009)

iStink said:


> I know not being a raging douche bag is hard, so let me avoid turning this into an ati vs nvidia flame war and remind you that the poster I was replying to never mentioned price, and was focused on performance only.
> 
> Actually, my initial response reminded him to always mention price when trying to compare the two cards, but I felt that if I did, he'd feel I was somehow making fun of him since only a moron would forget such a thing.
> 
> Have a nice day, dick.



I love you sweetie. 


I wasn't aiming this at you, but at the haters. I have Nvidia cards in use at work. I use what gets me from (figurative) point A to B the cheapest and within reason. If Nvidia ever releases a product, when I am ready to buy, that does better than the red counterpart, I will move to that. I will admit the biggest fuckup I had was my X1800XT: $599 that I sold a year or so after for $70. 

I don't hate Nvidia, or ATI; I hate the fanbois who believe their company of choice can do no wrong. I was also in the wrong when I bought a Prescott thinking something was being made better. I was wrong and moved to AMD. I only have one AMD at work; most are C2Ds or other Intels.


Anyway, enough of the name calling.


----------



## DrPepper (May 18, 2009)

Steevo said:


> How is a 1 GHz or 950 MHz product from an AIB partner any different from an Nvidia AIB partner releasing an overclocked product?
> 
> 
> For the record, 1 GHz is within spec for these chips, and not an "overclocked" part.
> ...



Erm, in the £130 bracket the GTX 260 is a clear winner. ATI's equivalent is a 4850, which can't beat a 260 in anything.


----------



## enaher (May 18, 2009)

newtekie1 said:


> The requirement of a die shrink has nothing to do with it.  The argument is that it is somehow acceptable for ATi to use 2 GPUs to best 1 GPU, but somehow not as acceptable for nVidia to use 2 GPUs to best 2 GPUs.  I just find that very odd.
> 
> And as for the 55nm GT200 cards, they got rid of the backplate because the memory was all moved to the front of the card, so the backplate was not needed to cool the RAM anymore.



OK, simply put: quality. Quality-wise Nvidia is usually better most of the time: the cooler, the warranty (EVGA, BFG). Price/performance-wise it MOST of the time goes to ATI. Remember I'm not talking about quality of the GPU itself; this is about the reference cooler and the partners' warranty and upgrade plans. But, and this is a big BUT, Nvidia usually develops a BIG BADASS core that's very expensive to manufacture; then, to recover the crown (taken by ATI's dual GPU), they make a two-core solution that's very expensive just to recover market share, and drop the ball on other things. An example is my friend: he had an EVGA GTX 260 SC 192 and EVGA replaced it with a 55 nm one, which ran hotter, OC'd a little less, and even had some cheap-looking capacitors; the 192 version was full solid-state capacitors. I'm not saying Nvidia can't do a two-GPU card and ATI can. I'M SAYING they usually don't plan to, and when they do build one they start to lose cash (remember the couple of $ they lost recently). I like ATI & Nvidia; I want them to keep the fight as close as they can, both with profits and quality.

EDIT: Of course they could have done a 65 nm version of the 295, it had nothing to do with thermals and power draw; and of course they removed the backplate because they moved the memory. And gee, me thinking here that they moved the memory to get rid of the backplate and bring costs down.


----------



## a_ump (May 18, 2009)

newtekie1 said:


> The requirement of a die shrink has nothing to do with it.  The argument is that it is somehow acceptable for ATi to use 2 GPUs to best 1 GPU, but somehow not as acceptable for nVidia to use 2 GPUs to best 2 GPUs.  I just find that very odd.
> 
> And as for the 55nm GT200 cards, they got rid of the backplate because the memory was all moved to the front of the card, so the backplate was not needed to cool the RAM anymore.



You misunderstood me. I simply stated the 1-GPU part about Nvidia because whenever someone mentions the HD 4870 X2 besting Nvidia, they chime in that Nvidia has the best single-GPU product, so I figured I'd save myself from reading the responses by stating it myself lol, though apparently it spawned a new argument.


----------



## enaher (May 18, 2009)

a_ump said:


> You misunderstood me. I simply stated the 1-GPU part about Nvidia because whenever someone mentions the HD 4870 X2 besting Nvidia, they chime in that Nvidia has the best single-GPU product, so I figured I'd save myself from reading the responses by stating it myself lol, though apparently it spawned a new argument.



Hehe, it seems everyone misunderstood. I took it as asking why it's negative for Nvidia to make a dual-GPU card. It's OK for anyone to make dual GPUs to get the performance crown; it just usually fits ATI's business model better.


----------



## iStink (May 18, 2009)

Steevo said:


> I love you sweetie.
> 
> 
> I wasn't aiming this at you, but at the haters. I have Nvidia cards in use at work. I use what gets me from (figurative) point A to B the cheapest and within reason. If Nvidia ever releases a product, when I am ready to buy, that does better than the red counterpart, I will move to that. I will admit the biggest fuckup I had was my X1800XT: $599 that I sold a year or so after for $70.
> ...



Sorry, the whole "words are hard" comment just irritated me lol.  Sorry for the name calling. 

As far as bang for the buck goes, I'm the same way when it comes to what I buy.  I've loved the hell out of my 8800gt, but I think it's time to upgrade soon.  I've been seriously eyeballing these 4890s as a practical purchase.  I honestly have no preference between nvidia or ati, so long as the product makes me happy.  I don't understand arguments that are fueled by illogical brand loyalty.

Oh and don't feel bad about spending too much on a video card.  I bought an x800xl from compusa when I probably could have gotten it for much, MUCH cheaper.


----------



## a_ump (May 18, 2009)

iStink said:


> Sorry, the whole "words are hard" comment just irritated me lol.  Sorry for the name calling.
> 
> As far as bang for the buck goes, I'm the same way when it comes to what I buy.  I've loved the hell out of my 8800gt, but I think it's time to upgrade soon.  I've been seriously eyeballing these 4890s as a practical purchase.  I honestly have no preference between nvidia or ati, so long as the product makes me happy.  I don't understand arguments that are fueled by illogical brand loyalty.
> 
> Oh and don't feel bad about spending too much on a video card.  I bought an x800xl from compusa when I probably could have gotten it for much, MUCH cheaper.



Exactly, that's the way I shop when it comes to computer hardware, and it's the logical way imo. Though whenever someone is building a rig and asks for help, if there's an Intel rig and then an AMD rig of the same performance for around the same price, I'll point them towards AMD just cause I want AMD to do better, to improve competition and keep themselves alive lol.


----------



## iStink (May 18, 2009)

a_ump said:


> Exactly, that's the way I shop when it comes to computer hardware, and it's the logical way imo. Though whenever someone is building a rig and asks for help, if there's an Intel rig and then an AMD rig of the same performance for around the same price, I'll point them towards AMD just cause I want AMD to do better, to improve competition and keep themselves alive lol.



It's the same damn thing in a Mac vs PC discussion.  I mean, I popped in just now to check AppleInsider, and there was an article talking about Microsoft's new laptop ads.  There are 256 comments posted in response to this article in just two days.  You know what they are discussing? "APPLE COSTS TOO MUCH", "PC'S ARE CRAP", "MICROSOFT IS EVIL", "APPLE ARE A BUNCH OF THIEVES", "PC'S BREAK ALL THE TIME", "MAC USERS ARE ELITISTS"

I literally just posted a simple question "Are you happy with your computer? If you are, then what does it matter what anyone else tells you?"

I see the same thing here (maybe not as heavy) about intel vs amd or nvidia vs ati.  People get so anal about mindless crap that has no effect on anything.


----------



## Kursah (May 18, 2009)

I hope the tight competition keeps up; I think it's good both ATI and NV took a different route to the same solution. I wish 3dfx would return and that Intel does well too: the more competition, the better off we are with getting better prices, and the better the chances of everyone getting a solid performer no matter what they choose. I couldn't care less about HD5xxx this or GT300 that till they're actually out and rendering games (and benches) for the masses. I'll read some reviews, see what forum members get for OCs, see how driver support, temps, and failures go, and wait and focus more on what the upper mid-range has to bring. I might not even replace my 260 if all my games still play smooth; I've been so happy with this card the last 10 months. I just hope the price/performance competition keeps up in the future and maybe more contenders step into the ring.


----------



## icon1 (May 18, 2009)

I always use NVIDIA cards, but nevertheless I still want ATI to come up with something strong with their next-gen graphics cards. Healthy competition on both sides is always nice to keep
the price of the next-gen cards more affordable; if ATI doesn't come up with anything strong to compete with this beast, then the price of Nvidia's GT300 will be sky high..

with the next gen cards just around the corner this is getting more exciting Lol.


----------



## a_ump (May 18, 2009)

icon1 said:


> I always use NVIDIA cards, but nevertheless I still want ATI to come up with something strong with their next-gen graphics cards. Healthy competition on both sides is always nice to keep
> the price of the next-gen cards more affordable; if ATI doesn't come up with anything strong to compete with this beast, then the price of Nvidia's GT300 will be sky high..
> 
> with the next gen cards just around the corner this is getting more exciting Lol.



True, but look at it this way: we know that if ATI isn't equal or almost on par with NV performance, then their prices will be much lower, still driving NV down some. And also, look at the current cards: the HD 5870 according to specs should be at least 40% faster than the HD 4890, and what game is out that the HD 4890 can't handle? Granted, GTA4, Crysis, and Stalker are all that come to mind for me. So even at +30% performance, what won't the HD 5870 handle? I know DX11 is coming, but I'm hoping it's more performance-enhancing than visually enhancing. Honestly, the next time you're out in the country a little, check out the forestry and leaves and shit, then look at Crysis. It's not far off at all from a realistic appearance, so if developers use DX11 as more of a streamlined, performance-based DX10, then there shouldn't be a problem running games. I look for DX11 to bring physics to a more realistic level rather than to make textures and whatnot more appealing.


----------



## senninex (May 18, 2009)

*nVidia Vs ATI*

Power: Nvidia might have a more powerful chip than ATI.
$$$$: ATI might have the advantage here & may use an X2 to fight Nvidia's monster chip.
Watt: Due to ATI's higher clocks... it might be ATI > Nvidia.
The wave: ATI nowadays is becoming stronger, making Nvidia sick.


----------



## [I.R.A]_FBi (May 18, 2009)

senninex said:


> Power: Nvidia might have a more powerful chip than ATI.
> $$$$: ATI might have the advantage here & may use an X2 to fight Nvidia's monster chip.
> Watt: Due to ATI's higher clocks... it might be ATI > Nvidia.
> The wave: ATI nowadays is becoming stronger, making Nvidia sick.



good analysis


----------



## tkpenalty (May 18, 2009)

I have to admit, fanboyism of any sort, for any camp, or developing brand loyalty as such, is an entirely foolish practice which leads to no gains in the future. Joining sides with a GPU company, or any company, religion, etcetera, basically means that you will root for them, and thus have a somewhat biased judgement as to what is the truth, as you will not want to say anything disadvantageous about what you root for. True, if everyone thought like me, AMD would have had no customers during the Phenom I era, but it's advice for your personal benefit. 

Please people, don't start a debate on GT300 vs RV870, especially here, when the products have only been taped out and not even released on paper yet. At this point there are no performance figures, and due to the non-linear increase in performance with architectural upgrades (i.e. a shader-count boost), you can't really expect one card to perform like you think it would. Think of other factors such as retiming of the core, etc. (like in AMD's case, 4870 to 4890). 

In the end it comes down not to theory, but to how the card actually works. I find arguments such as "AMD used two cores to beat one" stupid, as there is no actual "beating" when you the customer can make your own decision to buy something; plus, how they achieved the performance in the end means squat for you the consumer. 

I'm not really suggesting anything, but some members here need to learn how to reason and be a bit more rational with their decisions. 

To senninex: if you haven't realised, the market is very unpredictable. If the whole market went by patterns like the one you mentioned, everyone would be free of their financial woes. Note a very curious thing: the GTX 295 is cheaper in Australia than the HD 4870 X2 (well, at least in Sydney, where prices are damn high), and the GTX 260 is cheaper than the HD 4890. (Ironically most people go for 4890s/4850s/4870s instead of the green camp's stuff... unpredictable.)


----------



## Hayder_Master (May 18, 2009)

When I see the GT300 specifications now, it looks to me like a 5870 X2.


----------



## icon1 (May 18, 2009)

None of us knows how the next series of graphics cards will really perform,
so debating who's got the best offering is useless for now..

even though I always use NVIDIA graphics cards, I still want ATI to come up with
something big.. competition makes Nvidia & ATI cards more affordable,
and at the same time it pushes both camps to develop their products even
better.. just my 2 cents


----------



## Blacksniper87 (May 18, 2009)

Fair enough point. I'll be waiting for this excitedly regardless of what the bloody Inquirer reckons. Read here:
http://www.theinquirer.net/inquirer/news/1137331/a-look-nvidia-gt300-architecture


----------



## Tau (May 18, 2009)

Darren said:


> Would you pay 21% more for a car that runs 5% faster? NO
> 
> So why would anyone with a brain do it for a GPU or a CPU?



Unfortunately, that's what it costs when you're talking about that kind of speed.  In the car world, say it costs $8,000 to build a 400 HP 4-cyl; it then costs $12,000 to take that same motor to 500 HP.  The costs get huge while the performance increase is minimal.




Valdez said:


> I hope it's clear that ATI doesn't want to make a big GPU to compete with Nvidia. This is where ATI's and Nvidia's strategies differ: Nvidia wants one, but a very powerful, GPU, and ATI wants to go multi-core. So they make a small but very efficient GPU, and release 1-, 2- or 4-GPU cards. One RV870 can't compete with GT300, but it doesn't have to. That's the X2's task.
> And we haven't even spoken about the X4 variant (if the rumours of MCM are true).



It doesn't matter how many cores it takes to compete with each other...  the only things that matter at the end of the day are price, performance, power draw, and heat, really...



Steevo said:


> How is a 1 GHz or 950 MHz product from an AIB partner any different from an Nvidia AIB partner releasing an overclocked product?
> 
> 
> For the record, 1 GHz is within spec for these chips, and not an "overclocked" part.
> ...



Sure, ATI has them beat on price... want to know why?

You're not buying the same kind of power.

All of this fanboyism is really rubbish, as is all the speculation in this thread.

There is NO use in speculating about what they are going to come out with... just wait and see when it comes out, and go with what you know, as well as what works for you.

And arguing that ATI has a better price/performance ratio is kind of a moot point, as it depends on how the BUYER perceives the price.  To some people who are on an extreme budget, a cheaper, decent-performance card makes more sense...  To someone who is not on a shoestring budget and does not want to have to upgrade the card later, something in the bigger price bracket would fit the bill....

I go with what's fast, as for the most part I don't really care about how much it costs...  While you all complain about $400 CPUs, $600 video cards, and $200 HDDs....  I just think back to when I bought my first CD-ROM drive for $800.  So $800 for an entire video card that beats everything else on the market is OK in my books.


----------



## tkpenalty (May 18, 2009)

Please do not draw conclusions from pure speculation, guys... you can say all you want about how a GPU will perform based off its specifications, but that's like saying a car with a larger engine should be faster.


All Charlie Demerjian is, is a speculative writer whose viewpoints seem to always slam Nvidia. He comes up with and expands on every point, to a huge extent, with the basis being absolutely nothing apart from pure speculation for each factor he discusses about Nvidia. He seems to think that if the same happened before, the same will happen again, and doesn't realise that what matters is not how something gets there, it's what the actual thing does. It's like some fanboy bitching about how AMD got the 4850 to kill a GTX 280. He says something like:

_Nvidia chipmaking of late has been laughably bad. GT200 was slated for November of 2007 and came out in May or so in 2008, two quarters late. We are still waiting for the derivative parts. The shrink, GT206/GT200b is technically a no-brainer, but instead of arriving in August of 2008, it trickled out in January, 2009. The shrink of that to 40nm, the GT212/GT200c was flat out canceled, Nvidia couldn't do it._ 

AMD was even worse when it came to the HD 2000 series. He doesn't even mention anything about that. He doesn't even bother to mention the actual product itself, the final product. Why would the consumer give a shit about how something was developed, or how long it took? Does it matter if Larrabee is so "similar" to the GT300 when AMD also has a similar product? Because if it does, then it's like saying that car companies should be ashamed for copying Mercedes-Benz in the first place; wait, no, most of the human race should be ashamed for copying the wheel off the original inventors! Bloody idiots, and biased, rampant journalists like these should really step down. IF there was no "copying", we'd find that every company has a different graphics socket, with different memory chips, and finally a multitude of display outputs. All graphics cards are based off similar architectures: cache (RAM, onboard cache), CPU (core), input and output. I don't see other journalists slamming companies; it's a standard. Yet we see Charlie using a completely invalid argument that the standard affair of DX11 shader SIMD/MIMD units is "copying", and that it's "ironic". 

He doesn't seem to realise that the consumer is more concerned about the product being usable. You are not the average consumer if you obsess about how something got there. 
However, if it's something like violating the RoHS directive, then by all means, please complain. It's these people who are indirect, and side with one side, who cause so many problems for us.


Hope he reads this and rethinks his position.

It's idiotic, bitching about the "arrogance" of a company. All it is is the image; it's not the substance. I do not give a shit if the CEO has a bad attitude. Why? Because we the consumer only get the END product and judge off the END product, what you receive (note that customer support counts as well). See, I will STILL buy Intel's products despite their bad corporate profile. If you ask why, I'd suggest rereading this line. It's not up to the consumer to condemn a company unless they've caused you pain or something. Actual, tangible pain. Not "Zomfg Intel uses fake cores, I'm buying AMD" (disregarding the fact that the "fake core" CPUs perform better anyway). For some reason consumers prefer to go for the non-tangible assets of a product instead of what it can actually do. I think that a lot of firms pride themselves on this idiocy, as well as on "fanboyism", one notable example being Gibson guitars.


----------



## Noggrin (May 18, 2009)

Blacksniper87 said:


> fair enough point will be waiting for this excitedly regardless of what the bloody inquirer reckens read here:
> http://www.theinquirer.net/inquirer/news/1137331/a-look-nvidia-gt300-architecture



Charlie, aka the douchebag, is back at it again..  This guy is so pathetic, it's actually fun to think for a minute that he maybe truly believes the shit that is coming out of his mouth.


----------



## a_ump (May 18, 2009)

All I've heard is to ignore what that Charlie dude says on the Inq. Though he makes it sound very true and argumentative, Nvidia aren't idiots. I don't see them pulling a GT300 if all those risk factors are true, and though, as I said, Charlie argued them very well and convincingly, it just doesn't make sense. I've already pondered the whole "ATI will be better at DX11 due to DX10.1 being very close to it", but yeah, Nvidia isn't dumb enough to pull a flop when ATI is all over their ass. I had also contemplated the whole newly speculated design behind the supposed 512 shaders. Doubling shaders doesn't happen often or easily, and with how large a die area the GT200 took to incorporate 240 shaders along with the other good stuff, moving to 512 would have to change these shaders dramatically in size (I would think), which, being something that new, could, as Charlie said, suffer in performance and yields. If what Charlie says is true (I doubt it), then I'm very interested to see how Nvidia handles it. Ah, so much speculation.


----------



## Blacksniper87 (May 18, 2009)

Yep, there is a lot of speculation, which is why wankers like Charlie need to stop pretending they know everything about Nvidia's upcoming cards.


----------



## yogurt_21 (May 18, 2009)

fuel for the fire

rumored specs of the 5870
http://www.hardware-infos.com/news.php?news=2908

personally I'm not interested in the high end as much as the $200-300 range, as that's where I'll be buying. If we have another bout like the 4870 vs GTX 260, it's good news for teh yogurt.


----------



## WarEagleAU (May 18, 2009)

If ATI wants to truly stay ahead of what looks phenomenal on paper with the GT300, they need to unlock their 800 stream processors from the core. I keep posting this hoping someone from there will see it. 1200-2400 SPUs clocked around 1100 MHz??? Shoot, I can't even begin describing how that will kick ass.


----------



## Melvis (May 18, 2009)

Quick question: is the 4870 X2's sideport working yet? Did ATI release drivers for this sideport to be activated and gain more performance? I never heard if they did or not.


----------



## mdm-adph (May 18, 2009)

DrPepper said:


> The number of SPs isn't half as important as the way the shaders handle instructions. For example, Nvidia uses a different method to ATI, and that's why Nvidia can use fewer, more powerful shaders than ATI. Who's to say ATI will continue with their current shader model and not change it so they can use 1200 SPs and get the same performance as 2400? Anyway, back to the point: ATI engineers prefer to find cheap methods of adding performance, compared to Nvidia's tactic of adding more.



Well, that's different from what I've heard.

Nvidia doesn't use "fewer and more powerful" shaders, but apparently on ATI cards the shaders are just counted differently.  For instance, I've heard that ATI uses some sort of "pentagonal" method of counting its shaders: counting each individual shader unit as having 5 sides, or something (Nvidia doesn't do this, apparently).

So, for a more even comparison, just divide the number of shaders on an ATI card by 5.

HD 4870 = 160 shaders.
GTX 280 = 240 shaders.

Make more sense now?  The GTX 280 is faster, of course, but for what it has, the RV770 isn't bad, either.
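That divide-by-five rule of thumb can be written out as a tiny sketch (the vendor labels and function name here are mine, purely for illustration):

```python
# Normalizing advertised shader counts for a rough cross-vendor comparison.
# ATI's advertised figure counts each 5-wide unit as five stream processors,
# so dividing by 5 gives a number closer in spirit to Nvidia's scalar shaders.
def comparable_shader_count(vendor: str, advertised: int) -> int:
    if vendor == "ati":
        return advertised // 5  # each 5-wide unit counted once
    return advertised           # Nvidia already counts scalar units

print(comparable_shader_count("ati", 800))     # HD 4870: 800 -> 160
print(comparable_shader_count("nvidia", 240))  # GTX 280: stays 240
```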


----------



## crazy pyro (May 18, 2009)

Wah, where'd the extra two pages go?
I'm sure I read at least a page after this post...


----------



## Blacksniper87 (May 19, 2009)

Hey, so the specs are in on the taped-out GT300: 700 MHz core, 1600 MHz shaders, and 1100 MHz memory, apparently with room to move.


----------



## a_ump (May 19, 2009)

Source? And I'm suspicious of whether Nvidia would get GDDR5 to work at that frequency the first time they use it on a card. ATI only managed 900 MHz (3,600 MHz effective) with their first implementation of it; or was that because of the chips themselves being inefficient?


----------



## Blacksniper87 (May 19, 2009)

a_ump said:


> Source? And I'm suspicious of whether Nvidia would get GDDR5 to work at that frequency the first time they use it on a card. ATI only managed 900 MHz (3,600 MHz effective) with their first implementation of it; or was that because of the chips themselves being inefficient?




http://www.hardware-infos.com/news.p...2954&sprache=1

Just yesterday we reported that Nvidia's upcoming high-end desktop chip, G300, had successfully completed its tape-out and is currently at the A1 stepping, which may already be the final one.
Now we can present the clock frequencies of the samples, with which Nvidia is already very satisfied, so they could also be the final ones.

The running G300 samples in A1 stepping work at a 700 MHz chip frequency, 1600 MHz shader frequency and 1100 MHz memory frequency; the latter was already suggested in our previous news report.
With the help of the already-revealed shader-unit count and memory interface width, we can make the first accurate quantitative comparisons.

As we already told you, the G300 will have 512 instead of 240 shader units. The basic structure, that is, 1D shader units which can compute one MADD and one MUL per clock, seems to be kept, so we can already conclude that the running samples will achieve a theoretical performance of, believe it or not, 2457 gigaflops.
Although the comparison to the G200, as represented by the GTX 280, is difficult, because the G300 no longer has classical SIMD units but MIMD-like units, the purely quantitative comparison shows a 163 per cent difference in performance.

The memory bandwidth can now be calculated, too, with knowledge of the memory frequency. At 1100 MHz, Nvidia will reach a remarkable 281.6 GB/s, which is quantitatively exactly 100 per cent more memory bandwidth than the GTX 280.

Statements about the theoretical TMU and ROP performance cannot be made yet, despite the known chip frequency, because their counts are not yet known, and in the case of the ROPs it is not even certain whether they will still be fixed-function units.
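The article's two headline figures fall out of simple arithmetic. A sketch, assuming 3 flops per shader per clock (one MADD counted as 2 flops plus one MUL) and GDDR5 transferring 4 bits per pin per memory clock:

```python
# Theoretical shader throughput: 512 shaders x 3 flops x 1600 MHz.
shaders = 512
flops_per_clock = 3        # MADD (2 flops) + MUL (1 flop) per shader
shader_mhz = 1600
gflops = shaders * flops_per_clock * shader_mhz / 1000
print(gflops)              # 2457.6, the article's "2457 Gigaflops"

# Memory bandwidth: 512-bit bus x 1100 MHz x 4 transfers/clock (GDDR5).
bus_bits = 512
mem_mhz = 1100
bandwidth_gb_s = bus_bits / 8 * mem_mhz * 4 / 1000
print(bandwidth_gb_s)      # 281.6 GB/s, matching the article
```

The same arithmetic on the GTX 280 (240 shaders at 1296 MHz, GDDR3 at 2 transfers per clock) gives the roughly 163% and 100% gaps the article quotes.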


----------



## a_ump (May 19, 2009)

I didn't really know the difference between MIMD and SIMD, but I googled a little and I think I understand. MIMD has each processor handle its own program or whatever, but SIMD has all of them work together to quickly finish each program; at least that's the gist of what I got. So it'd make sense for Nvidia to increase from 240 to 512 if these cores are going to be slower, with each processing data of its own instead of all of them working together to compute that data. Is that correct?
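That's roughly the distinction. As a toy sketch (nothing like real GPU scheduling, just to show the idea): SIMD applies one instruction to many data elements in lockstep, while MIMD lets each unit run its own instruction on its own data.

```python
# Toy illustration of SIMD vs MIMD execution models.
def simd(instruction, data):
    # One instruction stream, applied to every element in lockstep.
    return [instruction(x) for x in data]

def mimd(instructions, data):
    # Each "unit" runs its own instruction on its own element.
    return [op(x) for op, x in zip(instructions, data)]

double = lambda x: x * 2
square = lambda x: x * x

print(simd(double, [1, 2, 3]))         # [2, 4, 6]
print(mimd([double, square], [3, 3]))  # [6, 9]
```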


----------

