Sunday, May 17th 2009

NVIDIA GT300 Already Taped Out

NVIDIA's upcoming next-generation graphics processor, codenamed GT300, is on course for launch later this year. Its development has crossed an important milestone, with news emerging that the company has already taped out some of the first engineering samples of the GPU, under the A1 batch. The development is significant since this is the first high-end GPU to be designed on the 40 nm silicon process. Both NVIDIA and AMD, however, are facing issues with the 40 nm manufacturing node of TSMC, the principal foundry partner for the two. For this reason, the chip might be built by another, yet-to-be-named foundry partner the two are reaching out to. UMC could be a possibility, as it has recently announced that its 40 nm node is ready for "real, high-performance" designs.

The GT300 comes in three basic forms, which are perhaps differentiated by binning: G300 (chips that make it to consumer graphics, the GeForce series), GT300 (chips that make it to high-performance computing products, the Tesla series), and G200GL (chips that make it to professional/enterprise graphics, the Quadro series). From what we know so far, the core features 512 shader processors, a revamped data processing model in the form of MIMD, and a 512-bit wide GDDR5 memory interface churning out around 256 GB/s of memory bandwidth. The GPU is compliant with DirectX 11, which makes its entry with Microsoft Windows 7 later this year and can already be found in release candidate versions of the OS.
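The 256 GB/s figure follows directly from the bus width and the per-pin data rate; a minimal sketch, assuming an effective 4 GT/s GDDR5 rate (i.e. a 1 GHz base clock, which is what a 512-bit bus needs to reach that number):

```python
def bandwidth_gbps(bus_width_bits: int, effective_rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return bus_width_bits / 8 * effective_rate_gtps

# A 512-bit bus with GDDR5 running at an effective 4 GT/s per pin:
print(bandwidth_gbps(512, 4.0))  # -> 256.0 (GB/s), matching the figure above
```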
Source: Bright Side of News

96 Comments on NVIDIA GT300 Already Taped Out

#1
alexp999
Staff
One disadvantage of being a news poster and keeping up with the news is that I want stuff months before it even comes out :banghead:

I'm guessing these will be out around Christmas, along with the i5 and Windows 7.
Posted on Reply
#2
DrPepper
The Doctor is in the house
I wonder what ATI's comeback will be like. Also I wonder if nvidia will bring out a dual gpu card.
Posted on Reply
#3
btarunr
Editor & Senior Moderator
ATI has RV870, although nothing spectacular from its earliest specs.
alexp999One disadvantage of being a news poster and keeping up with the news is that I want stuff months before it even comes out :banghead:
Another is that sometimes we're like the "Page 3" reporter that hangs out in the city's elite social circle, but only to report on who spilled his drink, or hung out with whom. :)
Posted on Reply
#4
KainXS
It looks like ATI might take nvidia's performance crown based on those specs
Posted on Reply
#5
qwerty_lesh
Whatever high end model this becomes, I want one, I want it made by Zotac and in my PC :D
Posted on Reply
#6
DrPepper
The Doctor is in the house
KainXSIt looks like ATI might take nvidia's performance crown based on those specs
What are ATI's specs?
Posted on Reply
#7
PlanetCyborg
Oh, poor ATI :ohwell:!! I don't think the RV870 can beat that monster!
Posted on Reply
#8
KainXS
32 ROPs
around 2100 shaders
GDDR5

That's what the rumors say, and based on ATI's previous releases it's more than likely real.

But if I had to guess, I'd say at least 2400 shaders.
Posted on Reply
#9
PlanetCyborg
KainXS32 ROPs
around 2100 shaders
GDDR5

That's what the rumors say, and based on ATI's previous releases it's more than likely real.

But if I had to guess, I'd say at least 2400 shaders.
That is probably the 5870X2, not the 5870! :(
Posted on Reply
#10
alexp999
Staff
Based on current architecture comparisons, ATI needs about 3.5-4 times as many shaders to match NVIDIA's performance. So around 2,000 SPs on the 5 series would seem plausible, assuming a similar architecture is used.
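As a rough sanity check on that estimate (taking the 512-shader GT300 figure from the article; the 3.5-4x multiplier is the poster's guess above, not a measured number):

```python
# Multiply GT300's rumored shader count by the estimated
# ATI-to-NVIDIA shader ratio to see what ATI part it would imply.
gt300_shaders = 512
low_ratio, high_ratio = 3.5, 4.0

equivalent_low = gt300_shaders * low_ratio
equivalent_high = gt300_shaders * high_ratio
print(equivalent_low, equivalent_high)  # -> 1792.0 2048.0
```

Which is why a ~2,000-SP RV870 rumor reads as roughly competitive under this back-of-the-envelope model.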
Posted on Reply
#11
Howard
stop posting news again, just release the product!
my future rig is waiting for this beast!
eVGA X58 759 + Dominator GT 2000MHz + this beast = invincible!!! (at least 1-2 years) lol
Posted on Reply
#12
happita
PlanetCyborgThat is probably the 5870X2, not the 5870! :(
I was gonna say the same, but I really do hope they rework the 5 series from the ground up. Haven't they had the same architecture since the 2k series? For example, the 3k is just a die shrink among other things, and the 4k is another speed increase; no major changes, with the exception of the 4890 being ridiculously high-binned to reach those 1 GHz speeds.

But nvidia seems to have another 8800-like series of video cards coming for the next round... if that's the case, I'm not missing out this time; I may switch.
Posted on Reply
#13
theorw
Despite being an ATI user all along, I have to say that 256 GB/s plus 512 stream processors will give ATI A HAAAAAAAAAARD time!!!
Posted on Reply
#14
Cheeseball
Not a Potato
3K wasn't just a die shrink. An OC'd HD 3850 or a standard HD 3870 can destroy an HD 2900 XT (the 2K flagship).

512-bit GDDR5 is sexy.
Posted on Reply
#15
mtosev
Possibly my next card, for the i7 system I'm getting soon :D
Posted on Reply
#16
Unregistered
happitaBut, nvidia seems to have another 8800-like series of video cards out for the next round...if thats the case, I'm not missing out this time, I may switch.
I concur. I am tired of looking at the number two spot the vast majority of the time. If ATI can't beat the green team this time, and by a serious margin, then I am going to stuff NV/Intel into my Spider case.
Posted on Edit | Reply
#17
lemonadesoda
It is getting increasingly difficult to predict how performance will scale just by counting the number of shaders. As the architecture moves from SISD to SIMD to MIMD, the impact on CAD, or on DX9, 10, or 11 rendering, is very hard to predict, especially as you layer on shader effects like FSAA.

It may be that the bigger win is in resolution, i.e. 2560x1600, or in shader effects, i.e. 16xFSAA. Only benchmarking will tell.

I also wonder whether having three versions of the GPU will impact CUDA abilities. If it does, i.e. different CUDA capabilities on each, then this will spell disaster for standardising CUDA (and PhysX-on-CUDA) enhancements.

Looking forward to more news...
Posted on Reply
#18
toyo
Let's hope ATI/AMD will be up to the task, or this card will be $500+ (if not more; not talking about the Quadro/Tesla versions).
Posted on Reply
#19
lemonadesoda
nV has got to find some way to claw back all the money they lost in the last 18 months due to failed laptop GPUs and problems with their insurance providers.

To claw that money back, expect the G300 to be WHOOPASS, but also expect a very high premium: a card at least as pricey as a GTX 295.
Posted on Reply
#20
icon1
oh, this card will give ATI some headache..
Posted on Reply
#21
crazy pyro
As long as the midrange is around the hundred quid mark I don't mind, high end will always be horrifically highly priced.
Posted on Reply
#22
a_ump
I guess nvidia decided to step it up this time and kill ATI. They were surprised by the HD 4xxx series, so the way I see it, they're doing all they can to beat ATI back down instead of having ATI on NV's ass. I wonder, though: is it possible for there to be too many shaders, such that not all of them can be utilized, kind of like how a quad core is barely utilized in games?
Posted on Reply
#23
Animalpak
No way for ati to beat nvidia in performance forget this !!

Nvidia has THE DRIVERS...

ATi has nothing compared to nvidia in terms of driver optimization.
Posted on Reply
#24
a_ump
AnimalpakNo way for ati to beat nvidia in performance forget this !!

Nvidia has THE DRIVERS...

ATi has nothing compared to nvidia in terms of driver optimization.
Dude, where have you been? ATI for the past year has been all over nvidia's ass; they held the performance crown for a while with the HD 4870 X2. And yeah, nvidia does have the most powerful single-core graphics card, but either way it still took two GPUs on one card to get the crown back. And yeah, nvidia does tend to have better drivers at launch than ATI, but they also have a lot more access to games (TWIMTBP).

Drivers are important, but with this release I think we'll see the past repeat itself. Not like 2900 XT vs. 8800 GTX, but more like 9800 GTX vs. HD 3870. The GTX 285 is one hell of a card, with 240 SPs and a 512-bit GDDR3 bus. GT300 is going to have more than twice as many SPs, and twice the bandwidth by moving to GDDR5. I just can't see this chip not destroying ATI's RV870, which I estimate, going by the specs, will only perform 25-35% faster than the HD 4870.
Posted on Reply
#25
R_1
The new HD 5870 will be like two HD 4770s in CrossFire on a single die, but equipped with a 256-bit memory bus and faster clocks. So the G300 must be pretty impressive to beat this little beast.
Posted on Reply