Wednesday, September 30th 2009

NVIDIA GT300 "Fermi" Detailed

NVIDIA's upcoming flagship graphics processor is going by a lot of codenames. While some call it GF100 and others GT300 (based on the present nomenclature), what is certain is that NVIDIA has given the architecture the internal name "Fermi", after the Italian physicist Enrico Fermi, who built the world's first nuclear reactor. It comes as little surprise, then, that according to some sources the board itself carries the codename "Reactor".

Based on information gathered so far about GT300/Fermi, here's what's packed into it:
  • Transistor count of over 3 billion
  • Built on the 40 nm TSMC process
  • 512 shader processors (which NVIDIA may refer to as "CUDA cores")
  • 32 cores per core cluster
  • 384-bit GDDR5 memory interface
  • 1 MB L1 cache memory, 768 KB L2 unified cache memory
  • Up to 6 GB of total memory, 1.5 GB can be expected for the consumer graphics variant
  • Half-speed IEEE 754 double-precision floating point (relative to single precision)
  • Native support for execution of C (CUDA), C++, and Fortran (see the minimal CUDA sketch below); support for DirectCompute 11, DirectX 11, OpenGL 3.1, and OpenCL
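As a rough illustration of what "C for CUDA" looks like in practice, here is a minimal SAXPY example. The kernel and launch syntax are standard CUDA rather than anything Fermi-specific, and the 256-thread block size is just an arbitrary choice for the sketch.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// y = a*x + y, computed with one thread per element.
__global__ void saxpy(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host data.
    float* hx = (float*)malloc(bytes);
    float* hy = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device data.
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", hy[0]);  // expect 4.0

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}
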
Update: Here's an image of a graphics card based on the Fermi architecture, added from the ongoing public webcast of the GPU Technology Conference.
Source: Bright Side of News

205 Comments on NVIDIA GT300 "Fermi" Detailed

#76
AddSub
W1zzard: You will pay $25 for nothing? Send it to funds@techpowerup.com
Situation is that bad, eh? May I recommend injecting ads into every post on TPU as a signature and banning people who use ad-blockers (like TechReport) :D
Posted on Reply
#77
happita
Binge: Still not monopolizing the market, so your whining won't do anything. This is not the thread for GPU war conspiracy theories.
No monopolizing, correct, but they are involved in highly unethical business practices. And if this rumor from Zone is true, forcing a company to not allow a game to utilize a certain DX level just because Nvidia doesn't support it is pretty f'ed up. I could understand if it was an Nvidia-only thing, but it's freakin' DirectX!! The consumer gets f'ed in the end. That is not the way to persuade customers to go and buy their product over the competition's. But I'm sure AMD/ATI does the same, and the light just doesn't get shed on them because they're the "small guy". I WANT TRANSPARENCY ON BOTH SIDES DAMMIT!!!
Posted on Reply
#78
eidairaman1
The Exiled Airman
AddSub: Situation is that bad, eh? May I recommend injecting ads into every post on TPU as a signature and banning people who use ad-blockers (like TechReport) :D
screw you dude :nutkick::slap:
Posted on Reply
#79
Benetanegia
ZoneDymo: Is it a rumour that Nvidia bought Ageia and left all Ageia card owners for dead?
They were not that many people; around 10,000 PhysX PPUs were sold. And sometimes a company has to do what's best for the most. Spending as much to support 10,000 cards as you do to support 100+ million GPUs makes no sense at all; it's spending twice for nothing in the big scheme of things. I'm not saying that was good, and it's a pity for those who bought the card, but those who want PhysX acceleration can now have it for free by just going Nvidia in their next purchase, and all those who already had an Nvidia card got it for free.
Is it a rumour that Nvidia made sure that an ATI + Nvidia (for PhysX) configuration is no longer do-able?
It wasn't doable in every OS anyway, and did Ati want to share the QA costs? Would Ati deliver Nvidia their newer, unreleased cards, so that Nvidia could test compatibility and create the drivers before they were launched? Or would they have had to wait, with problematic drivers and taking all the blame for badly functioning setups?
Is it a rumour that AA does not work on ATI cards in Batman AA even though it is able to do so?
BS. And it has been discussed to death in other threads. Ati cards don't do that kind of AA, which was added exclusively for Nvidia, paid for by Nvidia and QA'd by Nvidia. It even has the Nvidia AA label written all over it.
The only rumour is that Nvidia made Ubisoft remove DX10.1 support from AC because AC was running too awesome on it with, of course, ATI cards (Nvidia does not support DX10.1)
BS again. If Nvidia didn't want DX10.1 in that game, it would never have been released with DX10.1 to begin with. The HD3xxx cards had been released months before the game launched, so they already knew how they were going to perform; it just takes a trip to the local store to buy a damn card FFS! :laugh:
Posted on Reply
#80
Atom_Anti
eidairaman1: Since they call this card Reactor, I guess it will be as hot as a nuclear reactor under meltdown conditions, aka GF 5800.
Yeah, yeah, like the Chernobyl reactor did in 1986 :wtf:.
Anyway, there is still no release date, so Nvidia may not be able to make it :rolleyes:. Remember what happened with 3dfx: they were the biggest and most famous 3D VGA maker, but they could not come up with the Rampage soon enough. Maybe Reactor = Rampage :laugh:.
Posted on Reply
#81
newtekie1
Semi-Retired Folder
ZoneDymo: Rumours?
Is it a rumour that Nvidia bought Ageia and left all Ageia card owners for dead?
Is it a rumour that Nvidia made sure that an ATI + Nvidia (for PhysX) configuration is no longer do-able?
Is it a rumour that AA does not work on ATI cards in Batman AA even though it is able to do so?

The only rumour is that Nvidia made Ubisoft remove DX10.1 support from AC because AC was running too awesome on it with, of course, ATI cards (Nvidia does not support DX10.1)
Ageia was on the brink of bankruptcy when nVidia bought them. If Ageia went under, where would it have left the Ageia card owners then? They should all consider themselves lucky that nVidia bought Ageia and at least continued support for some time longer than what they would have gotten if Ageia had been left to die.

Who cares if you can't use PhysX with an ATi card doing the graphics? ATi pretty much assured this when they denied nVidia the ability to run PhysX natively on ATi hardware. That's right, despite all your whining, you forget that initially nVidia wanted to make PhysX run natively on ATi hardware, no nVidia hardware required. ATi was the one that shut the door on peaceful PhysX support, not nVidia.

And if you actually paid attention to the Batman issue, you would know a few things. 1.) Enabling AA on ATi cards not only doesn't actually work, it also breaks the game. 2.) nVidia paid for AA to be added to the game; the game engine does not natively support AA. If nVidia hadn't done so, AA would not have been included in the game, so ATi is no worse off. There is no reason that nVidia should allow ATi to use a feature that nVidia spent the money to develop and include in the game. So, it is a false rumor that nVidia paid to have a feature disabled for ATi cards. The truth is that they paid to have the feature added for their cards. There is a big difference between those two.
Posted on Reply
#82
aetneerg
I predict the GTX 380 will cost between $549.99 and $579.99.
Posted on Reply
#83
legends84
Oh well... I'll stick with my current GPU for now. This thing will burn my money :ohwell:
Posted on Reply
#84
KainXS
So if it's a 384-bit GDDR5 bus, then we are looking at a card with a max of 48 ROPs and 512 shaders.

At least Nvidia is being innovative. I think this card is going to be a monster when it comes out.
Posted on Reply
#85
erocker
*
newtekie1: Ageia was on the brink of bankruptcy when nVidia bought them. If Ageia went under, where would it have left the Ageia card owners then? They should all consider themselves lucky that nVidia bought Ageia and at least continued support for some time longer than what they would have gotten if Ageia had been left to die.

Who cares if you can't use PhysX with an ATi card doing the graphics? ATi pretty much assured this when they denied nVidia the ability to run PhysX natively on ATi hardware. That's right, despite all your whining, you forget that initially nVidia wanted to make PhysX run natively on ATi hardware, no nVidia hardware required. ATi was the one that shut the door on peaceful PhysX support, not nVidia.

And if you actually paid attention to the Batman issue, you would know a few things. 1.) Enabling AA on ATi cards not only doesn't actually work, it also breaks the game. 2.) nVidia paid for AA to be added to the game; the game engine does not natively support AA. If nVidia hadn't done so, AA would not have been included in the game, so ATi is no worse off. There is no reason that nVidia should allow ATi to use a feature that nVidia spent the money to develop and include in the game. So, it is a false rumor that nVidia paid to have a feature disabled for ATi cards. The truth is that they paid to have the feature added for their cards. There is a big difference between those two.
Good point with Ageia, but Ageia isn't the only company. Many of us are still burned by 3dfx. Now that sucked.

I can enable AA in CCC for the Batman demo. It works; performance sucks, but it works. There is no way a game developer should be taking kickbacks for exclusivity in games. It alienates many of their customers. There is also no way you know for sure that they couldn't have added AA for all graphics cards in Batman. Honestly, what? Are game developers going to start charging graphics card companies more money to include the color blue in their games? Ridiculous.

Fact of the matter is, though, I'm sure if we had a poll, both Nvidia and ATi owners would agree that features should be included for both cards. Continuing down this road will do nothing but make more and more features that should already be in the game exclusive to a particular brand. It needs to stop; it's bad business and, most importantly, bad for the consumer, limiting our choices.
Posted on Reply
#86
Benetanegia
Benetanegia: AFAIK that means that you can just #include C for CUDA and work with C++ like you would with any other library, and the same for Fortran. That's very good for some programmers indeed, but it only works on Nvidia hardware.
I said that, but after reading information in other places, that might be inaccurate. They're saying it can run C/C++ and Fortran code directly. That means there's no need to deal with CUDA, OpenCL or DX11 compute shaders, because you could just program something in Visual Studio and the code would just work. I don't know if that's true, but it would be amazing.
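For what it's worth, a minimal sketch of the kind of device-side C++ that compute capability 2.0 (Fermi-class) hardware was expected to allow: virtual functions and new/delete inside a kernel, still compiled through nvcc rather than an arbitrary Visual Studio project. The Shape/Square names are made up purely for illustration.

#include <cstdio>
#include <cuda_runtime.h>

// Device-side virtual dispatch and new/delete require compute capability 2.0
// (i.e. a Fermi-class GPU) and nvcc built for sm_20 or newer.
struct Shape {
    __device__ virtual float area() const { return 0.0f; }
    __device__ virtual ~Shape() {}
};

struct Square : Shape {
    float side;
    __device__ explicit Square(float s) : side(s) {}
    __device__ float area() const { return side * side; }
};

__global__ void computeArea(float* out, float side)
{
    Shape* s = new Square(side);   // heap allocation on the GPU
    *out = s->area();              // virtual call resolved on the GPU
    delete s;
}

int main()
{
    float* dOut;
    float hOut = 0.0f;
    cudaMalloc(&dOut, sizeof(float));
    computeArea<<<1, 1>>>(dOut, 3.0f);
    cudaMemcpy(&hOut, dOut, sizeof(float), cudaMemcpyDeviceToHost);
    printf("area = %f\n", hOut);   // expect 9.0
    cudaFree(dOut);
    return 0;
}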
Posted on Reply
#87
Easy Rhino
Linux Advocate
Given Nvidia's past, this will most likely be $100 more than ATI's flagship. Obviously, to the sane person, its performance would have to justify the cost. This time around, though, Nvidia is coming out after ATI, so they may indeed have to keep their prices competitive.
Posted on Reply
#88
Benetanegia
erocker: Good point with Ageia, but Ageia isn't the only company. Many of us are still burned by 3dfx. Now that sucked.

I can enable AA in CCC for the Batman demo. It works; performance sucks, but it works. There is no way a game developer should be taking kickbacks for exclusivity in games. It alienates many of their customers. There is also no way you know for sure that they couldn't have added AA for all graphics cards in Batman. Honestly, what? Are game developers going to start charging graphics card companies more money to include the color blue in their games? Ridiculous.

Fact of the matter is, though, I'm sure if we had a poll, both Nvidia and ATi owners would agree that features should be included for both cards. Continuing down this road will do nothing but make more and more features that should already be in the game exclusive to a particular brand. It needs to stop; it's bad business and, most importantly, bad for the consumer, limiting our choices.
Stop the bitching already. Fact is that 10+ games have been released using UE3 and none of them had in-game AA. If you wanted AA, you had to enable it in the control panel, whether you had Ati or Nvidia. With this game, Nvidia asked the developer to add AA and helped develop and quality-assure the feature. AMD didn't even contact the developer, so should they get something they didn't pay for? Should the developers risk their reputation by releasing something that has not had any QA*? Or should Nvidia pay so that the developer does QA on AMD's cards? AMD is deliberately not helping developers, yet they expect to get all the benefits; is that moral? Not in my book. The only reason GPUs are sold is because of games, so helping the ones that are helping you is the natural thing to do.

Regarding your last paragraph, I would say yes, but only if both GPU developers were helping to include the features.

* The feature breaks the game, so instead of the crap about Nvidia disabling the feature, we would be hearing crap about Nvidia paying for the game to crash with AMD cards. This crap will never end.
Posted on Reply
#89
eidairaman1
The Exiled Airman
All I can say is it's hurting their sales: with the TWIMTBP badge on it, the game isn't getting the expected sales from AMD users. With that badge they get paid a little here and there. Business practices like this make me glad I switched to ATI back in 2002.
Posted on Reply
#90
yogurt_21
happita: And I was wondering why I was seeing L1 and L2 caches on an upcoming video card. I thought I was going crazy hahaha.

This will be VERY interesting. If the price is right, I may skip the 5K series and go GT300 :rockout:
Yeah, I thought that was odd as well. I'm very curious to see the performance of these bad boys. The way things are going, it will be a while before I have the cash for a new card anyway, so I might as well wait and see what both sides have to offer.
eidairaman1: All I can say is it's hurting their sales: with the TWIMTBP badge on it, the game isn't getting the expected sales from AMD users. With that badge they get paid a little here and there.
Funny, even when I had my 2900 XT I barely noticed the TWIMTBP badge, either on the case or in the loading of the game. I've yet to find a game that will not work on either side's cards, and I never buy a game because it works better on one than another. I hope most of you don't either. I buy games that I like for storyline, graphics, gameplay, and replayability. Other reasons make no sense to me.
Posted on Reply
#91
aj28
Prices will be competitive because they have to be, but nVidia is going to lose big due to the cost of producing this behemoth. And what is their plan for scaling? All of this talk about a high-end chip while they're completely mum about everything else. Are they going to re-use G92 again, or just elect not to compete?

Once Juniper comes out (still scheduled Q4 2009), expect nVidia to shed a few more of their previously-exclusive AIB partners...
Posted on Reply
#92
erocker
*
Benetanegia: Stop the bitching already. Fact is that 10+ games have been released using UE3 and none of them had in-game AA. If you wanted AA, you had to enable it in the control panel, whether you had Ati or Nvidia. With this game, Nvidia asked the developer to add AA and helped develop and quality-assure the feature. AMD didn't even contact the developer, so should they get something they didn't pay for? Should the developers risk their reputation by releasing something that has not had any QA*? Or should Nvidia pay so that the developer does QA on AMD's cards? AMD is deliberately not helping developers, yet they expect to get all the benefits; is that moral? Not in my book. The only reason GPUs are sold is because of games, so helping the ones that are helping you is the natural thing to do.

Regarding your last paragraph, I would say yes, but only if both GPU developers were helping to include the features.

* The feature breaks the game, so instead of the crap about Nvidia disabling the feature, we would be hearing crap about Nvidia paying for the game to crash with AMD cards. This crap will never end.
I'm not bitching, you're bitching. :p (You are also walking on thin ice with that comment. I am free to express my views on this forum like anyone else.) I'm stating how I see it. Tell me, almighty one, do you work for Nvidia? Do you know if Nvidia is approaching the developers, or is it the other way around? Is Nvidia paying the developers to add features to the games for their cards, or are they throwing developers cash to keep features exclusive to their cards? Unless you work in one of their back rooms, you have no idea.

*Ugh, I'm sounding like a conspiracy theorist. I hate conspiracy theorists. I'll shut up and just not play this game. :toast:
Posted on Reply
#93
VanguardGX
aj28: Prices will be competitive because they have to be, but nVidia is going to lose big due to the cost of producing this behemoth. And what is their plan for scaling? All of this talk about a high-end chip while they're completely mum about everything else. Are they going to re-use G92 again, or just elect not to compete?

Once Juniper comes out (still scheduled Q4 2009), expect nVidia to shed a few more of their previously-exclusive AIB partners...
NV is just gonna recycle old G92/GT200 Parts to fill that gap!!!
Posted on Reply
#94
DaJMasta
Native Fortran support!!!!!!

YESSSSSS!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

:ohwell:
Posted on Reply
#95
Benetanegia
erocker: I'm not bitching, you're bitching. :p I'm stating how I see it. Tell me, almighty one, do you work for Nvidia? Do you know if Nvidia is approaching the developers, or is it the other way around? Is Nvidia paying the developers to add features to the games for their cards, or are they throwing developers cash to keep features exclusive to their cards? Unless you work in one of their back rooms, you have no idea.
Do you know anything about the presumption of innocence? When there's no proof of guilt, the natural reaction is to presume innocence. That's what an honest, non-biased person thinks. And that's what any legal system is based on.

You have as much proof of it happening as I do of it not happening. Have you any proof other than that it runs better on one than the other? So you're right and I'm not, because...? Enlighten me.

I know Nvidia is playing fair, because that's what the developers say. When dozens of developers say that Nvidia is approaching them to help develop, optimize and test their games, my natural reaction is to believe them, because I don't presume that everybody lies. I don't presume that all of them take money under the table. But most importantly, I know that at least one person out of the 100 that form a game development team, out of the 100 developers that have worked under TWIMTBP, would have said something already if something shady was going on.

I've been part of a small developer that made small Java games, and I know how much it costs to optimize and make even a simple game bug-free, so it doesn't surprise me a bit that a game that has been optimized on one card runs better than on one it hasn't been optimized for*. That's the first thing that makes people jump on TWIMTBP, and given that's what they think and they have been claiming bad behavior because of it, it doesn't surprise me.

*It also happened that a game ran flawlessly on one cellphone and crashed on others, even though both should have been able to run the same Java code.
Posted on Reply
#96
Mistral
Benetanegia: ...They're saying it can run C/C++ and Fortran code directly. That means there's no need to deal with CUDA, OpenCL or DX11 compute shaders, because you could just program something in Visual Studio and the code would just work. I don't know if that's true, but it would be amazing.
As amazingly awesome as that is, save for some very rare cases I question the utility of this. Besides raising the transistor count (almost wrote trannies there...), what use is that to the gaming population? And I doubt even pros would have much reason to jump on it.

L1 and L2 caches sound like fun though...
Posted on Reply
#97
DaedalusHelios
erocker: I'm not bitching, you're bitching. :p (You are also walking on thin ice with that comment. I am free to express my views on this forum like anyone else.) I'm stating how I see it. Tell me, almighty one, do you work for Nvidia? Do you know if Nvidia is approaching the developers, or is it the other way around? Is Nvidia paying the developers to add features to the games for their cards, or are they throwing developers cash to keep features exclusive to their cards? Unless you work in one of their back rooms, you have no idea.

*Ugh, I'm sounding like a conspiracy theorist. I hate conspiracy theorists. I'll shut up and just not play this game. :toast:
Erocker, I think his "bitching" response was directed at that portion of the community as a whole. I know you feel a little burned by the trouble you had with PhysX configurations. It really did take extra time to program selective AA into the game; I think we all agree on that, right? With that being said, we know that Nvidia helped fund the creation of the game from the start. So if the developer says, "Nvidia, that money helped so much in making our production budget that we will engineer special selective AA for your hardware line," is that morally wrong? Do you say Nvidia cannot pay for extra engineering to improve the experience of their own end users? If Nvidia said they would send ham sandwiches to everybody who bought Nvidia cards in the last year, would you say that it's not fair unless they send them to ATi users too?

What it comes down to is that it was a game without selective AA. Nvidia helps develop the game and gets a special feature for their hardware line when running the game. Where is the moral dilemma?
Posted on Reply
#98
Benetanegia
Mistral: As amazingly awesome as that is, save for some very rare cases I question the utility of this. Besides raising the transistor count (almost wrote trannies there...), what use is that to the gaming population? And I doubt even pros would have much reason to jump on it.

L1 and L2 caches sound like fun though...
Who says that in 2009 a GPU is a gaming-only device? It never has been anyway. Nvidia has the GeForce, Tesla and Quadro brands, and all of them are based on the same chip. As long as the graphics card is competitive, do you care about anything else? You shouldn't. And apart from this, the ability to run C++ code can help in all kinds of applications. Have you never encoded a video? Wouldn't you like to be able to do it 20x faster?

Pros, on the other hand, have 1000s and every reason to jump into something like this.
Posted on Reply
#99
Tatty_Two
Gone Fishing
qubit: Shame that bus isn't a full 512 bits wide. They've increased the bandwidth with GDDR5, yet taken some back with a narrower bus. Also, 384 bits has the consequence that the amount of RAM is an odd size, like on the 8800 GTX, when it would really be best at a power of two.
GDDR5 effectively doubles the bandwidth per pin in any case (unlike GDDR3); there is no card on the planet that will be remotely hampered by what is effectively 768-bit-class throughput. Really, with GDDR5 there is absolutely no need to go to the additional expense.
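As a rough sketch of that arithmetic (the 1.0 GHz command clock, i.e. 4 Gbps effective per pin, is purely an assumed figure for illustration, not a known GT300 spec):

#include <cstdio>

// Theoretical peak memory bandwidth = (bus width in bytes) x (effective data rate per pin).
// GDDR5 transfers 4 bits per pin per command-clock cycle versus 2 for GDDR3,
// which is where the "effectively doubles the bandwidth" figure comes from.
int main()
{
    const double busWidthBits = 384.0;  // rumoured GT300 memory interface
    const double dataRateGbps = 4.0;    // assumed 1.0 GHz GDDR5 command clock -> 4 Gbps per pin
    const double bandwidthGBs = (busWidthBits / 8.0) * dataRateGbps;
    printf("Peak bandwidth: %.1f GB/s\n", bandwidthGBs);  // 192.0 GB/s
    return 0;
}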
Posted on Reply
#100
Benetanegia
DaedalusHelios: Erocker, I think his "bitching" response was directed at that portion of the community as a whole.
Yeah that's true. Sorry Erocker if it seemed directed at you.

It's just that the subject is being brought up again every two posts and, really, it has already been explained by many members. I don't think it's crazy to believe in the innocence of thousands of developers (individuals) who, IMO, are being insulted by the people who presume guilt. TBH, I get angry because of that.
Posted on Reply