Wednesday, September 30th 2009
NVIDIA GT300 "Fermi" Detailed
NVIDIA's upcoming flagship graphics processor is going by a lot of codenames. While some call it the GF100, others GT300 (based on the present nomenclature), what is certain is that NVIDIA has given the architecture the internal name "Fermi", after the Italian physicist Enrico Fermi, the inventor of the nuclear reactor. It doesn't come as a surprise, then, that according to some sources the board itself is going to carry the codename "reactor".
Source: Bright Side of News
Based on information gathered so far about GT300/Fermi, here's what's packed into it:
- Transistor count of over 3 billion
- Built on the 40 nm TSMC process
- 512 shader processors (which NVIDIA may refer to as "CUDA cores")
- 32 cores per core cluster
- 384-bit GDDR5 memory interface
- 1 MB L1 cache memory, 768 KB L2 unified cache memory
- Up to 6 GB of total memory, 1.5 GB can be expected for the consumer graphics variant
- IEEE 754 double-precision floating point at half the single-precision rate
- Native support for execution of C (CUDA), C++, Fortran, support for DirectCompute 11, DirectX 11, OpenGL 3.1, and OpenCL
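As a bit of context for that last bullet point: "native support for execution of C (CUDA)" simply means the shader processors run kernels written in plain C. The following is only a generic, illustrative CUDA sketch (the kernel name, buffer sizes, and launch configuration are arbitrary assumptions, not anything specific to Fermi):

#include <cstdio>
#include <cuda_runtime.h>

// Generic C-for-CUDA kernel: each thread adds one pair of elements.
__global__ void vector_add(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;                      // arbitrary example size
    const size_t bytes = n * sizeof(float);

    // Host buffers
    float *h_a = new float[n], *h_b = new float[n], *h_c = new float[n];
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    // Device buffers
    float *d_a, *d_b, *d_c;
    cudaMalloc((void **)&d_a, bytes);
    cudaMalloc((void **)&d_b, bytes);
    cudaMalloc((void **)&d_c, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    // 256 threads per block is just a common choice, nothing Fermi-specific.
    const int threads = 256;
    const int blocks  = (n + threads - 1) / threads;
    vector_add<<<blocks, threads>>>(d_a, d_b, d_c, n);

    cudaMemcpy(h_c, d_c, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", h_c[0]);              // expect 3.000000

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_c);
    delete[] h_a; delete[] h_b; delete[] h_c;
    return 0;
}

Whichever of the 512 shader processors the scheduler picks, each one simply executes instances of a kernel like this; that is essentially all the "CUDA cores" label describes.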
205 Comments on NVIDIA GT300 "Fermi" Detailed
Anyway, there is still no release date, so NVIDIA may not be able to make it :rolleyes:. Remember what happened with 3dfx: they were the biggest and most famous 3D graphics card maker, but they could not come up with the Rampage soon enough. Maybe Reactor = Rampage :laugh:.
Who cares if you can't use PhysX with an ATi card doing the graphics? ATi pretty much assured this when they refused to let nVidia run PhysX natively on ATi hardware. That's right, despite all your whining, you forget that initially nVidia wanted to make PhysX run natively on ATi hardware, no nVidia hardware required. ATi was the one that shut the door on peaceful PhysX support, not nVidia.
And if you actually paid attention to the Batman issue, you would know a few things. 1.) Enabling AA on ATi cards not only doesn't actually work, it also breaks the game. 2.) nVidia paid for AA to be added to the game; the game engine does not natively support AA. If nVidia hadn't done so, AA would not have been included in the game at all, so ATi is no worse off. There is no reason that nVidia should allow ATi to use a feature that nVidia spent the money to develop and include in the game. So it is a false rumor that nVidia paid to have a feature disabled for ATi cards. The truth is that they paid to have the feature added for their own cards. There is a big difference between those two.
At least nVidia is being innovative. I think this card is going to be a monster when it comes out.
I can enable AA in CCC for the Batman demo. It works; performance sucks, but it works. There is no way a game developer should be taking kickbacks for exclusivity in games. It alienates many of their customers. There is also no way you know for sure that they couldn't have added AA for all graphics cards in Batman. Honestly, what? Are game developers going to start charging graphics card companies more money to include the color blue in their games? Ridiculous.
Fact of the matter is, though, I'm sure if we had a poll, both NVIDIA and ATi owners would agree that features should be included for both cards. Continuing down this road will do nothing but make more and more features that should already be in the game exclusive to a particular brand. It needs to stop; it's bad business and, most importantly, bad for the consumer, limiting our choices.
Regarding your last paragraph, I would say yes, but only if both GPU developers were helping to include the features.
* The feature breaks the game, so instead of the crap about nVidia disabling the feature, we would be hearing crap about nVidia paying for the game to crash on AMD cards. This crap will never end.
Once Juniper comes out (still scheduled for Q4 2009), expect nVidia to shed a few more of their previously-exclusive AIB partners...
*Ugh, I'm sounding like a conspiracy theorist. I hate conspiracy theorists. I'll shut up and just not play this game. :toast:
YESSSSSS!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
:ohwell:
You have as much proof of that happening as I do of it not happening. Have you any proof except that it runs better on one than on the other? So you're right and I'm not, because...? Enlighten me.
I know NVIDIA is playing fair, because that's what the developers say. When dozens of developers say that NVIDIA is approaching them to help them develop, optimize and test their games, my natural reaction is to believe them, because I don't presume that all people lie. I don't presume that all of them take money under the table. But most importantly, I know that out of the ~100 people that form a game development team, across all the teams that have developed under TWIMTBP, at least one would have said something already if something shady was happening.
I've been part of a small developer that did small Java games, and I know how much it costs to optimize and make even a simple game bug free, so it doesn't surprise me a bit that a game that has been optimized on one card runs better than on the one it hasn't been optimized on*. That's the first thing that makes people jump on TWIMTBP, and since that's what they think, it doesn't surprise me that they've been claiming bad behavior because of it.
*It also happened that one game ran flawlessly on one cellphone and crashed on others, even though both should have been able to run the same Java code.
L1 and L2 caches sound like fun though...
What it comes down to is that it was a game without selectable AA. nVidia helps develop the game and gets a special feature for their hardware line when running the game. Where is the moral dilemma?
Pros, on the other hand, have 1000s and every reason to jump into something like this.
It's just that the subject is being brought up again every 2 posts, and really, it has already been explained by many members. I don't think it's crazy to believe in the innocence of 1000s of developers (individuals), who IMO are being insulted by the people that presume guilt. TBH, I get angry because of that.