Thursday, July 9th 2009
NVIDIA GeForce G210 and GeForce GT 220 Now Official
Earlier this week we informed you of the existence of two upcoming 40 nm NVIDIA parts, the GeForce GT 220 and GeForce G210. We also gave a hypothetical release date of "early Q4"; well, we lied to you in a good way, as both cards are already official and listed on NVIDIA's page.
Both cards support DirectX 10.1, OpenGL 3.0 and CUDA. The G210 has analog VGA, DisplayPort and DVI outputs, while the GT 220 offers VGA, HDMI and DVI. The GeForce G210 has 16 processor cores and a 589 MHz core clock, paired with 512 MB of DDR2 memory on a 64-bit interface clocked at 500 MHz; its shaders run at 1402 MHz. The GeForce GT 220 has 48 processor cores and a 615 MHz core clock, paired with 1 GB of GDDR3 memory at 790 MHz on a 128-bit interface, with a slightly slower shader clock of 1335 MHz. Neither card is expected to be available directly to consumers; both are marked as OEM products meant to serve as entry-level options in pre-built PCs.
NVIDIA GeForce G210 (OEM Product) Specs
NVIDIA GeForce GT 220 (OEM Product) Specs
Source:
NVIDIA
17 Comments on NVIDIA GeForce G210 and GeForce GT 220 Now Official
At least they have HDMI and DVI even when using the low-profile bracket, that's good.
EDIT: I've been thinking of buying a single-slot card to do just that for my Asus P6T Deluxe. Packing 48 shaders, not much heat and no need for extra power, it makes a good case for itself.
On paper the GT 220 seems to be better than an 8600 GT, so it will even game well at low enough resolutions/settings. We've seen what NVIDIA's ION can do in terms of low... LOW end gaming; this card paired with any modern desktop CPU is a to-the-wire budget gamer's delight imo.
To clarify: one exe runs DX11, 10.1 and 10.0 - the game merely disables any features that your card doesn't support. MS learned from their mistakes, and they don't want another DX10 fiasco.
The developers have to code in a rendering path for DX10.1 before it can be used. Essentially, every DX11 game would have to have four rendering paths coded for it: DX11, DX10.1, DX10 and DX9. That is a real pain in the ass. I'm going to guess we will be lucky to see DX10.1 and DX9 actually supported in DX11 games, though; my guess is they will continue to only support DX10 and DX11.
And beyond that, who really cares about DX10.1 anyway? I can't even see the difference between DX10 and DX10.1 in the few games that actually support DX10.1.
When they make DX11 games, all this stuff is included as part of DX11.
It's like running a Source game; they have a drop-down for DX7, 8.1 and 9.0c.
DX10.1 is mostly speed boosts, particularly with AA. You won't "see" a difference.
The Source engine has rendering paths all the way back to DX7, but not all DX9.0c games support DX7 or DX8 natively. It doesn't work like that; the developers have to add support for those versions of DX manually. With the Source engine, that was pretty easy, since DX8 was in use when development started. So they started development based on DX8 with DX7 support, then as they developed it, DX9 came out and they added support for that. DX9 included everything you need to run DX8 and DX7 games, as it kept the library files for DX8 and DX7, but the game still had to support using those library files.
However, DX10 was a whole new API that didn't natively include support for DX9 and earlier. My understanding is that DX11 will be like this also: a completely new API that doesn't include the DX10/10.1 library files. (Again, I could be wrong here.) But even if DX11 does include the DX10/10.1 API, game developers still have to manually code the game to use them.
And DX10.1 is mostly performance improvements, you're correct. However, the nVidia cards don't need the performance boosts to outperform the ATi cards that have them, so your point is kind of moot anyway...:D
The only reason the features included in 10.1 weren't in 10 originally was because NVIDIA couldn't do it. ATI will simply have better AA performance than NVIDIA until they come out with new cards.