Thursday, July 9th 2009

NVIDIA GeForce G210 and GeForce GT 220 Now Official

Earlier this week we informed you of the existence of two upcoming 40 nm NVIDIA parts, the GeForce GT 220 and GeForce G210. We also gave a hypothetical release date of "early Q4"; as it turns out, we were wrong in a good way: both cards are already official and listed on NVIDIA's website.
Both cards support DirectX 10.1, OpenGL 3.0 and CUDA. The G210 has analog VGA, DisplayPort and DVI outputs, while the GT 220 has VGA, HDMI and DVI. The GeForce G210 has 16 processor cores and a 589 MHz core clock; that's paired with 512 MB of DDR2 memory on a 64-bit interface at 500 MHz. Its shaders run at 1402 MHz. As for the NVIDIA GeForce GT 220, it has 48 processor cores and a 615 MHz core clock, paired with 1 GB of GDDR3 memory at 790 MHz on a 128-bit interface. It has a slightly slower shader clock of 1335 MHz. Neither card is expected to be available directly to consumers; both offerings are marked as OEM products and meant to be entry-level options in pre-built PCs.

NVIDIA GeForce G210 (OEM Product) Specs
NVIDIA GeForce GT 220 (OEM Product) Specs
Source: NVIDIA
Add your own comment

17 Comments on NVIDIA GeForce G210 and GeForce GT 220 Now Official

#1
newtekie1
Semi-Retired Folder
Doesn't seem like the GT220 would be that bad of an entry level card. The G210 would probably make a decent Physx card...
#2
Semi-Lobster
They look like really interesting cards; it's a shame we won't be able to do a proper review of them for a while since they're OEM-only cards.
#3
Mussels
Freshwater Moderator
it's funny that the really low performance "should be onboard VGA" cards have DX10.1, and mainstream nvidia don't.


at least they have HDMI and DVI even when using the low profile bracket, that's good.
#4
wolf
Better Than Native
Could be an interesting card to use for physx/folding and extra monitors (HDMI :rockout:), and could be chopped down to fit into a 4x pci-e slot too from what I've seen.

EDIT: I've been thinking of buying a single slot card to do just that for my Asus P6T deluxe; packing 48 shaders, not much heat and no need for extra power, it makes a good case for itself.
#5
btarunr
Editor & Senior Moderator
wolf: Could be an interesting card to use for physx/folding and extra monitors (HDMI :rockout:), and could be chopped down to fit into a 4x pci-e slot too from what I've seen.
x1 too.
#6
Mussels
Freshwater Moderator
i'm waiting for Nv to release a 'physics' card - GF 8200 GPU (or better, whatever works), no monitor outputs, 128MB of RAM, PCI-E 1x, small - purely for PhysX (or F@H, lol)
#7
wolf
Better Than Native
How many PPD would 16 sp's crunch? that would be the really lol part :P i'd say the best starting point for such a card would be 32 sp's, they alone can make a decent contribution and indeed game (8600GT)

On paper the GT220 seems to be better than an 8600GT, so it will even game well at low enough res/settings. We've seen what Nvidia's ION can do in terms of low... LOW end gaming; this card paired with any modern desktop CPU is a to-the-wire budget gamer's delight imo.
#8
newtekie1
Semi-Retired Folder
Mussels: it's funny that the really low performance "should be onboard VGA" cards have DX10.1, and mainstream nvidia don't.
Not really surprising. When the mainstream cores were developed DX10.1 was useless, and even today it is arguably pointless to consider. The difference is pretty unnoticeable, especially to the average consumer, and I couldn't even tell you which games support DX10.1. Essentially DX10.1 was a marketing gimmick.
#9
Mussels
Freshwater Moderator
newtekie1: Not really surprising. When the mainstream cores were developed DX10.1 was useless, and even today it is arguably pointless to consider. The difference is pretty unnoticeable, especially to the average consumer, and I couldn't even tell you which games support DX10.1. Essentially DX10.1 was a marketing gimmick.
wait til DX11 hits and the backward compatibility trickles down - i'll be enjoying my free AA :P
#10
newtekie1
Semi-Retired Folder
Mussels: wait til DX11 hits and the backward compatibility trickles down - i'll be enjoying my free AA :P
What backwards compatibility? The only backwards compatibility DX11 will have is the same backwards compatibility DX10 had...
#11
Mussels
Freshwater Moderator
newtekie1: What backwards compatibility? The only backwards compatibility DX11 will have is the same backwards compatibility DX10 had...
DX11 games will work on DX10 and 10.1 cards, with features disabled. so i'm getting the DX10.1 features enabled, and nvidia users won't.

To clarify: one exe runs DX 11, 10.1 and 10.0 - the game merely disables any features that your card doesn't support. MS learned from their mistakes, and they don't want another DX10 fiasco.
#12
newtekie1
Semi-Retired Folder
Mussels: DX11 games will work on DX10 and 10.1 cards, with features disabled. so i'm getting the DX10.1 features enabled, and nvidia users won't.
Only if the game developers actually support DX10.1 features in the game. Not all DX11 games will support DX10.1; DX11 doesn't guarantee DX10.1. If they continue on their trend of laziness, then DX10 is all we are going to get.

The developers have to code in a rendering path for DX10.1 before it can be used. Essentially, every DX11 game will have to have 4 rendering paths coded for it: DX11, DX10.1, DX10, and DX9. That is a real pain in the ass. I'm going to guess we will be lucky to see DX10.1 and DX9 actually supported in DX11 games though. I'm guessing they will continue to only support DX10 and DX11.

And beyond that, who really cares about DX10.1 anyway? I can't even see the difference between DX10 and DX10.1 in the few games that actually support DX10.1.
#13
Mussels
Freshwater Moderator
newtekie1: Only if the game developers actually support DX10.1 features in the game. Not all DX11 games will support DX10.1; DX11 doesn't guarantee DX10.1. If they continue on their trend of laziness, then DX10 is all we are going to get.

The developers have to code in a rendering path for DX10.1 before it can be used. Essentially, every DX11 game will have to have 4 rendering paths coded for it: DX11, DX10.1, DX10, and DX9. That is a real pain in the ass. I'm going to guess we will be lucky to see DX10.1 and DX9 actually supported in DX11 games though. I'm guessing they will continue to only support DX10 and DX11.

And beyond that, who really cares about DX10.1 anyway? I can't even see the difference between DX10 and DX10.1 in the few games that actually support DX10.1.
DX11 games support it from the get go, you're missing the point.

When they make DX11 games, all this stuff is included as part of DX11.
It's like running a source game; they have a drop down for DX7, 8.1 and 9.0C.

DX10.1 is mostly speed boosts, particularly with AA. you won't "see" a difference.
#14
newtekie1
Semi-Retired Folder
Mussels: DX11 games support it from the get go, you're missing the point.

When they make DX11 games, all this stuff is included as part of DX11.
It's like running a source game; they have a drop down for DX7, 8.1 and 9.0C.

DX10.1 is mostly speed boosts, particularly with AA. you won't "see" a difference.
No, DX11 games do not support DX10 or DX10.1 by default. The rendering paths need to be included by the developers for the game to support DX10/.1. Unless they have changed something in DX11 that makes DX11 include DX10 in the standard (and they may have, I don't know).

The source engine has rendering paths all the way back to DX7, but not all DX9.0C games support DX7 or DX8 natively. It doesn't work like that; the developers have to add support for those versions of DX manually. With the source engine, that was pretty easy since DX8 was in use when development was started. So they started development based on DX8, with DX7 support, then as they developed it, DX9 came out and they added support for that. Now DX9 included everything you need to run DX8 and DX7 games, as DX9 kept the library files for DX8 and DX7, but the game still had to support using those library files.

However, with DX10, it was a whole new API that didn't natively include support for DX9 and earlier. I was under the impression that DX11 will be like this also. It will be a completely new API that doesn't include the DX10/.1 library files. (Again, I could be wrong here.) But even if DX11 does include the DX10/.1 API, the game developers still have to manually code the game to use them.

And DX10.1 is mostly performance improvements, you're correct. However, the nVidia cards don't need the performance boosts to outperform the ATi cards that have them, so your point is kind of moot anyway...:D
#15
Mussels
Freshwater Moderator
newtekie1: No, DX11 games do not support DX10 or DX10.1 by default. The rendering paths need to be included by the developers for the game to support DX10/.1. Unless they have changed something in DX11 that makes DX11 include DX10 in the standard (and they may have, I don't know).

That's how it will work. MS knows that making DX11 games DX11-only would be a killer for 7, so they've made it so that DX10 cards (and 10.1) will all be able to run DX11 games out of the box.
newtekie1: And DX10.1 is mostly performance improvements, you're correct. However, the nVidia cards don't need the performance boosts to outperform the ATi cards that have them, so your point is kind of moot anyway...:D
*shakes fist* let's not turn this into a red vs green fight :P
#16
newtekie1
Semi-Retired Folder
Mussels: That's how it will work. MS knows that making DX11 games DX11-only would be a killer for 7, so they've made it so that DX10 cards (and 10.1) will all be able to run DX11 games out of the box.
That is pretty obvious, no game developer in their right mind would limit a game to DX11 only. I know that DX11 games will run on DX10 cards, but will they all have DX10.1 support? Has Microsoft required this? Because if they have only required DX10 support, then that is all we will get. I didn't think the DX10 API was part of DX11, or is it?
Mussels: *shakes fist* let's not turn this into a red vs green fight :P
Not trying to turn it into a red vs. green, just stating a truth. You are correct that DX10.1 is mostly performance related, so it is kind of pointless to worry about if the DX10 cards are outperforming the DX10.1 card.
#17
Mussels
Freshwater Moderator
i'm confident MS has required 10.1 to be included in the 11 specs.

The only reason the features included in 10.1 weren't in 10 originally was because nvidia couldn't do it. ATI will simply have better AA performance than nvidia until they come out with new cards.