Tuesday, September 27th 2011

ZOTAC Announces GeForce GT 520 in PCI and PCIe x1 Interface Variants

ZOTAC International, a leading innovator and channel manufacturer, today expands the value GeForce GT 520 lineup with new PCI and PCI Express x1 form factors for users with pre-built systems that have limited expansion capabilities. The new ZOTAC GeForce GT 520 PCI and PCI Express x1 graphics cards breathe new life into older systems by delivering a performance punch and new video capabilities.

"Upgrading your graphics card is the easiest way to boost your system performance and gain new capabilities. The new ZOTAC GeForce GT 520 PCI and PCI Express x1 graphics cards shows that you can experience good graphics without upgrading the rest of your system," said Carsten Berger, marketing director, ZOTAC International.
The ZOTAC GeForce GT 520 PCI and PCI Express x1 graphics cards provide DVI, HDMI and VGA outputs with dual simultaneous independent display support for an instant dual-monitor upgrade. The low-profile form factor enables the ZOTAC GeForce GT 520 PCI and PCI Express x1 graphics cards to easily fit in compact pre-built systems with height limitations in addition to expansion limitations.

It's time to play with the ZOTAC GeForce GT 520 PCI and PCI Express x1 graphics card.

36 Comments on ZOTAC Announces GeForce GT 520 in PCI and PCIe x1 Interface Variants

#1
Disparia
Nice.

I've used x1/PCI cards in servers without onboard video before, leaving the valuable x16/x8 slots available for RAID controllers, NICs, etc. A niche benefit, but a benefit none the less.

Also nice as a secondary card. Better compatibility with other newer series cards than with the ancient 6000/7000/8000 x1 and PCI cards currently available.
#2
Frick
Fishfaced Nincompoop
A review of the PCI card would be cool.
#3
RejZoR
Nice. The PCI one would be cool for my cousin that has an old HP workstation. Low profile card and PCI. It's all about the price now...
#4
newtekie1
Semi-Retired Folder
The PCI version will be useful, but the PCI-E x1 version will probably be way overpriced for something I can make myself with a normal card and a dremel.
#5
Shihab
PCI...
Anyone else feeling nostalgic?
#6
Apollo565
They are both great ideas...especially the PCI!
The last Nvidia PCI card was the GeForce 6200, if I'm correct. An update is certainly due.
#7
Drone
That's nice. This is good news for people with weak machines.
#8
Yukikaze
Oh. Nice. I need about three :)
#9
JATownes
The Lurker
Would this card do PhysX? If so, that would be a nice little addition to my rig, just for fun.
#10
_JP_
JATownes: Would this card do PhysX? If so, that would be a nice little addition to my rig, just for fun.
It doesn't. See the last picture (table).
#11
qubit
Overclocked quantum bit
Having an x1 PCI-E interface is interesting and nichily useful. I'm just wondering if the x1 card can perform any worse than the standard x16 one.

Also, why put the passive heatsink on the x16 version, but an active, whiny one on the x1? It makes no sense to me; it should be the other way round, if anything.
#12
Sinzia
qubit: Having an x1 PCI-E interface is interesting and nichily useful. I'm just wondering if the x1 card can perform any worse than the standard x16 one.

Also, why put the passive heatsink on the x16 version, but an active, whiny one on the x1? It makes no sense to me; it should be the other way round, if anything.
I ran a 9800GT in an open-ended x1 slot, never had issues for browsing the web and video playback.
#13
_JP_
qubit: Also, why put the passive heatsink on the x16 version, but an active, whiny one on the x1? It makes no sense to me; it should be the other way round, if anything.
I'm wondering the exact same thing, because my x1650PRO had a cooler just like that one and it was noisy. Anyway that card is for PCI slots, not PCI-e x16.
#14
qubit
Overclocked quantum bit
_JP_: I'm wondering the exact same thing, because my x1650PRO had a cooler just like that one and it was noisy. Anyway that card is for PCI slots, not PCI-e x16.
You're right, it is PCI. I'm so used to seeing PCI-E that I missed it. :laugh: I'm surprised they're still supporting it.
#16
[H]@RD5TUFF
I own a 520, and it's a pretty stout little card for PhysX. I am eyeing that passive one for sure, perhaps as a Christmas upgrade. :D
#17
n-ster
[H]@RD5TUFF: I own a 520, and it's a pretty stout little card for PhysX. I am eyeing that passive one for sure, perhaps as a Christmas upgrade. :D
This version of the card doesn't seem to have PhysX :(
#18
Disparia
I bet it's just omitted from the spec sheet.

[H]@RD5TUFF is using one, TPU has a GT 520 review stating it's supported, and the only requirements that nVidia give are:
The minimum requirement to support GPU-accelerated PhysX is a GeForce 8-series or later GPU with a minimum of 32 cores and a minimum of 256MB dedicated graphics memory. However, each PhysX application has its own GPU and memory recommendations. In general, 512MB of graphics memory is recommended unless you have a GPU that is dedicated to PhysX.
I can't see the manufacturer having control over PhysX (or CUDA) support.
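
For the curious, a rough sketch along these lines (assuming the CUDA toolkit is installed; the cores-per-SM table only covers Fermi parts like the GT 520's GF119, so treat it as illustrative rather than definitive) could check a device against those stated minimums with cudaGetDeviceProperties:

/* Sketch: check a GPU against NVIDIA's stated PhysX minimums
 * (32+ CUDA cores, 256 MB+ dedicated memory) using the CUDA runtime API.
 * The cores-per-SM table only covers Fermi; other architectures report 0. */
#include <stdio.h>
#include <cuda_runtime.h>

static int cores_per_sm(int major, int minor)
{
    if (major == 2 && minor == 0) return 32;  /* Fermi GF100/GF110 */
    if (major == 2 && minor == 1) return 48;  /* Fermi GF119 (GT 520) */
    return 0;                                 /* unknown architecture */
}

int main(void)
{
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        fprintf(stderr, "No CUDA-capable device found\n");
        return 1;
    }

    int cores = prop.multiProcessorCount * cores_per_sm(prop.major, prop.minor);
    size_t mem_mb = prop.totalGlobalMem / (1024 * 1024);

    printf("%s: %d CUDA cores, %zu MB memory\n", prop.name, cores, mem_mb);
    printf("Meets stated PhysX minimums: %s\n",
           (cores >= 32 && mem_mb >= 256) ? "yes" : "no");
    return 0;
}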
#19
Shihab
Jizzler: I bet it's just omitted from the spec sheet.

[H]@RD5TUFF is using one, TPU has a GT 520 review stating it's supported, and the only requirements that nVidia give are: […]

I can't see the manufacturer having control over PhysX (or CUDA) support.
Maybe they found a way to disable it? I wouldn't want a passively cooled card, or one that runs on very limited bandwidth, to run PhysX calculations. It would be counterproductive IMO. Better to wait for some benches to see if it does, though.
#20
Disparia
Then they found a way to alienate me as a customer for life :)

More seriously...

I can't see performance being affected all that much by the bandwidth provided:

physxinfo.com/news/880/dedicated-physx-gpu-perfomance-dependence-on-pci-e-bandwidth/
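
For a sense of scale, a back-of-the-envelope calculation (assuming the GT 520's PCIe 2.0 link and 8b/10b encoding; real-world throughput will be lower) of theoretical one-way bandwidth at x1 versus x16:

/* Theoretical one-way PCIe 2.0 bandwidth: 5 GT/s per lane with 8b/10b
 * encoding works out to 500 MB/s per lane. */
#include <stdio.h>

int main(void)
{
    const double per_lane_mb_s = 5.0e9 * 8.0 / 10.0 / 8.0 / 1.0e6; /* 500 MB/s */
    const int lanes[] = { 1, 16 };

    for (int i = 0; i < 2; i++)
        printf("PCIe 2.0 x%-2d: %.0f MB/s\n", lanes[i], per_lane_mb_s * lanes[i]);
    return 0;
}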

If you have a truly passive system, then yes, this may be a concern. I don't, as the intake fan blows across all my peripheral cards and I've been meaning to purchase the rear-extractor for my Lian-Li case. Give a bit of overclock room for moar physics! I'd wager most of our systems have some level of airflow.

Most seriously...

I would just go ahead and buy this card, knowing that there is a chance of PhysX being disabled, and I don't endorse it for anyone else. End of disclaimer.
#21
TRWOV
I also don't think they would disable PhysX support, and even if they did, it shouldn't take more than a BIOS flash to get it back. I mean, these GT 520s must be using the same GPUs as the PCIe x16 variants, wouldn't they?
#22
dj-electric
It’s time to play with the ZOTAC GeForce GT 520 PCI and PCI Express x1 graphics card.
Hell yeah, Battlefield 3 all the freaking way.
#23
arnoo1
I'd like to see more low-end graphics cards with a PCIe x1 interface, so if my GPU stops working for any reason I can always use the PCIe x1 one.
#24
_JP_
Jizzler: I bet it's just omitted from the spec sheet.

[H]@RD5TUFF is using one, TPU has a GT 520 review stating it's supported, and the only requirements that nVidia give are: […]

I can't see the manufacturer having control over PhysX (or CUDA) support.
Uhm, you are right. My bad then.
newtekie1: The PCI version will be useful, but the PCI-E x1 version will probably be way overpriced for something I can make myself with a normal card and a dremel.
Agreed. I think Asus has one that is passive and cheap enough.
#25
newtekie1
Semi-Retired Folder
_JP_: It doesn't. See the last picture (table).
I don't see where it says it doesn't support PhysX. AFAIK, any card that supports CUDA and has at least 32 shaders/CUDA cores will support PhysX. These cards only having 48 might mean PhysX isn't supported in future PhysX releases if nVidia decides to raise the requirements again, but as it stands right now these should work with PhysX just fine.
qubit: Having an x1 PCI-E interface is interesting and nichily useful. I'm just wondering if the x1 card can perform any worse than the standard x16 one.
On a card this low end, probably not noticeably.
qubit: Also, why put the passive heatsink on the x16 version, but an active, whiny one on the x1? It makes no sense to me; it should be the other way round, if anything.
The passive heatsink makes the card too large to be considered a low-profile card, and it will interfere with some slimline cases. Plus, I'd take a fan ensuring the card stays cool over a passive heatsink relying on the airflow in a tiny slimline case any day. :D