# Radeon and GeForce Share Work, PhysX Applications Win



## btarunr (Aug 28, 2008)

The functionality of CUDA and its implementation of GPU-accelerated PhysX processing has benefited many a GeForce user. Users of ATI accelerators, lacking this incentive, either use an Ageia PhysX card or avoid hardware physics altogether. Hardspell has now verified that in an environment where Radeon accelerator(s) handle graphics processing, a GeForce accelerator can be used standalone to process PhysX. Hardspell paired a Radeon HD 3850 with a GeForce 9600 GT on the same system, with the display connected to the Radeon. Although no form of multi-GPU graphics connection existed, the GeForce card partnered the Radeon well, processing physics while the Radeon did graphics. Results from oZone3D's FluidMark, a benchmark that includes routines to evaluate a machine's physics-processing capability, showed scores nearly quadrupling, proving the GeForce accelerator is doing its job.



 




This was further proven with game testing in Unreal Tournament 3. Provided are screenshots from the game along with those of the FluidMark windows. The first window shows a score of 759 o3Marks; in the second, where the GeForce processed PhysX, the score jumped to 2909 o3Marks.
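For scale, the jump from 759 to 2909 o3Marks quoted above works out like this:

```python
# FluidMark scores quoted above
cpu_score = 759     # PhysX running in software on the CPU
gpu_score = 2909    # PhysX offloaded to the GeForce 9600 GT

ratio = gpu_score / cpu_score        # ~3.83x: the new score is ~383% of the baseline
increase_pct = (ratio - 1) * 100     # ~283% increase over the software run

print(f"{ratio:.2f}x the baseline ({increase_pct:.0f}% increase)")
```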



 

 

 



*View at TechPowerUp Main Site*


----------



## warup89 (Aug 28, 2008)

Wow, an Nvidia card and an ATI card working together side-by-side, this is revolutionary


----------



## Deleted member 24505 (Aug 28, 2008)

Very nice, pity you can't use a Radeon and a GeForce card together in Vista.


----------



## alexp999 (Aug 28, 2008)

Well, this is a good use for Radeon owners. I expected the drivers to conflict with each other, or something rotten like that.
Still begs the question of whether or not it is needed. I have considered getting a cheap 8-series card for physics myself, but so few games use them that it only seems to be benchmarks that truly benefit.


----------



## Basard (Aug 28, 2008)

So, would a 9500 GT work too? Or do we still have to pay 150 bucks just for physics?


----------



## MrMilli (Aug 28, 2008)

Basard, even a 9400GT would work!


----------



## kenkickr (Aug 28, 2008)

So could I throw a 9400GT in with my 3870 CF setup to get some PhysX support?  I guess that would be cool!


----------



## Waldoinsc (Aug 28, 2008)

This is great news... but we need some more details: which motherboards are applicable, what drivers are needed, which video cards can be used (can older Radeons be used, or lower-end GeForce 9x00 series cards?), etc.

I've been waiting for this as I have mostly used ATI/AMD graphics, but pondered how to implement physics going forward.  Can't wait to see how this unfolds.


----------



## chron (Aug 28, 2008)

Now one has to ponder: on a dual PCI-E slot board, is it better to go Crossfire and take up that second slot with an equally priced video card as your main, or is it better to take a cheap Nvidia card and get PhysX capability....

hmm.


----------



## wolf2009 (Aug 28, 2008)

Hopefully someone will figure out a way to do this on Vista. Otherwise Vista is going to get bashed again.


----------



## Tatty_One (Aug 28, 2008)

This is gonna hurt Microsux... more people will just put off using Vista... there are an awful lot of gamers out there in the big wide world... but no one on a multi-GPU PhysX setup is going to be playing DX10 PhysX games once more of them are made.


----------



## Tatty_One (Aug 28, 2008)

chron said:


> Now one has to ponder - on a dual pci-e slot board, is it better to go crossfire and take up that second slot with an equally priced video card as your main, or is it better to take a cheap nvidia card and get physx capability....
> 
> hmm.



Just buy a 200 series card, a 2nd card does not add anything to Physx as the 200 series has CUDA 3.0.


----------



## Bytor (Aug 28, 2008)

One question.... If this only works in XP, can you still run a pair of ATI cards in Xfire and have a 3rd card, the Nvidia, running PhysX?

I think XP only supports 2 GPUs. That's the main reason I went to Vista, so I could run the 3 3870's together.


----------



## wolf2009 (Aug 28, 2008)

Bytor said:


> One question....  If this only works in XP can you still run a pair of ATI cards in xfire and have a 3rd card, the nvidia running PhysX?
> 
> I think XP only supports 2 GPU's.  Thats the main reason I went to Vista so I could run the 3 3870's together.



I think so too . 



\      /
  \   /
   \ /
   / \
  /   \ 
 /     \ 



We are at a crossroads now. To go with XP and PhysX, or Vista and Xfire?


----------



## Bytor (Aug 28, 2008)

I have a Asus P1 PhysX card in the closet, but not sure it will work in the same way...


----------



## btarunr (Aug 28, 2008)

Basard said:


> so, would a 9500gt work too? or do we have to pay 150 bucks still just for physics?



Any card from the 8400 GS upwards.


----------



## alexp999 (Aug 28, 2008)

wolf2009 said:


> I think so too .
> 
> 
> 
> ...



To me that's pretty simple: Vista and Xfire. Think of it this way:

How many games will benefit from Xfire vs. PhysX...?


----------



## wolf2009 (Aug 28, 2008)

alexp999 said:


> To me thats pretty simple. Vista and Xfire. Think of it this way:
> 
> How many games will benefit from Xfire, vs PhysX...?



In a year you may get your answer, if Nvidia is to be believed. They said 25 games before Christmas, 30 more in the first half of next year.

Thing is, PhysX adds an extra dimension to the game, while all Xfire does is give you better framerates.


----------



## Bytor (Aug 28, 2008)

alexp999 said:


> To me thats pretty simple. Vista and Xfire. Think of it this way:
> 
> How many games will benefit from Xfire, vs PhysX...?



Yes, it is a no-brainer, Xfire FTW. But what a neat idea....


----------



## alexp999 (Aug 28, 2008)

wolf2009 said:


> In a year you may get your answer if the Nvidia is to be believed. They said, 25 games before Christmas, 30 in first half next year .
> 
> Thing is PhysX adds extra dimension to the game, while all xfire does is give you good framerates .



PhysX is just a standard. Any game can do its own physics; it's just cheaper to use an engine and library that has already been written for you. I see more games using the Havok engine, which is hardware-independent.


----------



## Tatty_One (Aug 28, 2008)

wolf2009 said:


> I think so too .
> 
> 
> 
> ...



Dual boot FTW!!!


----------



## wolf2009 (Aug 28, 2008)

alexp999 said:


> PhysX is just a standard. Any game can do its own physics; it's just cheaper to use an engine and library that has already been written for you. I see more games using the Havok engine, which is hardware-independent.



Maybe, but I don't see software physics doing what PhysX can do, like making the cloth in GRAW shred into individual pieces, or letting you shoot individual planks out of a fence and take cover. It adds an extra dimension to the game.

Look at the videos here:

http://www.driverheaven.net/articles.php?articleid=122&pageid=5


----------



## newtekie1 (Aug 28, 2008)

I really hope they get this worked out in Vista.



Bytor said:


> One question....  If this only works in XP can you still run a pair of ATI cards in xfire and have a 3rd card, the nvidia running PhysX?
> 
> I think XP only supports 2 GPU's.  Thats the main reason I went to Vista so I could run the 3 3870's together.



Yes, you can still run a pair of ATi cards and a 3rd for PhysX in XP. CrossfireX is not supported in XP, however having more than 2 graphics cards is. ATi just doesn't want to support CrossfireX on XP for some strange reason, probably too much of a hassle to work out the drivers.

XP supports at least 3 graphics cards.


----------



## chron (Aug 28, 2008)

Tatty_One said:


> Just buy a 200 series card, a 2nd card does not add anything to Physx as the 200 series has CUDA 3.0.



You misunderstood me. Say someone has a 4850 and they are trying to decide whether to go Crossfire (buying a card of equal price to the one they have) or save some money and get a cheap Nvidia card. Would they see better performance with the cheaper Nvidia card, or with the equally priced 4850?

Some people are on a budget and can't "just buy a 200 series card" lol.


----------



## R_1 (Aug 28, 2008)

It is better just to buy a quad-core CPU and use its additional 2 cores for physics in games. That way you will have a more balanced PC.


----------



## DarkMatter (Aug 28, 2008)

chron said:


> You misunderstood me. Say someone has a 4850 and they are trying to decide to either go crossfire (buying a card of equal price as the one they have) or save some money and get a cheap nvidia card.  Would they see better performance with the cheaper nvidia card, or with the equally priced 4850?
> 
> Some people are on a budget and can't "just buy a 200 series card" lol.



It will depend on the game, obviously. If the game uses PhysX and you have hardware physics enabled, a cheap GeForce will give you a lot better performance. You probably wouldn't be able to enable hardware physics at all without a GeForce or an Ageia PPU, and there's the possibility that a few games wouldn't even work. So in reality it's either more frames with no enhanced gameplay, or fewer frames but outstanding physics. Choose what you prefer.



R_1 said:


> It is better just to buy a quad core CPU and to use it's additional 2 cores for physics in games. In this way you will have more balanced PC.



The whole point of this is to have far better physics than what an entire overclocked quad (all 4 cores) can handle. Even the 8400 GS probably has more number-crunching power than the fastest quad.


----------



## Bytor (Aug 28, 2008)

newtekie1 said:


> I really hope they get this worked out in Vista.
> 
> 
> 
> ...



Ah, I was thinking that after I posted. Only 2 of the cards connected (the ATIs via Crossfire), and the 3rd card (Nvidia) would be standalone to do PhysX... hhhmmmmm.

But does anyone know of a way to make an Ageia PhysX card work with ATI video cards in the games that support PhysX?
I tried running FluidMark with this setup but it would not work. (I had the PhysX card drivers installed.)


----------



## Tatty_One (Aug 28, 2008)

R_1 said:


> It is better just to buy a quad core CPU and to use it's additional 2 cores for physics in games. In this way you will have more balanced PC.



You won't/can't use the CPU's cores unless the game is programmed that way. Apart from that, a GPU is MUCH more efficient at PhysX than a CPU.


----------



## DarkMatter (Aug 28, 2008)

Tatty_One said:


> You wont/cant use cores of the CPU unless the game is programmed in that way, apart from that, *a GPU is MUCH more efficient at Px than a CPU*.



Well, TBH and to honor the truth, a CPU is "much" more efficient at physics (at almost everything except graphics, indeed), but it's simply overwhelmed by the raw power of GPUs.

But yeah, because of that, a GPU can do physics a lot faster or at a bigger scale = better.


----------



## Deleted member 24505 (Aug 28, 2008)

If it worked on Vista, would I be able to run 2x 4850's in Xfire and an Nvidia card in the third PCI-E x4 slot for PhysX?


----------



## alexp999 (Aug 28, 2008)

tigger69 said:


> If it worked on vista,would i be able to run 2x4850's in xfire and an nvidia card in the third pci-e x4 slot for physx?



Yep!

I don't get it, why doesn't it work in Vista anyway?


----------



## Deleted member 24505 (Aug 28, 2008)

I think you can't run Nvidia and ATI drivers at the same time on Vista.


----------



## wolf2009 (Aug 28, 2008)

alexp999 said:


> Yep!
> 
> I dont get it, why doesnt it work in Vista anyway?



It's the way Vista's driver implementation is designed. It is designed to prevent drivers from getting entangled with each other and causing BSODs, so they are not run in the Vista kernel (something like that). Thus it allows only one display driver.

Due to this driver implementation, you see fewer BSODs on Vista.


----------



## alexp999 (Aug 28, 2008)

wolf2009 said:


> Its the way Vista Driver implementation is designed . It is designed to prevent drivers from getting entangled with each other and causing BSOD's. So they are not run in Vista kernel ( Something like this ). Thus it allows only one display driver.
> 
> Due to this driver implementation, you see less BSOD's on vista.



And I suppose the Ageia PhysX card doesn't count as a display driver?

I guess the next thing, then, is to allow a PhysX-only driver install that lets the card be used solely for that purpose, so there is still only one _display_ driver.


----------



## wolf2009 (Aug 28, 2008)

alexp999 said:


> And I suppose the Ageia physx card doesnt count as a display driver?
> 
> i guess the next thing then, is to allow you to have a physx driver install, that allows the card only to be used for that purpose so their is still only one _display_ driver



Yup. To use an Nvidia card as a PhysX card, you have to install their control panel along with the display drivers, so it doesn't work.


----------



## DarkMatter (Aug 28, 2008)

I was not aware that you could only have one display driver in Vista. I have had 2 different cards many times, one for gaming and a professional one for working, and I had no problems with them. That's impossible in Vista, I guess. Too bad, M$...


----------



## Fitseries3 (Aug 28, 2008)

I don't see why this won't work on Vista. It's gotta be a simple fix that should come soon enough. Vista won't let you run 2 display drivers, correct, BUT the CUDA/PhysX driver is not really a display driver. There has got to be another reason why it's not working yet.

As for XP and multi-GPU support... I had my 2x 3870X2's working fine in XP. Even GPU-Z showed 4-GPU CFX enabled.

I think it's just a matter of time before this gets fixed for Vista.


----------



## suraswami (Aug 28, 2008)

btarunr said:


> Any card 8400 GS upwards.



How about using built-in mobo graphics like the GF8200 (my ECS 8200 board) with an ATI 4850 or 3870 card? Or even with my X800GTO?


----------



## Tatty_One (Aug 28, 2008)

suraswami said:


> how about using built-in mobo graphics like GF8200 (my ECS 8200 board) with a ATI 4850 or 3870 card? Or even with my X800GTO?



Should be OK yes.


----------



## EnglishLion (Aug 28, 2008)

So what's the chance of someone writing a 3rd-party driver for Nvidia GPUs that supports physics only and isn't a display driver, so that it's Vista-compatible?

I have three slots, and a cheap low-end Nvidia card in my 3rd slot would be nice!


----------



## Bytor (Aug 28, 2008)

I installed my Asus P1 PhysX card with my ATI 3870 in my Intel rig. I installed the card drivers, and when I tried running FluidMark it said I needed the 8.07.18 or better drivers, so I installed them. Now when I try to run FluidMark, the following message pops up.







The card is working fine. I can reset it and play the PhysX-based games that came with the card, but can't run FluidMark.

Anyone have any ideas?

Thanks


----------



## newtekie1 (Aug 28, 2008)

Try installing the 8.08.18 PhysX software. You might have to uninstall all previous Ageia and PhysX drivers first.



EnglishLion said:


> So what's the chance of someone writing 3rd party driver for nvidia gpu to support physics only that's not a display driver so that it's vista compatible.
> 
> I have three slots and a cheap low end nvidia card in my 3rd slot would be nice!



I assume that is what is going to have to happen.  NVidia is going to have to just release a set of drivers that makes Vista pick the card up as a PPU only, and not a GPU.


----------



## cdawall (Aug 28, 2008)

newtekie1 said:


> I assume that is what is going to have to happen.  NVidia is going to have to just release a set of drivers that makes Vista pick the card up as a PPU only, and not a GPU.




That's so not going to happen lol, that would be too nice of NV.


----------



## Tatty_One (Aug 28, 2008)

Bytor said:


> I installed my Asus P1 PhysX card with my ATI 3870 in my intel rig.  I installed the card drivers and when I tried running fluidmark it said I needed the 8.07.18 or better drivers, so I installed them.  Now when I try and run fluidmark the following message pops up.
> 
> 
> 
> ...



I think FluidMark is an Nvidia bench, thus it requires CUDA, and ATi ain't got that.


----------



## EnglishLion (Aug 28, 2008)

cdawall said:


> thats so not going to happen lol that would be to nice of NV



Certainly Nvidia won't be writing it; they want you to buy Nvidia only, obviously! But an independent programmer out there might. I know you can get alternative ATI display drivers (don't know if they work in Vista), so it might be possible.


----------



## Bytor (Aug 28, 2008)

Tatty_One said:


> I think Fluidmark is an NVidia bench, thus it requires CUDA and ATi aint got that.



In the first post in this thread, the bottom pics show they ran it using an ATI 3800 series card.


----------



## JrRacinFan (Aug 29, 2008)

http://forums.techpowerup.com/showthread.php?t=69658


----------



## Tatty_One (Aug 29, 2008)

Bytor said:


> In the first post in this thread the bottom pic's show they ran it using a ATI 3800 series card.



Lol, yes... with an Nvidia card, so it has CUDA!! You are talking about not using an Nvidia card, hence no CUDA. You can still have PhysX with the Ageia, but that does not give you CUDA, and therefore you have problems with this Nvidia bench... get my drift?


----------



## Bytor (Aug 29, 2008)

Good point tatty....Cheers...

Not worth spending money on Nvidia at this time..


----------



## newtekie1 (Aug 29, 2008)

Tatty_One said:


> I think Fluidmark is an NVidia bench, thus it requires CUDA and ATi aint got that.





Bytor said:


> In the first post in this thread the bottom pic's show they ran it using a ATI 3800 series card.





Tatty_One said:


> Lol yes....with an NVidia card so it has CUDA!!  You are not talking about using an NVidia card  hence no CUDA, you can still have Physx with the Ageia but that does not give you CUDA and therefore you have problems with this NVidia bench......get my drift?





Bytor said:


> Good point tatty....Cheers...
> 
> Not worth spending money on Nvidia at this time..



Go read Fluidmark's page here.

An Nvidia card is not required; you just need the CUDA DLL if you don't have an Nvidia card. The benchmark will run fine on an Ageia card, or with no PhysX-capable card at all. It is a PhysX benchmark, not an Nvidia benchmark, just like Vantage.



cdawall said:


> thats so not going to happen lol that would be to nice of NV





EnglishLion said:


> Certainly Nvidia won't be writing it, they want you to buy nvidia only obviously! But an independant programmer out there might.  I know you can get alternative ATI display drivers (don't know if they work in vista) so it might be possible.



NVidia wants the technology to catch on. The only way it will compete with Havok is if it can be used on virtually every platform, and nVidia knows that. Which is exactly why they are helping get PhysX running on ATi cards, something not even ATi is willing to help with.

Maybe that will be their solution: if it just runs on ATi cards, then you won't have to worry about getting nVidia cards working with ATi cards in Vista; you can have all ATi cards.


----------



## Bytor (Aug 29, 2008)

Thanks Newtekie1.   Works great now...


----------



## Mussels (Aug 29, 2008)

R_1 said:


> It is better just to buy a quad core CPU and to use it's additional 2 cores for physics in games. In this way you will have more balanced PC.



Or not. Please at least educate yourself before making blind statements.

I own two quad-core systems, and my FPS went from 15-20 in UT3 with PhysX on the CPU to over 70 with the video cards assisting. CPUs have nothing in terms of power compared to video cards.


----------



## eidairaman1 (Aug 29, 2008)

Ya, and Intel thinks x86 is going to be a good graphics engine (Larrabee); this just proves Intel further wrong.


----------



## Wile E (Aug 29, 2008)

What if, in vista, you install the entire nVidia driver package with the Physx app, then just uninstall the gfx drivers and leave Physx on there? Wonder if that would work?


----------



## kaneda (Aug 29, 2008)

CUDA is great and all, props to nVidia for getting it out there. But where the damned hell is OpenCL?


----------



## Tatty_One (Aug 29, 2008)

Thanks Newtekie. What is interesting is that with an ATI card it runs purely in software mode through the CPU, as opposed to hardware through the GPU. But I gather that's just for the bench; you can't actually run GRAW2 in software?

It will be interesting to see how the other cards do with the bench. I am just about to take over the bench's thread, so I'm looking forward to updating your scores!


----------



## renegade1990 (Aug 29, 2008)

*CrossSLI rules, just keep it up!*


----------



## eidairaman1 (Aug 29, 2008)

kaneda said:


> CUDA is great and all, props to nVidia getting it out there. but where the damned hell is opencl?



That's the Macintrash language; it won't see fruition until people start working with it in games, just like OpenGL in its day. BTW, translate that Cyrillic or don't use it at all.


----------



## MrMilli (Aug 29, 2008)

Tatty_One said:


> Thanks Newtekie, what is interesting is that with an ATI card it runs in purely software mode through the CPU as opposed to hardware thru the GPU, but I gather thats just for the bench, you cant actually run GRAW2 in software?
> 
> It will be interesting to see how the other cards do with the bench, I am just about to take over the bench's thread so am looking forward to updating your scores!



I don't know why people mix this stuff up.
PhysX is just an API. If you don't have PhysX hardware, it simply runs in software (i.e. on the CPU). If you have hardware (a PhysX card or a GeForce), it will use it.
The only thing some developers have done is create a couple of special levels with a higher level of physics and lock them so only people with PhysX hardware can play them. That doesn't mean they can't run in software mode; they would just be too slow. Most aren't even locked, actually.
Except for CellFactor, I don't know any other game that really requires PhysX hardware.
(You can even hack CellFactor to run without hardware!)
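The hardware/software selection described above can be sketched like this (hypothetical function names, not the actual PhysX SDK, which is a C++ API; this just illustrates the fallback behaviour):

```python
def detect_physx_accelerator():
    """Hypothetical probe; the real SDK looks for an Ageia PPU or a CUDA-capable GeForce."""
    return None  # pretend no accelerator was found


def create_scene():
    """Same API either way; only the backend that runs the simulation differs."""
    accel = detect_physx_accelerator()
    if accel is not None:
        # Offload rigid bodies, cloth and fluids to the PPU/GPU.
        return "hardware"
    # Identical simulation on the CPU, just slower.
    return "software"


print(create_scene())
```

The point is that a game built on the API keeps working without an accelerator; the locked "hardware-only" levels are a publisher choice, not an API requirement.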


----------



## chron (Aug 29, 2008)

Hey what kind of cooling are they using? It looks like they have the same vga cooler on both cards...


----------



## wolf2009 (Aug 29, 2008)

Wile E said:


> What if, in vista, you install the entire nVidia driver package with the Physx app, then just uninstall the gfx drivers and leave Physx on there? Wonder if that would work?



That would uninstall the Nvidia control panel and CUDA files, which are needed to select the PhysX settings.


----------



## newtekie1 (Aug 29, 2008)

Wile E said:


> What if, in vista, you install the entire nVidia driver package with the Physx app, then just uninstall the gfx drivers and leave Physx on there? Wonder if that would work?



Well, in Vista, the nVidia graphics driver isn't activated when your primary card is an ATi, so uninstalling it wouldn't really do anything. The graphics driver has to be installed and activated for the card to be recognized by the PhysX drivers and used as a PhysX card.

That is what leads me to believe that there are really only 2 workarounds for nVidia at this point.

1.) Develop a driver for their graphics cards that isn't a graphics driver, so the card appears to the OS as just a PhysX card and not a Graphics Adapter.
2.) Get PhysX working on ATi hardware, so that people running all-ATi setups can use PhysX.


----------



## Darkrealms (Aug 29, 2008)

Great for sales of high end ATI and low end Nvidia cards.

Let the frankensteining begin ; P


----------



## Wshlist (Aug 29, 2008)

So why so late, why only Hardspell, and why relatively little info on the drivers and motherboard/chipset used, and such?
Why doesn't TechPowerUp do this test and confirm it?


----------



## btarunr (Aug 29, 2008)

Wshlist said:


> So why so late and why only hardspell and why relatively little info on drivers and motherboard/chipset used and such?
> Why doesn't techpowerup do this test and confirm it?



It's irrelevant which motherboard you use. Two PCI-E slots are all you'd need.


----------



## Wshlist (Aug 29, 2008)

btarunr said:


> It's irrelevant which motherboard you use. 2 PCI-E slots is all you'd need.



That's what we think, but is it? It would be what you expect, but personally I'd also expect SLI/Crossfire to need no special SLI or Crossfire board, just 2 x16 PCIe slots, and yet...


----------



## chron (Aug 29, 2008)

Wshlist said:


> That's what we think, but is it? It would be what you expect but personally I'd also expect to not need any SLI or crossfire board for that sli/crossfire to work but just 2 16x PCIe slots, and yet..



Well, you're talking about two video cards working together. What they're talking about here is the equivalent of throwing in a PCI PhysX card: you're simply adding a processor that takes care of the PhysX code. You could run it at x16/x4 on older boards also. The PCI-E x16 slots don't always need to be used for video.


----------



## Wshlist (Aug 29, 2008)

chron said:


> Well you're talking about two video cards working together.  Currently what they're talking about here is the equivalent of throwing in a PCI physx card.  You're simply adding a processor that takes care of the physx code. You could run it at 16x/4x on older boards also. The PCI-e x16 slots dont always need to be used for video.



I see the logic, of course; there's no flaw in it, except that it IS a graphics card that reports to the OS/BIOS as a graphics card, and for now it even requires graphics card drivers. So although it makes sense and should work in practice, it might have issues on certain motherboards or chipsets, I fear. Only testing can tell whether it works on any chipset/mobo as you'd expect.


----------



## Mussels (Aug 30, 2008)

Wshlist: we already know it's the same as using the Ageia PCI PPU card. SLI and Crossfire boards are not required.

The only problems are getting both drivers to work at the same time.


----------



## Jambul_Er (Aug 30, 2008)

I put in an HD 4850 (Catalyst 8.8) and a 9600 GSO (ForceWare 177.92 + 08.08.18 PhysX). PhysX does not see the 9600 GSO.
What to do?


----------



## Wshlist (Aug 30, 2008)

If 'we already know', then why is this news all so vague, and why has only one site tried it? If it's the driver, I'm sure others can experiment too. In fact, I have run a PhysX demo on my Radeon (with no Nvidia card to back it up) which required cuda.dll. So if it works, the trick is installing whatever additional DLLs access the Nvidia card; surely you could trace which ones PhysX uses? In fact, don't they have a separate installer for CUDA PhysX? In that case there would be even less searching for the files needed. Or perhaps a phone call to Nvidia might do it; they like promoting CUDA, so I imagine they'd certainly not mind the idea of people adding Nvidia cards to Radeon-powered systems.

Either way, I think TechPowerUp should try to write a more detailed report; I'm sure they can lay their hands on a Radeon and an Nvidia card for a moment.


----------



## Wshlist (Aug 30, 2008)

Jambul_Er said:


> I put HD4850 (catalyst 8.8) and 9600gso (force ware 177.92 +08.08.18 physX). PhysX does not see 9600GSO.
> What to do?


Thanks for trying, Jambul. Did you install CUDA, or the cuda.dll? In other words, did you do some experimentation? If so, keep us informed please.


----------



## btarunr (Aug 30, 2008)

Jambul_Er said:


> I put HD4850 (catalyst 8.8) and 9600gso (force ware 177.92 +08.08.18 physX). PhysX does not see 9600GSO.
> What to do?



Not yet possible on Windows Vista.


----------



## Jambul_Er (Aug 30, 2008)

Thanks for your reply. I am from Russia and do not know English; I am translating through Google.


----------



## wolf2009 (Aug 30, 2008)

the img is a hidden link to another site


----------



## Jambul_Er (Aug 30, 2008)

btarunr said:


> Not yet possible on Windows Vista.


I have Windows XP SP3


----------



## Jambul_Er (Aug 30, 2008)

wolf2009 said:


> the img is a hidden link to another site



This is our Russian image-hosting site...


----------



## btarunr (Aug 30, 2008)

Jambul_Er said:


> I have Windows XP SP3



Shut down, connect the monitor to the 9600 GSO, and start up; let the OS detect the display and configure it as a second display head. Then shut down, connect the monitor back to the HD 4850, and start up again. The OS is thus fooled into thinking display heads are configured for both adapters.


----------



## Jambul_Er (Aug 30, 2008)

Wshlist said:


> Thanks for trying jambul, did you install cuda? or the cuda.dll? in other words did you do some experimentation? If so keep us informed please




I already did everything that's been suggested... and even danced with a tambourine - nothing helps.


----------



## SPAWN (Aug 30, 2008)

Does anyone know when the Radeon 4870 will be getting drivers like these?)))


----------



## Wshlist (Aug 30, 2008)

Jambul_Er said:


> I already did everything that's been suggested... and even danced with a tambourine - nothing helps.



Oh well :/
Someone will discover how it works eventually.


----------



## Hayder_Master (Aug 31, 2008)

Everyone here thinks this is a good thing for ATI; I think not. Maybe good for users, but not for ATI, because it means every PC must have an Nvidia card in it one way or another, whether primary or for physics. ATI must develop software to solve this big problem.


----------



## eidairaman1 (Aug 31, 2008)

Hence the Havok engine. Havok has been around a lot longer, so it's further ahead in the physics department.


----------



## Wshlist (Aug 31, 2008)

eidairaman1 said:


> hence Havoc Engine, Havoc Engine has been around alot longer so its further ahead in the Physics Dept.



Havok's original engine only used the CPU for physics; only much later, after AGEIA came around, did they start on an engine that used the GPU. However, that was a separate licence that game developers had to opt for (and pay for), so adoption wasn't very big, I think. It's hard to say, because when people/companies say 'Havok physics' nobody knows whether it's the old CPU licence/SDK, like HL2 uses for instance, or the (relatively) newer GPU one. Plus, I think their GPU one was partly non-interactive, mostly just visual, wasn't it? I'm not sure about the details.

But either way, if a game uses the engine developed by AGEIA, PhysX, like the Unreal 3 engine does, then it doesn't matter if your card has great Havok support, since it's PhysX that the game requires. And right now I bet lots of developers are opting for PhysX, since it suddenly has a lot of people who can use it. Unless, of course, they are smart like the Crysis makers and just make their own physics engine and bypass all the hassle. Although doing that on the GPU might be harder to develop than you think *shrug*


----------



## Fitseries3 (Aug 31, 2008)

quick question.....

I have a board with 3 full-length PCIe slots, a 4870X2 and a 4870 in CFX, and I am thinking about getting a 9800GT for PhysX. The reason for the 9800GT is that it's the fastest single-slot card I can think of, and it will fit perfectly between my 2 ATI beasts.

Is this a good move for me, or should I do something else with my time/money?


----------



## btarunr (Aug 31, 2008)

fitseries3 said:


> quick question.....
> 
> i have a board with 3 full length pcie slots, 4870x2 and 4870 in CFX and i am thinking about getting a 9800gt for physx. the reason for the 9800gt is because it's the fastest single slot card i can think of and it will work perfectly in between my 2 ATI beasts.
> 
> is this a good move for me or should i do something else with my time/money?



The performance of PhysX isn't all that proportional to GPU computational power beyond maybe an 8800 GS 384MB. IIRC the third long slot is PCI-E x4, isn't it?


----------



## Fitseries3 (Aug 31, 2008)

4x when 3 cards are in yes. 8x when only 2 are used.

i can get a 9800gt for like $40 so it's not a price thing... just availability.


----------



## btarunr (Aug 31, 2008)

fitseries3 said:


> 4x when 3 cards are in yes. 8x when only 2 are used.
> 
> i can get a 9800gt for like $40 so it's not a price thing... just availability.



It also becomes a heat and power-draw thing


----------



## DarkMatter (Aug 31, 2008)

btarunr said:


> It also becomes a heat and power-draw thing



With 3 R770 on his system, I don't think the power/heat of a 9800GT is an issue for him.


----------



## btarunr (Aug 31, 2008)

DarkMatter said:


> With 3 R770 on his system, I don't think the power/heat of a 9800GT is an issue for him.



Point is, those RV770s crunch graphics, but I don't think choosing a 9800 GT over a 8800 GS would translate to anything better than higher power draw than what it already is.


----------



## DarkMatter (Aug 31, 2008)

btarunr said:


> Point is, those RV770s crunch graphics, but I don't think choosing a 9800 GT over a 8800 GS would translate to anything better than higher power draw than what it already is.



I agree, but the extra power required to go from the GS to the GT is NOTHING compared to the draw he already has. Sure, it's pointless if he gets the same performance, but we don't really know how much power will be required in the next 6 months = 50+ new titles.

And then there's the $40 argument. If he can get a 9800GT for that money I would never tell him to get something slower. He might even have to pay more for a GS!!


----------



## btarunr (Aug 31, 2008)

I don't think those 50+ titles have PhysX content that would make a dedicated PhysX unit such as the 8800 GS sweat. 96 NVIDIA SPs is still a huge amount of rated shader compute power, and the 192-bit memory bus doesn't matter: the card isn't transferring large chunks of data (such as textures), it's just crunching lots of math in real time. Look at it this way: if a PhysX title does have a physics load that makes a dedicated 8800 GS sweat, a single-GTX 280 machine is in for a significant graphics performance hit.

If he isn't getting an 8800 GS for less than $40, the 9800 GT is cool.


----------



## Fitseries3 (Aug 31, 2008)

btarunr said:


> Point is, those RV770s crunch graphics, but I don't think choosing a 9800 GT over a 8800 GS would translate to anything better than higher power draw than what it already is.



i can get a 9800gt new in box for $40 but i'd have to pay retail for a 8800.

heat.... fuck heat... who cares anyway? it's for benching and i have some 130cfm fans i use to cool the vid cards anyway.


----------



## DarkMatter (Aug 31, 2008)

btarunr said:


> I don't think those 50+ titles have PhysX content that would make a dedicated PhysX unit such as the 8800 GS sweat. 96 NVIDIA SPs is still a huge amount of rated shader compute power, and the 192-bit memory bus doesn't matter: the card isn't transferring large chunks of data (such as textures), it's just crunching lots of math in real time. Look at it this way: if a PhysX title does have a physics load that makes a dedicated 8800 GS sweat, a single-GTX 280 machine is in for a significant graphics performance hit.
> 
> If he isn't getting an 8800 GS for less than $40, the 9800 GT is cool.



I think we both know each other's points; we've just approached the thing in different ways. This is how I see it, in order of importance:

- Price: IMHO, a 9800GT for $40 is a must-have. Period. 

- Power consumption: the difference between the two cards is 10W. The X2 alone consumes 300W. Add 150W for the HD4870 and 120W for the rest of the system, as well as 75W for the baseline card (GS), and we are talking about 645W under full load. 645 or 655, who cares?

- Performance: probably in the next year a GT will not get you better performance than the GS, maybe not even in 2 years. Who cares? You won't need to change the card. IMO you can't apply the same criteria as with graphics, where a slightly underpowered card makes more sense because you will need to upgrade it soon anyway. And taking the price into account, are you really going to risk future performance, or the chance of needing to upgrade a lot sooner, just to get a cheaper or slightly less power-hungry card?
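The wattage estimate in the post above is simple addition; as a quick sanity-check sketch (all figures are the estimates quoted in this post, not measurements):

```python
# Full-load power budget using the estimates quoted above (not measurements).
components = {
    "HD 4870 X2": 300,        # W, estimated
    "HD 4870": 150,           # W, estimated
    "rest of system": 120,    # W, estimated
}

def total_draw(physx_card_watts):
    """Estimated total load with a given dedicated PhysX card added."""
    return sum(components.values()) + physx_card_watts

print(total_draw(75))   # 8800 GS baseline: 645
print(total_draw(85))   # 9800 GT, ~10 W more: 655
```

Either way the PhysX card is roughly 1.5% of the total, which is the point being made.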


----------



## Fitseries3 (Aug 31, 2008)

im also looking at resale value here. i can get the 9800gt, use it for a few months and still get more outta it than i paid.

not trying to dog on you, btarunr


----------



## btarunr (Aug 31, 2008)

fitseries3 said:


> i can get a 9800gt new in box for $40 but i'd have to pay retail for a 8800.
> 
> heat.... fuck heat... who cares anyway? it's for benching and i have some 130cfm fans i use to cool the vid cards anyway.



Then price is the only issue, go for it.


----------



## btarunr (Aug 31, 2008)

DarkMatter said:


> - Performance: probably in the next year a GT will not get you better performance than the GS, maybe not even in 2 years? Who cares? You won't need to change the card. IMO you can't apply the same criteria as with graphics, where a little underpowered card makes more sense because you will need to upgrade it soon anyway. And if we take into account the price, are you really going to risk the future performance, or the possibility that you will need to upgrade the card a lot sooner in order to get a cheaper or a bit less power hungry card?



2 years from now, you think the industry will let you use a 9800 GT for PhysX? You'll be hit by the standards syndrome: they'll come up with "The latest PhysX engine requires a CUDA <insert advanced version here>-capable graphics card", and naturally all existing hardware will become 'obsolete'. Anyway, that's Fit we're talking about. His hardware changes like the weather


----------



## Fitseries3 (Aug 31, 2008)

what i really want to know is if it's worth the $40, or should i get something that has nothing to do with gfx/physx.


----------



## btarunr (Aug 31, 2008)

fitseries3 said:


> what i really want to know is if it's worth the $40, or should i get something that has nothing to do with gfx/physx.



Something that can put those cards to use: a game.    ...if not, then for $40 nothing beats the 9800 GT.


----------



## Fitseries3 (Aug 31, 2008)

i dont game though.


----------



## DarkMatter (Aug 31, 2008)

fitseries3 said:


> i dont game though.



Then IMO that money would be better spent buying your friends some drinks or something. Or maybe send it to the DarkMatter; he will surely put it to good use...


----------



## wolf2009 (Aug 31, 2008)

fitseries3 said:


> i can get a 9800gt new in box for $40 but i'd have to pay retail for a 8800.



Where can you get it for that money?


----------



## OnBoard (Aug 31, 2008)

fitseries3: how about a 9600GT? (if you can get 9-series cards cheap). They used one here: http://www.guru3d.com/article/physx-by-nvidia-review/

If I could get a 9800GT for $40 I'd swap out my 8800GT, even though they are the same  (well, with luck a 55nm version).

Fun to break stuff in Warmonger; don't know if there's any other use at the moment. Runs surprisingly well with just my single card.

edit: oh, and the 9800GT can always be downclocked for less power and heat; there's still plenty of power.


----------



## wolf2009 (Aug 31, 2008)

OnBoard said:


> fitseries3: how about a 9600GT? (if you can get 9000 series cheap). They used it in here: http://www.guru3d.com/article/physx-by-nvidia-review/
> 
> If I could get a 9800GT for $40 I'd swap my 8800GT, even though they are the same  (well with luck a 55nm version).



its not 55nm .


----------



## Fitseries3 (Aug 31, 2008)

9800gt = 8800gt?


----------



## OnBoard (Aug 31, 2008)

fitseries3 said:


> 9800gt = 8800gt?



Yep, just a rename with HybridPower support added. Most 9800GT's are even slower than my 8800GT at stock.. 



wolf2009 said:


> its not 55nm .


Not the 9600GT, but some 9800GT's are, and probably soon all of them will be, like the 9800GTX+.


----------



## Fitseries3 (Aug 31, 2008)

so 9800gt is a bad choice? 9600gt better?


----------



## wolf2009 (Aug 31, 2008)

OnBoard said:


> Yep, just renamed and hybrid power support added. Most 9800GT's are even slower than my 8800GT stock..
> 
> 
> Not the 9600GT, but some 9800GT's are and probably will soon be all, like the 9800GTX+.



yup , but not right now .


----------



## wolf2009 (Aug 31, 2008)

fitseries3 said:


> so 9800gt is a bad choice? 9600gt better?



The 9800GT is the better choice, but where the hell would you find one for $40?


----------



## OnBoard (Aug 31, 2008)

Not a bad choice, just that the single-slot cooler runs near 100°C under load (well, probably not in PhysX-only use). A 9600GT would run a bit cooler and draw less power, and still be fast enough for all the physics currently out there.

edit: I'll take that back. It's a MASSIVE 9W load difference between the 9800GT and the 9600GT  So yeah, the 9800GT is just fine.
http://www.techpowerup.com/reviews/Zotac/GeForce_9800_GT_Amp_Edition/24.html


----------



## Fitseries3 (Aug 31, 2008)

i cant sell them for $40 but i can get you a good deal on one.


----------



## wolf2009 (Aug 31, 2008)

fitseries3 said:


> i cant sell them for $40 but i can get you a good deal on one.



YHPM


----------



## DarkMatter (Aug 31, 2008)

OnBoard said:


> Not a bad choice, just that the single slot cooling goes for the near 100C load numbers (well probably not in just PhysX use). 9600GT would run a bit cooler and draw less power and still be fast enough for every physics currently out there.
> 
> edit: I'll take that back. It's a MASSIVE 9W load difference with 9800GT and 9600GT  So yeah, the 9800GT is just fine.
> http://www.techpowerup.com/reviews/Zotac/GeForce_9800_GT_Amp_Edition/24.html



I've been talking about the small differences all the time. 

Anyway the 8800GS/9600GSO would be a lot better physx card.


----------



## JRMBelgium (Sep 1, 2008)

Exactly what Nvidia wanted...and you guys fall for it...


----------



## DarkMatter (Sep 1, 2008)

Jelle Mees said:


> Exactly what Nvidia wanted...and you guys fall for it...



I don't get it. We fall for what?


----------



## wolf2009 (Sep 1, 2008)

DarkMatter said:


> I don't get it. We fall for what?



Nvidia wants to win back discrete-card market share, and with the AGEIA acquisition they can attack ATI on two fronts: graphics and PhysX. Nobody knows how popular it's going to get, but everyone wants to buy an Nvidia card to run those few games.


----------



## insider (Sep 1, 2008)

They won't; software-based physics engines are still by far the most popular. Intel bought the company that is the industry leader in game physics engines, widely used in games, and AMD/ATI is following the same approach since they are a CPU/GPU company as well...

PhysX is pretty much a gimmick now, like it has always been (probably dying a slow death), when both Intel and AMD/ATI are using their multi-core CPUs to assist with in-game physics. 

The processing power involved isn't that demanding at all; using a second graphics card solely for PhysX is a complete waste of time when any multi-core CPU could handle it without a single hint of slowdown.


----------



## DarkMatter (Sep 1, 2008)

insider said:


> They won't, software based physics engine is still by far the most popular, Intel bought the company that is the industry leader in gaming physics engine widely used in games, AMD/ATI is also following this approach since they are a CPU/GPU company as well...
> 
> PhysX is pretty much a gimmick now like it has always been (probably dying a slow death) when both Intel and AMD/ATI are using their multi core CPU's to assist on in-game physics.
> 
> The processing power involved isn't that demanding at all, using a 2nd graphics card solely for PhysX is a complete waste of time when any multi-core CPU could handle it without a single hint of slowdown at all.



Again, :shadedshu we are obviously not talking about the same physics. The CPU can't, and never will be able to, handle the kind of physics that GPUs or the PPU can. The demos and games already released show how good PhysX can be; it is not a gimmick at all. And those demos and games run easily on my card even when I deliberately downclocked the shaders to embarrassing clocks to test how much of the card was actually being used. The verdict is that my GT can handle a lot, lot more physics, and the future will just be so much better IMO. ATI very intelligently said they COULD end up using Ageia if it is successful, because they knew it's a good thing. They are just hoping game developers don't adopt it to the point that GPU acceleration is required.



wolf2009 said:


> Nvidia wants to win back discrete card market share , and with AGEIA acquisition they can attack ATI on two fronts . Graphics and this PhysX . Nobody knows how popular is it going to get, but everyone wants to buy a Nvidia card to run those few games .



That's not falling for anything. If buying Nvidia cards is the only way to have hardware-accelerated physics, so be it. They are not fooling anyone. Besides, PhysX on ATI cards is possible; it's ATI's decision not to adopt it. They could make their own if they don't like it, and try to get developers to use it. As of now Nvidia is the only one innovating on this front, and whoever wants more than mediocre physics will have to use their hardware. It's simple.

EDIT: BTW, you knew that Ageia approached AMD/ATI long before Nvidia, didn't you? ATI had the chance and didn't take that train.


----------



## eidairaman1 (Sep 1, 2008)

DarkMatter said:


> That's not fall for anything. If buying Nvidia cards is the only way to have hardware accelerated physics, so be it. They are not fooling anyone. Besides PhysX on Ati cards is possible but is Ati's decision to not adopt it. They could make their own if they don't like it, and try making developers use it. As of now Nvidia is the only one innovating in this front and whoever wants more than mediocre physics will have to use their hardware. It's simple.


To another point: Havok is the alternative to PhysX. TBH, I'd rather not buy hardware that supports a function such as PhysX until there is actual, tangible software out there that utilises it. Not just a handful of titles, but a majority; by the time the actual stuff comes to fruition it will be time to upgrade again, which makes buying that piece of hardware a waste of time. Sorry, but future-proofing is not in my vocabulary.


----------



## Wshlist (Sep 1, 2008)

ATI actually announced that they won't go for their own or Nvidia's, but will focus on the DirectX 11 (which reportedly has physics) and OpenCL (note the C, not G, in there) variants, and perhaps Nvidia will have to follow suit; it's hard to argue with DirectX really, even when Nvidia tries from time to time.

It's a bit strange how ATI still has to pick a path and doesn't seem to have the resources to follow two, even after being bought by AMD, a company that tossed billions of dollars around like it's going out of style.. well actually, the dollar is going out of style I guess


----------



## DarkMatter (Sep 1, 2008)

eidairaman1 said:


> That's not fall for anything. If buying Nvidia cards is the only way to have hardware accelerated physics, so be it. They are not fooling anyone. Besides PhysX on Ati cards is possible but is Ati's decision to not adopt it. They could make their own if they don't like it, and try making developers use it. As of now Nvidia is the only one innovating in this front and whoever wants more than mediocre physics will have to use their hardware. It's simple.
> 
> 
> To another Point, Havoc is the alternative to Physx. TBH, Id rather not buy hardware that supports a Function such as Physx until there is actuall Tangible Software out there that utilizes it, not just a handful but when majority have that technique, by the time Actuall Stuff comes to fruitition it will be time to upgrade again= which makes buying that piece of hardware a waste of time, im sorry future proofing is not in my vocabulary.



Future proofing? I hope at least 2 of the 50+ PhysX titles due in the coming months will be worth a try. Those with ATI hardware can do three things:

- Buy an Nvidia card and enjoy the extra physics.

- Buy nothing and argue about something they don't want, and don't understand, primarily because they can't have it.

- Enjoy the game as best they can, instead of acting childish and crying for something they can't have.

PhysX is just a FREE added value for those who bought the hardware. If you don't have it and don't want to benefit from great physics, then carry on as you are today. But why argue about something you say you don't care about?

I remember so many people saying the same about the first graphics cards; it's actually quite funny... I'm not saying PhysX will be as successful, but it has the potential.

Besides, Havok is the alternative, but not the competition. It can't compete with Ageia because of where the physics is computed. LOL Havok: UT games, Half-Life 2, Oblivion... CRAP, CRAP, CRAP. The physics, I mean. It took Crytek, who are supposedly not physics experts, to come along and make a better physics engine on their own.


----------



## DarkMatter (Sep 1, 2008)

Wshlist said:


> ATI actually announced that they won't go for their own or nvidia's but will focus on the directX11 (which reportingly has physics) and OpenCL (note the C not G in there) variants, and perhaps nvidia will have to follow suit, it's hard to argue with DirectX really, even when nvidia tries from time to time.
> 
> It's a bit trange how ATI still has to decide paths and don't seem to have resources to follow 2 paths, even after AMD bought them, a company that tossed billions of dollars around like it's going out of style.. well actually, the dollar is going out of style I guess



I'm not so sure about GPU physics in DX11. GPU compute is going to be there AFAIK, but from there to a good physics API there's a long way. DX11 will launch a lot later, and taking into account the adoption rate of DX versions, we wouldn't have hardware-accelerated physics until 2011. No thanks; give me the physics now, and I myself will decide if I want it or not.

EDIT: after some searching I have found that ATI said that before Ageia's acquisition. And TBH there is a difference: Ageia had a hardware base of not more than 100,000 PPUs, while Nvidia has over 50 million capable cards and 55-60% market share. You can't compare; it does pay off to implement good physics in your game if you know it could mean an advantage over other games when you have that many potential buyers.


----------



## eidairaman1 (Sep 1, 2008)

Look what happened to Ageia: it never took off and Nvidia bought them up. They've got to be careful it doesn't meet the same fate, but I guess Nvidia would drop it like a whore if it flops, before they go under.


DarkMatter said:


> Future proofing? I hope that at least 2 of the 50 PhysX titles that are going to be released n the coming months will be worth a try. Those with Ati hardware can do 3 things:
> 
> - Buy a Nvidia card and enjoy the extra physics.
> 
> ...


----------



## DarkMatter (Sep 1, 2008)

eidairaman1 said:


> Look what happened Ageia, it never took off, Nvidia bought them up, they gotta be careful they dont have the same fate, but i guess Nvidia would drop it like a whore if it flops, before they go under.



Man, understand this:

less than 100,000 PPUs = a big flop, as no game developer will bother to write separate code when the hardware base is only 10% of the number of copies they hope to sell. Yet Ageia managed to convince some developers!! That speaks volumes about the quality of the feature!

more than 50 million capable GPUs (and potentially a lot more to come) = a developer only needs to sell the game to 2% of the installed base to have the most successful PC game of recent years. IF the game really stands out, IMO they can sell to a lot more than that 2% of Nvidia owners EASILY.
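The break-even arithmetic behind this comparison is trivial to sketch (the installed-base figures are the ones claimed in the post, not verified market data):

```python
# Installed-base figures as claimed in the post above, not verified market data.
AGEIA_PPUS = 100_000          # dedicated PPUs sold (claimed)
NVIDIA_GPUS = 50_000_000      # CUDA-capable GeForce cards (claimed)

def copies_sold(share, installed_base):
    """Game copies sold if `share` of the installed base buys the game."""
    return round(share * installed_base)

print(copies_sold(0.02, AGEIA_PPUS))    # 2,000: too small to justify a code path
print(copies_sold(0.02, NVIDIA_GPUS))   # 1,000,000: a hit by PC standards
```

The same 2% attach rate lands three orders of magnitude apart, which is the whole argument.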


----------



## Wshlist (Sep 1, 2008)

DarkMatter said:


> I'm not so sure about GPU physics on DX11. GPU compute is going to be there AFAIK, but from there to a good physics API there's a long way. DX11 will launch a lot later and taking into account the adoption rate of DX's, we wouldn't have hardware accelerated physics until 2011. No thanks, give the physics now, and I myself will decide if I want it or not.
> 
> EDIT: after some searching I have found that Ati said that before Ageia's acquisition. And TBH there is a difference. Ageia had a hardware base of not more than 100,000 PPUs, Nvidia has over 50 million capable cards and 55-60% market share. You can't compare, it does pay off to implement good physics in your game if you know it could mean an advantage over other games when you have so much potential buyers.



They actually said it in early August this year:

"In his speech at the GPG CTO Technology Day held in Iceland's capital, Raja Koduri, CTO of AMD GPG (ex-ATI), announced that AMD believes that the time for proprietary software solutions such as AMD's own Close-to-Metal and Nvidia's CUDA has passed.

As a result, AMD will throw its efforts behind DirectX 11 Computational Shaders and the OpenCL GPGPU language and will focus on standardized solutions only."

Of course that doesn't mention physics, and it's not quite clear whether MS will do anything in that direction, or offer encouragement or support for physics on the GPU via DX11. On the other hand, ATI could use the GPU-compute part of DX11 and write a physics plugin on top of it, theoretically, as could Nvidia; or Nvidia could extend CUDA to work on top of DX11, so they could smoothly move their PhysX API to DX11 without exposing their users to big changes in that area.
And then there's the DX11 platform: it'll be Vista/Windows 7, not XP, which most people still prefer and have.
So I guess you are right; PhysX seems to be the winner, and we'll have to get Nvidia cards as assists, or someone should make a PhysX driver for ATI.

Personally I lean towards getting an Nvidia card as a secondary right now, as soon as I know that's practical, because some dubious pictures on Hardspell aren't quite the same as knowing it works, and works smoothly, to mix cards; for all I know it's fraught with all kinds of issues and/or requires hacks and instability I'm not willing to accept.


----------



## eidairaman1 (Sep 1, 2008)

about windows 7: i will wait a few months for reviews, and if they are excellent i'll switch, but this machine here will stay XP only.


----------



## insider (Sep 1, 2008)

DarkMatter said:


> Again, :shadedshu we are obviously not talking about the same physics. CPU can't and never will be able to handle the kind of physics that GPUs or PPU can.



Yes we are; it's called Havok. It's been around for a long time, and Intel bought it up to further optimise it for their CPU/GPU platforms. There is nothing PhysX can do that can't be done by the latest software-based Havok physics engine on a fast quad.  It will soon be optimised for the hybrid CPU/GPU platforms.

If a dedicated PPU were a must, both Intel and AMD would have taken that route long ago; Nvidia doesn't have a CPU/GPU platform to run on.

The initial hybrid CPU/GPU chips will have a relatively weak GPU, but the integrated GPU can be used to process in-game physics only, while the dedicated graphics card handles everything else.


----------



## DarkMatter (Sep 1, 2008)

insider said:


> Yes we are, it's called Havoc, it's been around for a long time which Intel bought up for further optimising for their CPU/GPU platforms, there is nothing PhysX can do that can't be done in the latest software based Havoc physics engine with a fast Quad in software.  It will soon be optimised for the hybrid CPU/GPU platforms.
> 
> If a dedicated PPU was a must then both Intel and AMD would have taken that route long ago, nVidia doesn't have a CPU/GPU platform to run on.
> 
> The initial hybrid CPU/GPU chips will have a relatively weak GPU, but the integrated GPU can be used to process in-game physics only, the dedicated graphics card handles everything else.



No, we aren't. I don't think you've seen or understood what PhysX is about. The Havok API may be as good as or even better than PhysX; I really don't know and won't argue about that. BUT it only runs on the CPU, and CPUs don't have the kind of power that GPUs have now; in the near future it will only get worse (for the CPU). An 8400 GS already has compute power comparable to a quad-core. Considering that you need the CPU for other tasks (AI, driving the renderer, sound, input...), a dedicated 8400 GS is well ahead. A heavily overclocked quad could match the 8400 GS in performance, but couldn't touch the 8600 GT/9500 GT. Very few people have quads yet; motherboards with 2 or 3 PCI Express slots, on the other hand, are very common. Tell me, which is easier and cheaper for people who want much better physics than we have today: throw away their C2D, buy a quad ($200) and overclock the hell out of it, or buy an 8600/9500 GT for $50 and install some drivers?

As months pass it will get even better for the GPU solution. Nehalem is only about 30% faster than Core 2 clock for clock, so it won't improve performance by much, and how much higher do you think clocks will go? On the other hand, we know we'll have the 9600GT/GSO and even the 9800GT or comparable cards in the $50-$100 segment really soon. 1.3x a current quad for $200+, or 2x-3x a current low-end card? 

EDIT: Intel didn't buy Havok to optimise it for their CPUs. It's the same Havok team doing the work, and they won't be able to optimise much further than they already have. Intel bought Havok to implement it on Larrabee or derived embedded solutions. So that's again physics on a GPU, but Intel's won't come until 2010. Ageia is now.


----------



## insider (Sep 1, 2008)

My old PCI PhysX card was just that: an £80 piece of gimmickry that did very little. 

It was dead before Nvidia bought up the dying Ageia company, and you are forgetting AMD's Fusion is just around the corner, costing little more to integrate into their CPU cores.

The industry has ignored the Ageia PhysX engine for years now. Havok was bought, BTW, so Intel can implement it on their integrated GPU, a far cheaper solution than a dedicated card, hint hint.

The new Havok hybrid CPU/GPU physics engine will still be the industry standard, full stop.


----------



## DarkMatter (Sep 1, 2008)

insider said:


> My old PCI PhysX card was just that, a £80 piece of gimmick that did very little.
> 
> It was dead before nVidia bought up the dying Ageia company, and you are forgetting AMD's Fusion is only around the corner costing little more to integrate into the their CPU cores.
> 
> ...



1- Fusion is still a GPU. Havok will NOT do GPU physics anytime soon; IMO it never will. It will always be x86-based, so that Larrabee derivatives have the edge.

2- The industry might have ignored Ageia IN THE PAST. More than 50 titles coming in the next months doesn't sound like ignoring it. I wouldn't be surprised if Ageia has more titles lined up than Havok right now.

3- We don't know if Havok will stay in the lead; it probably isn't leading right now, let alone in the future. As I said, NO GPU Havok (it will always be x86-based, so sorry ATI) until Larrabee, and that's 2010. Also, the physics API used on the PS3 is PhysX; AFAIK the Xbox 360 doesn't have a defined one. That makes it a lot easier for developers to use Ageia for cross-platform compatibility. 

Only time will tell, but as far as better physics is concerned, PhysX is the only way to go until 2010 (Larrabee, DX11), and then it will be very easy for Nvidia/Ageia to move to DX11 physics. Nvidia has physics now; Intel and AMD only have promises for the future. And those promises aren't about making something better; they just promise to have what Ageia has right now.


----------



## Wile E (Sep 2, 2008)

insider said:


> My old PCI PhysX card was just that, a £80 piece of gimmick that did very little.
> 
> It was dead before nVidia bought up the dying Ageia company, and you are forgetting AMD's Fusion is only around the corner costing little more to integrate into the their CPU cores.
> 
> ...


The industry ignored PhysX in the past, but you are forgetting that Nvidia has much more pull in the gaming market than a small company like Ageia did. Not only that, but in the past you had to buy additional hardware for PhysX to work. Now many people don't, because they already have a capable card if they own anything based on G80 or newer. That's a much larger installed user base than the original Ageia had, and that in and of itself makes the API much more tempting for developers to start using, as they now have little to lose.

And the performance of Havok on an Intel integrated GPU will never approach the performance capabilities of PhysX on a dedicated NV card. Hell, it will never approach the performance of PhysX running on integrated NV graphics.


----------



## Wshlist (Sep 2, 2008)

Intel is moving into graphics, you know: they are making their own full-fledged GPU, not that dinky Intel onboard crap, and then they plan to bake it onto the same die as the CPU, I gather. At that point they might well walk all over Nvidia.


----------



## Mussels (Sep 2, 2008)

Wshlist said:


> Intel is moving into graphics you know, they are making their own full-fledged GPU, not that dinky intel-onboard crap, and then they plan to bake that on the same die as the CPU I gather, at that point they might well walk all over nvidia.



we are all quite aware of Intel's Larrabee plans here on TPU; it's not going to compete with Nvidia and ATI at all in terms of GPU power. It's going to be used in an entirely different market with its x86 capabilities.


----------



## Wile E (Sep 2, 2008)

Mussels said:


> we are all quite aware of intels larrabee plans here on TPU, its not going to compete with Nvidia and ATI at all in terms of GPU power, its going to be used in an entirely different market with its x86 capabilities.



And to add to that, it has no effect on the market  right now. It's not due out for another 2 years. That gives Physx plenty of time to get a foothold in the market.


----------



## Wshlist (Sep 2, 2008)

Well, 2 years or not, I don't think Intel is going to try to run Havok on their current integrated stuff, are they now? You would not even be able to tell it had physics at 0.3 FPS; you'd have to time-lapse record it and then replay it at normal speed to enjoy it 

Incidentally, I'm not sure why people buy that physics stuff anyway; I think Newton put it in the public domain back when.. Basically any programmer should be able to code in some physics. As for physics on the GPU: the old GPUs were new and complex to put stuff like that on, but now that the world of game programmers knows a lot about the GPU, the GPUs have expanded their capabilities, and much more documentation and many more examples are available, I'm guessing it can't be that hard anymore to put your physics calculations on the GPU.


----------



## DarkMatter (Sep 2, 2008)

Wshlist said:


> Incidentally, I'm not sure why people buy that physics stuff anyway, *I think newton put it in public domain back when.., basically any programmer should be able to code in some physics*, and as for physics on the GPU, well the old GPU's were new and complex to put stuff like that on, but now that the world of gameprogrammers knows a lot about the GPU and the GPUs expanded their capabilities and much more documentation and examples are available  I'm guessing it can't be that hard anymore to put your physics calculations on the GPU.



I found that quite funny. Many specialists in the history of science believe the Sumerians invented maths and basic trigonometry back in 5000 BC (the Egyptians used them in 3000 BC, anyway), so I guess anyone should be able to make CPUs, GPUs, renderers? 

It's one thing to put some known formulas into computer code, and another to make a good (usable and complete) engine. The next level is to make it perform in real time. And one more step up is where Ageia and Havok are.

Recently I decided it was time to deal a bit with OpenGL programming, even though I don't like programming, just to use the coding "skills" I gathered at uni and understand everything related to graphics a lot better. After just one afternoon looking at its internals (after another one brushing up on C/C++), I'm pretty sure I know how to do a renderer. It's "easy". As easy as knowing how to make a painting or a sculpture. Will I ever do a painting, a sculpture, or a usable renderer? I'm going to give it a try, but I don't have high hopes. Do you get the idea?


----------



## Wshlist (Sep 2, 2008)

I did use Newton because he's known for Newtonian physics, obviously, and that works in the joke (and in simple programming), but he's not the only/first by any means.

And I know people who started programming from scratch, and guess what? They added 'physics' by using the simple, well-known formulas from Pythagoras and Newton etcetera that every kid learns at school, and it worked out fine. If you code an engine, which is what the people who buy licences of Havok do, then the added difficulty of 'physics' is minor, really: you already define objects with 3D coordinates, so having some basic formulas like the simple gravity one in their behaviour isn't that high-brow, really, and I still don't see why you'd have to buy third-party stuff for it. In fact, neither did the people who made Crysis; they just added their own routines.

As for adding it to hardware, do you think the Intel engineers can't figure out basic trigonometry and the laws of physics? And what do you think is in those SSE[x] instructions anyway?

Plus, to make ANY kind of game engine you must have collision detection, bounce and so forth; that's not optional, not even in a 2D game.
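To give an idea of the kind of "roll your own" physics being described, here is a minimal sketch (in Python, purely illustrative — not from any shipping engine): explicit Euler integration of a point mass under gravity, with a crude floor-collision bounce. All names and constants are made up for the example.

```python
# Minimal hand-rolled physics: a point mass under gravity with a
# floor bounce, advanced with explicit Euler time steps.

GRAVITY = -9.81      # m/s^2, downward acceleration
RESTITUTION = 0.5    # fraction of speed kept after a bounce
FLOOR = 0.0          # y-coordinate of the floor

def step(y, vy, dt):
    """Advance height y and vertical velocity vy by one time step dt."""
    vy += GRAVITY * dt          # apply gravity to velocity
    y += vy * dt                # move the object
    if y < FLOOR:               # crude collision detection...
        y = FLOOR               # ...and response: clamp to the floor
        vy = -vy * RESTITUTION  # reverse and damp the velocity
    return y, vy

# Drop a ball from 10 m and simulate two seconds at 60 steps per second.
y, vy = 10.0, 0.0
for _ in range(120):
    y, vy = step(y, vy, 1.0 / 60.0)
```

This is exactly the "simple gravity formula plus bounce" level the post talks about; the gap between this and a production middleware package (stable stacking, constraint solvers, broad-phase collision, determinism) is the point being argued over in the rest of the thread.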


----------



## DarkMatter (Sep 2, 2008)

Wshlist said:


> I did use Newton because he's known for Newtonian physics, obviously, and that works in the joke (and in simple programming), but he's not the only/first by any means.
> 
> And I know people who started programming from scratch, and guess what? They added 'physics' by using the simple, well-known formulas from Pythagoras and Newton etcetera that every kid learns at school, and it worked out fine. If you code an engine, which is what the people who buy licences of Havok do, then the added difficulty of 'physics' is minor, really: you already define objects with 3D coordinates, so having some basic formulas like the simple gravity one in their behaviour isn't that high-brow, really, and I still don't see why you'd have to buy third-party stuff for it. In fact, neither did the people who made Crysis; they just added their own routines.
> 
> ...



So you didn't catch my point, eh?

"Anyone" can program it, of course. But making a competent product is a completely different thing. Why do you think people use third-party engines/renderers? Do you honestly believe a graphics engine is more complicated? IMO, NO; that's why I said what I said in my previous post. 

Also, you have to put a lot of resources (money, manpower, time, etc.) into it. Third-party middleware only needs to be made once, and the expenses of making and updating it are paid by many developers, not just one (when buying it, of course). Hell, by your logic, with the things I learnt at school/uni I could build my own house; almost everybody could, but we let architects and engineers do our houses, don't we? There's a reason for that, and there is a reason for middleware (be it physics or game engines) to exist.

EDIT: And yes, Intel (Havok) can also do a good engine; they already have a good one. BUT as long as it runs in software it will never compete with Ageia. I mean, it's simple to understand, IMO.

Anyway, a better answer would have been this: if anyone (I mean, let's say Intel or Nvidia) can do their own competitive engine, why did they buy the physics companies? Intel, with ALL THE MONEY and all the resources, preferred to buy Havok instead. Think about that.


----------



## Wshlist (Sep 3, 2008)

Intel at some point suddenly, desperately started wanting to get into gaming for some reason, any way they can: even supporting in-game ads as the first ones to do so, trying to make a (real) GPU, and buying game-related companies. My guess is that they realised there's tons of money in that world and/or they have a CEO who fancies the idea.

And no, you aren't convincing me. Sure, it costs a little extra time (and therefore money) to get good physics in an engine, but you could just as easily argue 'why code an engine, why not simply buy the Unreal one or something'. The point for me is that there's no need to buy physics for it per se; it's an option and it cuts some time, but that's all. It's not super hard to get your own physics going, with the advantage that you are in control, know what you are doing, can add your own twist, and can sell your engine to third parties WITH physics without having to tell your customers 'Oh, BTW, you have to pay some other company money for the physics part of our engine', or having to buy a very expensive licence that allows resale as part of your engine.
And yes, I do think a graphics engine from scratch IS more complicated than the physics part of an engine.


----------



## DarkMatter (Sep 3, 2008)

Wshlist said:


> Intel at some point suddenly, desperately started wanting to get into gaming for some reason, any way they can: even supporting in-game ads as the first ones to do so, trying to make a (real) GPU, and buying game-related companies. My guess is that they realised there's tons of money in that world and/or they have a CEO who fancies the idea.
> 
> And no, you aren't convincing me. Sure, it costs a little extra time (and therefore money) to get good physics in an engine, but you could just as easily argue 'why code an engine, why not simply buy the Unreal one or something'. The point for me is that there's no need to buy physics for it per se; it's an option and it cuts some time, but that's all. It's not super hard to get your own physics going, with the advantage that you are in control, know what you are doing, can add your own twist, and can sell your engine to third parties WITH physics without having to tell your customers 'Oh, BTW, you have to pay some other company money for the physics part of our engine', or having to buy a very expensive licence that allows resale as part of your engine.
> And yes, I do think a graphics engine from scratch IS more complicated than the physics part of an engine.



I don't know why you are here, wasting your time. Go do your own API and get rich. 95% of new games and developers use third-party physics engines as well as third-party game engines, or at least renderers. FYI, NO, a physics engine is not easier by any means, but if you feel it's easy, do it yourself. Take those talented friends of yours you mentioned and do your own physics engine. 

BTW, then post some betas here; I'm sure most people here are willing to be your guinea pigs for testing an engine made by a member. I'm already on the list, so send me that beta first! 

EDIT: Industry heavyweights such as John Carmack, Tim Sweeney, Tom Hall, Gabe Newell, George Broussard and Sid Meier (and a very long list of others that don't come to mind right now) make their own game engines, but all of them license third-party physics. If you manage to make one, you can be proud of being smarter than them, so kudos.


----------



## Wshlist (Sep 3, 2008)

So that's it then? Your counter-argument? Oh well..


----------



## DarkMatter (Sep 3, 2008)

Wshlist said:


> So that's it then? Your counter-argument? Oh well..



Yeah, that's it, as I don't have arguments against your overwhelmingly superior logic... 
You are right, I am wrong. 95% of game developers are wrong. They are nothing more than a bunch of retarded bunnies who know nothing about the business in which they have been working for more than 20 years now.

Well, at least you opened my eyes. Please, can you give me some clues as to the better way to make my own clothes, furniture and tools? Because I'm almost sure at this point that you make all those things yourself, since they are so much easier to do than the things we are talking about... I am sure you don't mind spending half an hour every day for those purposes. Yeah, now I understand: why would I spend money on those things when I can make them myself? I've been so stupid until now...


----------



## eidairaman1 (Sep 3, 2008)

Alright, you two, enough of the arguing. It's not worth fighting over, as neither of you is getting anywhere.


----------



## Wshlist (Sep 4, 2008)

DarkMatter said:


> You are right, I am wrong. 95% of game developers are wrong. They are nothing more than a bunch of retarded bunnies who know nothing about the business in which they have been working for more than 20 years now.



Did you know that 87.9% of percentages are made up?

And that 97% of the people agree with me?

If you didn't, now you know


----------



## eidairaman1 (Sep 4, 2008)

I said enough is enough; don't try to escalate this. Continuing your argument will result in a mod being contacted and a censure of your actions. Also, let's get back on the topic of the matter, not "my e-penis is bigger than yours" (childish arguments).


----------

