# NVIDIA Announces PhysX and APEX Support for Sony PlayStation 4



## btarunr (Mar 7, 2013)

NVIDIA today announced support for Sony Computer Entertainment's PlayStation 4 with the popular NVIDIA PhysX and NVIDIA APEX software development kits (SDKs). Game designers use PhysX and APEX technologies for collision detection and simulation of rigid bodies, clothing, fluids, particle systems and more across a wide range of platforms, including desktop PCs, game consoles, and mobile and handheld devices.

NVIDIA PhysX technology is the world's most pervasive physics solution for designing real-time, real-world effects into interactive entertainment titles. The PhysX development environment gives developers unprecedented control over the look of their final in-game interactivity.



Taking PhysX technology content creation to the next level, NVIDIA APEX technology lets artists create intricate physics-enabled environments. They can expand the quantity and visual quality of destructible objects; make smoke and other particle-based fluids integral to game play; and create life-like clothing that interacts with the character's body to achieve more realism in their games.
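The kind of per-frame work these SDKs automate can be sketched in miniature. The toy below is not PhysX code, just a hedged illustration of what a physics engine does each frame for a particle system: integrate motion, detect a collision, and resolve it.

```python
def step_particles(particles, dt, gravity=-9.81, restitution=0.5):
    """Advance a toy 1-D particle system one frame: semi-implicit Euler
    integration plus collision against a ground plane at height 0.
    A stand-in for what a physics SDK does per frame, not the PhysX API.
    """
    for p in particles:
        p["vel"] += gravity * dt                # integrate acceleration
        p["pos"] += p["vel"] * dt               # integrate velocity
        if p["pos"] < 0.0:                      # collision detection
            p["pos"] = 0.0                      # resolve penetration
            p["vel"] = -p["vel"] * restitution  # lossy bounce
    return particles
```

A real engine does this in 3-D for rigid bodies, cloth vertices, and fluid particles, with broad-phase culling so it scales to thousands of objects, but the integrate/detect/resolve loop has the same shape.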

"Great physics technology is essential for delivering a better gaming experience and multiplatform support is critical for developers," said Mike Skolones, product manager for PhysX at NVIDIA. "With PhysX and APEX support for PlayStation4, customers can look forward to better games."

NVIDIA PhysX and APEX technologies are designed to run on a variety of CPU architectures and can be accelerated by any CUDA architecture-enabled NVIDIA GPU, GeForce 8-series or higher.
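The deployment model described here — every feature has a portable CPU path, and a CUDA-capable GPU can optionally accelerate some of them — can be sketched as a per-feature dispatch. The function and feature names are hypothetical, for illustration only:

```python
# Hypothetical dispatch table, not actual SDK code: every feature has a
# portable CPU fallback; a subset can be offloaded when CUDA hardware is
# present. The feature split shown is illustrative.
GPU_OFFLOADABLE = {"particles", "clothing"}
ALL_FEATURES = GPU_OFFLOADABLE | {"rigid_bodies", "joints", "raycasts"}

def pick_backend(feature, has_cuda_gpu):
    """Return 'gpu' or 'cpu' for a given simulation feature."""
    if feature not in ALL_FEATURES:
        raise ValueError("unknown feature: " + feature)
    if has_cuda_gpu and feature in GPU_OFFLOADABLE:
        return "gpu"
    return "cpu"  # universal fallback on any CPU architecture
```

This is the pattern that lets the same SDK ship on consoles and phones with no GPU acceleration at all, while still using an NVIDIA GPU when one is available.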

*View at TechPowerUp Main Site*


----------



## dj-electric (Mar 7, 2013)

Really? ..... Really?!?!

The irony is just amazing with this one.


----------



## btarunr (Mar 7, 2013)

Proof that PhysX can be implemented on AMD Graphics CoreNext if NVIDIA really wanted to. With a swelling list of PC games falling under AMD's various developer programs, the only way PhysX can survive is by going open.


----------



## Prima.Vera (Mar 7, 2013)

Oh, the irony. So Physx on AMD hardware is more than possible, but not for PCs...


----------



## W1zzard (Mar 7, 2013)

The PR is not clear on whether it will run PhysX on the PS4 shaders or the CPU. I've already sent an email to NVIDIA requesting clarification.


----------



## T3RM1N4L D0GM4 (Mar 7, 2013)

Will it run on the CPU (meh :v ) or (we all hope) on the GPU??



> NVIDIA PhysX and APEX technologies are designed to *run on a variety of CPU architectures *and can be accelerated by any CUDA architecture-enabled NVIDIA GPU, GeForce 8-series or higher.



Edit: OK, W1zz had my exact doubt and he's trying to get a serious answer...


----------



## Maban (Mar 7, 2013)

This was obviously left ambiguous for a reason. Sneaky, sneaky. There is no way that they would let it run on the GPU.


----------



## TheLaughingMan (Mar 7, 2013)

btarunr said:


> Proof that PhysX can be implemented on AMD Graphics CoreNext if NVIDIA really wanted to. With a swelling list of PC games falling under AMD's various developer programs, the only way PhysX can survive is by going open.



It can, but it will not be. This will run on the CPU side of the PS4, which means it would work, but not at the level of detail of an NVIDIA GPU with CUDA cores.

At least the 8-core CPU will serve a purpose now. I was wondering what they would do with the extra 4 cores during gaming besides downloading in the background.


----------



## vega22 (Mar 7, 2013)

PhysX was hacked to run on AMD GPUs back in the Radeon 4000-series days, so we already knew it would work, but at that time ATI wouldn't offer official support for an NVIDIA API.

Idk if they can have much say in what's run on the hardware they are selling OEM to Sony.


----------



## nickbaldwin86 (Mar 7, 2013)

Prima.Vera said:


> Oh, the irony. So Physx on AMD hardware is more than possible, but not for PCs...



FALSE!!!

Have you ever tried searching before making such false statements?
It is more than possible.
You have to run a dedicated NV card, but it is very, very possible.

I have run it in many systems.

I had 2x 5850s at one point with a dedicated NV GPU for PhysX.

Every time I have done it, it's been more of a waste of time and a hassle; neither NV nor ATI makes it easy, and the games that have it are meh games anyway.



The PS4 will no doubt be the same way... it will not run PhysX on an AMD chip; NVIDIA will have a chip in the PS4 just for PhysX to be processed on, just like dedicating a card to it.


----------



## SIGSEGV (Mar 7, 2013)

it's time for physx to die


----------



## Hillbilly (Mar 7, 2013)

I wonder what the new spin for not enabling PhysX on AMD hardware will be? Or could it be that Sony's contribution to the custom APU has some NVIDIA sauce in it? This is very interesting.


----------



## RejZoR (Mar 7, 2013)

Both AMD and NVIDIA should just cut the BS and stop releasing their own proprietary versions of physics. Because until they get that, the only ones who lose are the gamers, and no one else.


----------



## TRWOV (Mar 7, 2013)

AMD doesn't have a proprietary physics engine


----------



## Prima.Vera (Mar 7, 2013)

nickbaldwin86 said:


> FALSE!!!
> 
> You ever tried search before making such false statements?
> It is more than possible
> ...



You on mushrooms?? :shadedshu


----------



## SIGSEGV (Mar 7, 2013)

RejZoR said:


> NVIDIA should just cut the BS and stop releasing their own proprietary versions of physics. Because until they get that, the only ones who lose are the gamers, and no one else.



Fixed. 
AMD doesn't have a proprietary physics processor. AMD is using DirectCompute to process its own physics (TressFX).


----------



## cadaveca (Mar 7, 2013)

nickbaldwin86 said:


> it will not run PhysX on an AMD chip; NVIDIA will have a chip in the PS4 just for PhysX to be processed on, just like dedicating a card to it.



Nvidia isn't a hardware company. They are now a software company that also makes hardware. Hardware is no longer their primary focus.

http://www.xbitlabs.com/news/graphics/display/20091023135737_Nvidia_CEO_We_Are_Software_Company.html



			
Jen Hsun Huang said:

> Nvidia is a software company, [software] will be what will drive our growth. We just don’t write the application, but we create all the core technology, we create more core technology for visual computing than any other company in the world. […] What you are and how you make money doesn’t have to be the same. […] Apple is a software company but they make money by selling hardware. […] Nvidia is an integrated complete visual computing and parallel computing solution technology company



Jen Hsun Huang should take his own advice though:



> But his advice to young entrepreneurs today took the form of a series of questions: Is this an important problem to solve? Are you the one to solve it? Are you more passionate about it than the competition? Are you more prepared?



http://blogs.nvidia.com/2012/05/nvidia-ceo-shakes-out-future-of-tech/


----------



## Mindweaver (Mar 7, 2013)

cadaveca said:


> Nvidia isn't a hardware company. They are now a software company, that also makes hardware. Hardware is no longer their primary focus.



I don't know, Dave. I think their primary focus is on Tegra, and then software. But their software is for their hardware.

*EDIT: You edited.. hehehe I stand corrected. Thanks for the link.*


----------



## cadaveca (Mar 7, 2013)

Read the quote. I didn't say that....Nvidia's CEO and owner did. I'd not make any such claims without backup.


----------



## W1zzard (Mar 7, 2013)

Official response:



			
NVIDIA said:

> Currently, most features in the PhysX SDK run only on the CPU, regardless of platforms. Certain features, such as particle systems and clothing, can be accelerated on a CUDA-capable GPU.  We will continue to study the feasibility of alternate implementations, and welcome any feedback from the developer community regarding the value of GPU-accelerated PhysX on all architectures.


----------



## cadaveca (Mar 7, 2013)

W1zzard said:


> Official response:



The funny thing about that response, to me, is that Sony highlighted porting physics calculation off of the CPU and onto the GPU in its premiere (1 million physics objects, or something). What Nvidia is offering, seemingly, is the exact opposite.

Puzzling.

I smell a $3-$5 per-copy licensing fee for devs that use it.


----------



## Maban (Mar 7, 2013)

NVIDIA said:

> Currently, most features in the PhysX SDK run only on the CPU, regardless of platforms. Certain features, such as particle systems and clothing, can be accelerated on a CUDA-capable GPU. We will continue to study the feasibility of alternate implementations, and welcome any feedback from the developer community regarding the value of GPU-accelerated PhysX on all architectures.



IE: We will never let it be more than x86 and CUDA, but we will string you along so you believe it could happen soon.


----------



## Xzibit (Mar 7, 2013)

So all of it was just a headline grab.

Nvidia saying to consoles, "please don't forget us."

Fits in with what BSN is saying about Nvidia:



> Furthermore, in off-the-record discussions with the developers, we learned that Nvidia no longer invests as much in PC gaming developer teams as it used to, as Tegra is viewed as the main growth driver for the company. Naturally, this is purely one sided view, but a view coming from several game development companies which combined shipped over 100 million units.


----------



## TheMailMan78 (Mar 7, 2013)

Xzibit said:


> So all of it was just a headline grab.
> 
> Nvidia saying to consoles, "please don't forget us."
> 
> Fits in with what BSN is saying about Nvidia



Of course not. PC gaming is a dying market. Tablet, mobile, console is the only way to survive in their situation.


----------



## Fluffmeister (Mar 7, 2013)

I think people are just reading too much into this; PhysX was licensed and used on the current-gen consoles, and it will be no different with the new consoles.

It's just about giving your developers more options, nothing more, nothing less.


----------



## cadaveca (Mar 7, 2013)

Fluffmeister said:


> It's just about giving your developers more options, nothing more nothing less.



And that's the main issue. Devs don't NEED options. That's the whole point of the console space: a closed platform. It's an act of desperation, really. CPU-based physics is like 1962 technology. Really. It's now 2013, 50 years later. That's why I quoted Jen Hsun's questions for entrepreneurs... it's NOT the right way to do it, and PhysX running NOW on GPUs proves it.

Nvidia must have hired some of AMD's old marketing team.


----------



## Eagleye (Mar 7, 2013)

*software development kits (SDKs)*

AMD are working closely with Havok, so this is being put out just in case some developer bites. We will see 1 or 2 titles a year in the best-case scenario.



> Nvidia will have a chip in the PS4 just for PhysX to be processed on, just like dedicating a card to it.



roflmao


----------



## Fluffmeister (Mar 7, 2013)

cadaveca said:


> And that's the main issue. Devs don't NEED options. That's the whole point of the console space: a closed platform. It's an act of desperation, really. CPU-based physics is like 1962 technology. Really. It's now 2013, 50 years later. That's why I quoted Jen Hsun's questions for entrepreneurs... it's NOT the right way to do it, and PhysX running NOW on GPUs proves it.



Of course they do; games use stacks of licensed technology regardless of the platform they are on. Lots don't even use their own engine, for example, so why should they spend time and resources developing their own physics too when there are solutions already in place?

And if physics were all about the GPU these days, that would actually put PhysX in a strong position; Havok is all about the CPU and yet it just.... won't.... die.

PhysX is just another option. I get the NV-only GPU hate, but that doesn't make PhysX a less viable option than the others.


----------



## TheMailMan78 (Mar 7, 2013)

cadaveca said:


> And that's the main issue. Devs don't NEED options. That's the whole point of the console space: a closed platform. It's an act of desperation, really. CPU-based physics is like 1962 technology. Really. It's now 2013, 50 years later. That's why I quoted Jen Hsun's questions for entrepreneurs... it's NOT the right way to do it, and PhysX running NOW on GPUs proves it.
> 
> Nvidia must have hired some of AMD's old marketing team.



You can blame me. I went NVIDIA and ever since they have been having issues. I have the opposite of the Midas touch when it comes to GPU makers.


----------



## Casecutter (Mar 7, 2013)

SIGSEGV said:


> it's time for physx to die


Not so much die, just become OpenCL-based. It's like Borderlands, which has PhysX that can't be turned off. They made it run on low, so it chews CPU resources when not using an NVIDIA card.  :shadedshu


----------



## cadaveca (Mar 7, 2013)

Fluffmeister said:


> Of course they do, games use stacks of licensed technology regardless of the platform they are on. Lot's don't even use their own engine for example, why should they spend time and resources developing their own physics too when there are solutions already in place?
> 
> And if physics was all about the GPU these days then that actually puts PhysX in a strong position, Havok is all about the CPU and yet it just.... won't.... die.
> 
> PhysX is just another option, I get the NV- only GPU hate, but that doesn't make PhysX a less viable option than the others.



Nothing here counteracts my points posted above. The PS4 is already using Havok for GPU-based physics rendering, BTW; that was in the demo.

And I don't hate Nv at all... just this release. I've posted countless times that I'd like to see them leverage their strength in software on other hardware. But not in this fashion, as otherwise it's not going to offer anything more than what current consoles and PCs have. It's not like these consoles are some big powerhouse PCs... they are just beyond mid-level tech of TODAY. So what we see on high-end PCs, performance-wise, shows CPU-based PhysX leaves a lot to be desired.


----------



## natr0n (Mar 7, 2013)

This is so odd.

Almost made me think AMD didn't have any chips in the PS4.


----------



## Fluffmeister (Mar 7, 2013)

cadaveca said:


> Nothing here counteracts my points posted above. The PS4 is already using Havok for GPU-based physics rendering, BTW; that was in the demo.
> 
> And I don't hate Nv at all... just this release. I've posted countless times that I'd like to see them leverage their strength in software on other hardware. But not in this fashion, as otherwise it's not going to offer anything more than what current consoles and PCs have. It's not like these consoles are some big powerhouse PCs... they are just beyond mid-level tech of TODAY. So what we see on high-end PCs, performance-wise, shows CPU-based PhysX leaves a lot to be desired.



And nothing here contradicts my points either. Havok is just another option available; it doesn't mean every game will use it. And equally, it's not like there are any standout Havok-based titles on the PC today. 

I guess that depends on whether those big meanies over at Intel want to plug their technology on more AMD and nVidia based hardware.


----------



## Mindweaver (Mar 7, 2013)

Let's see, the only games that will use PhysX on the next gen are Batman A:?, Borderlands 3, and Mafia 3? Did I miss anything?


----------



## cadaveca (Mar 7, 2013)

Fluffmeister said:


> And nothing here contradicts my points either. Havok is just another option available; it doesn't mean every game will use it. And equally, it's not like there are any standout Havok-based titles on the PC today.
> 
> I guess that depends on whether those big meanies over at Intel want to plug their technology on more AMD and nVidia based hardware.




I disagree with your overall sentiment. But that's fine.


Intel is pushing software on competitors' GPU hardware with Havok. That's what I expect of NVidia, and that's all.


----------



## Xzibit (Mar 7, 2013)

Fluffmeister said:


> And nothing here contradicts my points either. Havok is just another option available; it doesn't mean every game will use it. And equally, it's not like there are any standout Havok-based titles on the PC today.
> 
> I guess that depends on whether those big meanies over at Intel want to plug their technology on more AMD and nVidia based hardware.



Hmm..

Havok titles


----------



## Fluffmeister (Mar 7, 2013)

Xzibit said:


> Hmm..
> 
> Havok titles



Hmm..

http://physxinfo.com/


----------



## erocker (Mar 7, 2013)

W1zzard said:


> Official response:



I wasn't expecting that answer at all. Promising!


----------



## NdMk2o1o (Mar 7, 2013)

Fluffmeister said:


> Hmm..
> 
> http://physxinfo.com/



HMMMMMMMMMMMMMMMMMMMMMMMM...

http://www.havok.com/customer-projects/games/other-titles?items_per_page=All&=Apply


----------



## Fluffmeister (Mar 7, 2013)

cadaveca said:


> I disagree with your overall sentiment. But that's fine.
> 
> 
> Intel is pushing software on competitors' GPU hardware with Havok. That's what I expect of NVidia, and that's all.



That's fine, but they have literally only posted a single video. If only Nvidia could enjoy as much benefit of the doubt. 

It will be interesting to see if Havok-based GPU acceleration eventually comes to the PC, and how it affects things.


----------



## OneCool (Mar 7, 2013)

cadaveca said:


> I smell a $3-$5 per-copy licensing fee for devs that use it.




Exactly!


----------



## Fluffmeister (Mar 7, 2013)

NdMk2o1o said:


> HMMMMMMMMMMMMMMMMMMMMMMMM...
> 
> http://www.havok.com/customer-projects/games/other-titles?items_per_page=All&=Apply



Not sure what point you're making beyond what Xzibit posted? It's widely used across multiple platforms; so is PhysX.


----------



## Xzibit (Mar 7, 2013)

Fluffmeister said:


> Hmm..
> 
> http://physxinfo.com/



And you notice what all those games have in common? All PC.  

If you've been pushing PhysX for 5 years and can only get it into fewer than 100 games, when you're comparing it to Havok, which is in over 500 titles across several platforms...



Too much green tea in your diet.


----------



## erocker (Mar 7, 2013)

Fluffmeister said:


> I think people are just reading too much into this, PhysX was licensed and used on the current gen consoles and it will be no different with the new consoles.
> 
> It's just about giving your developers more options, nothing more nothing less.



You missed this post then...



			
Nvidia said:

> Currently, most features in the PhysX SDK run only on the CPU, regardless of platforms. Certain features, such as particle systems and clothing, can be accelerated on a CUDA-capable GPU. We will continue to study the feasibility of alternate implementations, and welcome any feedback from the developer community regarding the value of GPU-accelerated PhysX on all architectures.



I have no idea why you're bringing up Havok. Nothing to do with this.


----------



## okidna (Mar 7, 2013)

Xzibit said:


> And you notice what all those games have in common? All PC.
> 
> If you've been pushing PhysX for 5 years and can only get it into fewer than 100 games, when you're comparing it to Havok, which is in over 500 titles across several platforms.
> 
> ...



"less than 100 games"? Try 429 games : http://physxinfo.com/index.php?p=gam&f=all


----------



## Fluffmeister (Mar 7, 2013)

Xzibit said:


> And you notice what all those games have in common? All PC.
> 
> If you've been pushing PhysX for 5 years and can only get it into fewer than 100 games, when you're comparing it to Havok, which is in over 500 titles across several platforms.
> 
> ...



All PC? Did you even look at the list?

I count 216 Havok titles on their official site. I'm interested in seeing the link showing 500+ if you can post it?



erocker said:


> You missed this post then...
> 
> 
> 
> I have no idea why you're bringing up Havok. Nothing to do with this.



I missed nothing; it's pretty much a given it will run on the CPU, and whether devs use it or not is their choice. I brought up Havok because it's pretty much the only viable reference, sorry about that.


----------



## Xzibit (Mar 7, 2013)

okidna said:


> "less than 100 games"? Try 429 games : http://physxinfo.com/index.php?p=gam&f=all



I did try, and I could only count 100 or so.

And that's because Nvidia is still including Ageia titles that were in development prior to 2008, when Nvidia bought them. Back then PhysX was only running on the CPU, and Ageia was making a push to offload the workload to a PPU.



Fluffmeister said:


> All PC? Did you even look at the list?
> 
> I count 216 Havok titles on their official site, I'm interested in the seeing the link showing 500+ if you can post it?



I'm going by what's on their site, not going by a 3rd-party website. E-mail them for the list and share it.



> Havok has over 13 years of experience servicing the most demanding technical requirements for leading customers in the commercial games and entertainment industry. Havok’s combination of superior technology and dedication to delivering industry leading support to its customers has led to the company’s technologies being used in over *500* of the best known and award-winning titles including Halo 4, Skylander's Giants, Assassin’s Creed III, Guild Wars 2, Uncharted: Golden Abyss™ and Darksiders II.


----------



## TheoneandonlyMrK (Mar 7, 2013)

Maban said:


> IE: We will never let it be more than x86 and CUDA, but we will string you along so you believe it could happen soon.



Exactly, like they ever get a dev asking for more walls, fewer end users, and less end-user satisfaction. 
I'm quite enraged by this bull PR, as I've got a hybrid PhysX setup just for Batman (obviously) and it's a pain in the ass; they shouldn't have done that. 
And RejZoR, you're deluded; only NVIDIA makes progress-stopping standards that only work on NV. Everyone else actually tries to work together to make our (end users') lives easier, better, and more fun.
Sinister licence-waving BS......


----------



## okidna (Mar 7, 2013)

Xzibit said:


> I did try, and I could only count 100 or so.
> 
> And that's because Nvidia is still including Ageia titles that were in development prior to 2008, when Nvidia bought them. Back then PhysX was only running on the CPU, and Ageia was making a push to offload the workload to a PPU.



*facepalm*
That's an extremely outdated list. 

You can't even find Alice: Madness Returns, Borderlands 2, Batman: Arkham Asylum and Arkham City, Mafia II, Mirror's Edge, Hawken, or any of the PhysX titles released after 2010.

And you can still find Heavy Rain listed as a PC title in that list.


----------



## TheMailMan78 (Mar 7, 2013)

okidna said:


> *facepalm*
> That's an extremely outdated list.
> 
> You can't even find Alice: Madness Returns, Borderlands 2, Batman: Arkham Asylum and Arkham City, Mafia II, Mirror's Edge, Hawken, or any of the PhysX titles released after 2010.
> ...



Is Metro 2033 on there?


----------



## tokyoduong (Mar 7, 2013)

PhysX is such a waste of time. I really liked the idea of a cheap PPU. I wish Ageia hadn't sold its soul; I was really thinking of buying one until NVIDIA picked it up. Why don't they just make a small chip that can be integrated into any graphics card and charge a small royalty fee? I just can't see how NVIDIA can win with their current policy. 
With all these new open standards, the BS proprietary stuff can only be forced with lots of money, something that NVIDIA doesn't have a lot of.


----------



## okidna (Mar 7, 2013)

TheMailMan78 said:


> Is Metro 2033 on there?



Yes, and probably the latest/newest game (2010) in that list.


----------



## TheMailMan78 (Mar 7, 2013)

tokyoduong said:


> PhysX is such a waste of time. I really liked the idea of a cheap PPU. I wish Ageia hadn't sold its soul; I was really thinking of buying one until NVIDIA picked it up. Why don't they just make a small chip that can be integrated into any graphics card and charge a small royalty fee? I just can't see how NVIDIA can win with their current policy.
> With all these new open standards, the BS proprietary stuff can only be forced with lots of money, something that NVIDIA doesn't have a lot of.



I like PhysX. Adds a lot to games when it's hardware accelerated.


----------



## TheLaughingMan (Mar 7, 2013)

Fluffmeister said:


> Not sure what point you're making beyond what Xzibit posted? It's widely used across multiple platforms; so is PhysX.



PhysX actually isn't; your post proved that, with a total of what, 11 games that use it.


----------



## Mindweaver (Mar 7, 2013)

TheMailMan78 said:


> I like PhysX. Adds a lot to games when it's hardware accelerated.



I have to agree. I like PhysX, and it is the main reason I just bought a GTX 680. 

*EDIT: Did I tell you I just bought an MSI GTX 680 PE? It's out for delivery. *


----------



## HumanSmoke (Mar 7, 2013)

Can't say the press release was anything other than 1. expected, and 2. standard boilerplate.

The Unreal engine features prominently in PS4 PR... and Unreal features what physics engine? ...and Nvidia have been listed as a PS4 partner for some time; if they aren't providing hardware for the system, what else would their partnership be offering?


TheLaughingMan said:


> PhysX actually isn't; your post proved that, with a total of what, 11 games that use it.


Well, considering the topic at hand is the PS4, you'd probably need to include console games as well; and since Unreal has PhysX integrated, maybe those eleven titles (????????) need augmenting.


----------



## tokyoduong (Mar 7, 2013)

TheMailMan78 said:


> I like PhysX. Adds a lot to games when it's hardware accelerated.



Yes it does, I don't doubt that. I've seen Batman and Mirror's Edge as the best uses of PhysX. It's a big difference, but not exactly a make-or-break deal in almost all titles. And the problem is that the titles where it worked great were mainly single-player. Why not make PhysX an open standard but spend some money optimizing it for their CUDA cores? Yes, it will run on all GPUs, but it will be best on NVIDIA cards. That way, I can get a decent taste of PhysX with whatever GPU I have and decide for myself whether I should plunge for NVIDIA or not on my next card purchase.

The fact that PhysX is not a deciding factor for gamers buying a new card means their strategy is not working. The fact that PhysX is not that important for gamers means that developers will not go heavy on implementing PhysX unless they have an incentive payment from TWIMTBP.


----------



## TheMailMan78 (Mar 7, 2013)

tokyoduong said:


> Yes it does, I don't doubt that. I've seen Batman and Mirror's Edge as the best uses of PhysX. It's a big difference, but not exactly a make-or-break deal in almost all titles. And the problem is that the titles where it worked great were mainly single-player. Why not make PhysX an open standard but spend some money optimizing it for their CUDA cores? Yes, it will run on all GPUs, but it will be best on NVIDIA cards. That way, I can get a decent taste of PhysX with whatever GPU I have and decide for myself whether I should plunge for NVIDIA or not on my next card purchase.
> 
> The fact that PhysX is not a deciding factor for gamers buying a new card means their strategy is not working. The fact that PhysX is not that important for gamers means that developers will not go heavy on implementing PhysX unless they have an incentive payment from TWIMTBP.



AMD drivers are an incentive to go NVIDIA. PhysX is just gravy.


----------



## tokyoduong (Mar 7, 2013)

TheMailMan78 said:


> AMD drivers are an incentive to go NVIDIA. PhysX is just gravy.



you have to bring another flame bait into this topic. I'm out.


----------



## TheMailMan78 (Mar 7, 2013)

tokyoduong said:


> you have to bring another flame bait into this topic. I'm out.



Just going by my own experience.


----------



## Fluffmeister (Mar 7, 2013)

TheLaughingMan said:


> PhysX actually isn't; your post proved that, with a total of what, 11 games that use it.


----------



## cadaveca (Mar 7, 2013)

tokyoduong said:


> you have to bring another flame bait into this topic. I'm out.



It's not flamebait. It's documented issues with Photoshop (OpenGL), which made him change.


I have issues myself, and am considering changing as well. It's not flamebait at all; it's just a user's experience, and sentiment.

I haven't run Nv cards for any real length of time since the 7800GTX 256MB. I've had Nvidia cards since then, but have always preferred AMD cards. I am loath to switch and just end up with other issues, but I am definitely not happy with the dual 7950's and dual 7970's that I run now, to the point that of those 4 VGAs, only one is getting used. I'd sell the others, but have some review work to finish with them, and then I'm changing as well. AMD's drivers suck in my config. You're more than welcome to come to my house and check it out, find out what I'm doing wrong, but I doubt you'll find much.

So you can add THAT to my "ANTI-NV" opinions posted earlier.



Fluffmeister said:


> http://farm4.staticflickr.com/3653/3390182310_f86c82cb95.jpg



Remember, he plays games near daily with the rest of the TS users, and uses an Nvidia card. To him, that is his perspective as an NVidia user. You may not like it, but that doesn't make it invalid. Whenever he sees something in-game that is cool and thinks it might be PhysX, he's quick to ask me if I see similar (we play together often). 8/10 times, I do. So PhysX adds little to his experience so far.


If PhysX is used behind the scenes, nobody cares. But the fact of the matter is that while it is used fairly often now, it doesn't really add much that a user can identify with, and that's 1000% due to it being CPU-based. When it doesn't add much to the user experience, of course its usefulness will be questioned.


----------



## tokyoduong (Mar 7, 2013)

cadaveca said:


> It's not flamebait. It's documented issues with Photoshop(OpenGL), which made him change.
> 
> 
> I have issues myself, and am considering changing as well. It's not flamebait at all, it's just a users experience, and sentiment.
> ...



But we're talking about PhysX and not Photoshop. I agree with you that NVIDIA cards work better with Photoshop anyway; the gap is not as wide now as it used to be.

As far as gaming and PhysX go (since they are closely/directly related), NVIDIA has used inconsistent and dirty tactics in the industry. I think it's great that they help devs optimize games to work with their hardware, but they also do dirty tricks to make AMD look bad for no reason. If I own AMD hardware and the game runs like crap simply because of a bad business practice rather than a hardware limitation, then I would be less likely to buy into their products. 

Just open up PhysX and put the NVIDIA logo in it, so it's pretty much repetitive marketing burned into everyone's brain whenever they start a game. Let it run on all hardware. Spend money to optimize it on their CUDA cores. Devs will spend more time and resources coding PhysX into their games. Watch people enjoy PhysX so much that they would rather buy NVIDIA cards instead of AMD because it just works better. That would make more sense as a long-term strategy.


----------



## Fluffmeister (Mar 7, 2013)

cadaveca said:


> Remember, he plays games near daily with the rest of the TS users, and uses an Nvidia card. To him, that is his perspective as an NVidia user. You may not like it, but that doesn't make it invalid. Whenever he sees something in-game that is cool, and thinks might be pHys-X, he's quick to ask me if I see similar(we paly together often). 8/10 times, I do. So Phys-X adds little to his experience so far.
> 
> 
> If PhysX is used behind the scenes, nobody cares. But the fact of the matter is that while it is used fairly often now, it doesn't really add much that a user can identify with, as that's 1000% due to it being CPU-based. When it doesn't add much to the user experience, of course its usefulness will be questioned.



You're right, I can't help it if people are going to be ignorant.


----------



## cadaveca (Mar 7, 2013)

Fluffmeister said:


> You're right, I can't help it if people are going to be ignorant.



Actually, I'd call your own attitude a bit ignorant as well. Like, no offense intended, but really that's how you come off, especially with this post as a perfect example. Ignorance is bliss, so they say. Why try to ruin someone's bliss?


Do keep in mind that the local Boy Scout troop is run out of my house, and as such, those Boy Scout ideals are a big part of my character that has me feel this way. I dealt with Mailman and his driver issues on many occasions, and he really is justified in making that change, and stating it as such.



tokyoduong said:


> But we're talking about PhysX, not Photoshop. I agree with you that NVIDIA cards work better with Photoshop anyway. The gap is not as wide now as it used to be.



Well, I mean, Mailman posted that AMD had driver issues, which was scoffed at. For him, there were issues, and he uses his PC both for gaming and for Photoshop. He wasn't referring to issues with games. You called it flamebait, but that was really what made him change... driver issues.


----------



## TheHunter (Mar 7, 2013)

HW PhysX can be good, but its optimization sucks.

HW PhysX 3 has improved a little, but nothing major compared to PhysX 2; it still bogs the GPU down with useless cycles (draw calls) and lowers GPU usage instead of speeding it up.


I know, I played a lot of HW PhysX games, and the more I played them the more I hated this physics engine. Again, the optimization is bad.


SW PhysX is OK; it's just another physics engine, nothing special.


----------



## Fluffmeister (Mar 7, 2013)

cadaveca said:


> Actually, I'd call your own attitude a bit ignorant as well. Like, no offense intended, but really that's how you come off, especially with this post as a perfect example. Ignorance is bliss, so they say. Why try to ruin someone's bliss?
> 
> 
> Do keep in mind that the local Boy Scout troop is run out of my house, and as such, those Boy Scout ideals are a big part of my character that has me feel this way. I dealt with Mailman and his driver issues on many occasions, and he really is justified in making that change, and stating it as such.



I really can't help it if people get offended; this forum and many others like it are full of people getting butthurt every day over things that basically mean very little in the grand scheme of things.

The fact is I simply stated it's no real surprise PhysX has been licensed again for the new consoles; that it resulted in all this fuss is proof enough of that.


----------



## tokyoduong (Mar 7, 2013)

cadaveca said:


> Well, I mean, Mailman posted that AMD had driver issues, which was scoffed at. For him, there were issues, and he uses his PC both for gaming and for Photoshop. He wasn't referring to issues with games. You called it flamebait, but that was really what made him change... driver issues.



I never scoffed at him about PhysX. I even said PhysX works great, but Photoshop is off topic, and a subjective comment about drivers is unnecessary. I see his point; however, it is still off topic. 

My point being, they have this technology that should be mainstream, and they spent lots of time and resources to push it and make it mainstream. Then they spent lots of time and resources restricting people from experiencing it. It's like an identity crisis. 

Now, if they allow PhysX to run on the PS4 GPU, then they're hypocritical. If they force it to run on the CPU, then it's like what you mentioned above. The difference in gaming experience will not be that noticeable, if it's even noticed. The cost will get passed to the consumer one way or another with a half-assed implementation of something that could/should work great on every title. AMD users will be pissed, and NVIDIA users don't get to fully utilize PhysX in all its glory because devs don't want to invest heavily in coding something from which NVIDIA shut out half their potential customers.



Fluffmeister said:


> I really can't help it if people get offended; this forum and many others like it are full of people getting butthurt every day over things that basically mean very little in the grand scheme of things.
> 
> The fact is I simply stated it's no real surprise PhysX has been licensed again for the new consoles; that it resulted in all this fuss is proof enough of that.



But you posted this after you read the news. Had you predicted this before the news came out, then you would've had a great career as a prophet.


----------



## TheMailMan78 (Mar 7, 2013)

tokyoduong said:


> I never scoffed at him about PhysX. I even said PhysX works great, but Photoshop is off topic, and a subjective comment about drivers is unnecessary. I see his point; however, it is still off topic.



I was giving you a reason, IMO a valid one, to switch even without PhysX, which was based on this comment.



tokyoduong said:


> Yes, it will run on all GPUs but it will be best on NVIDIA cards. That way, I can get a decent taste of physX with whatever GPU i have and *decide for myself whether i should plunge for NVIDIA or not on my next card purchase*.



Also, NVIDIA bought PhysX as fluff for developers to use their tech and license. They spent good money on it just so they could turn a profit. Now you want them to make it "open source". Great idea there, Buffett.


----------



## Fluffmeister (Mar 7, 2013)

tokyoduong said:


> But you posted this after you read the news. Had you predicted this before the news came out, then you would've had a great career as a prophet.



I said people were reading too much into it too, but you're right, I should have predicted the impending tears.


----------



## erocker (Mar 7, 2013)

Fluffmeister said:


> I said people were reading too much into it too, but you're right, I should have predicted the impending tears.



You can stop now. No need to continue down this road. You've said what you need to say, no point in continuing. Let others partake in the discussion.


----------



## buildzoid (Mar 7, 2013)

Nvidia would never let AMD run PhysX without some sort of forced slowdown, because AMD cards have a lot more compute power than Nvidia cards, so they would run PhysX better.
GTX 680 ~ 3 TFLOPS
HD 7970 GHz ~ 4 TFLOPS
GTX Titan ~ 4.5 TFLOPS
So if PhysX were equally well optimized for both AMD and Nvidia GPUs, then the AMD ones would run it better. The only Nvidia card that has more compute power than the 7970 GHz is the GTX Titan, which is too expensive to compete here.
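The rough comparison above can be expressed as ratios; a minimal sketch, assuming the approximate single-precision figures quoted in the post:

```python
# Rough single-precision throughput figures (TFLOPS) as quoted above.
tflops = {"GTX 680": 3.0, "HD 7970 GHz": 4.0, "GTX Titan": 4.5}

# Express each card's raw compute relative to the GTX 680 as a baseline.
baseline = tflops["GTX 680"]
ratios = {card: round(rate / baseline, 2) for card, rate in tflops.items()}
print(ratios)  # the HD 7970 GHz comes out at ~1.33x the GTX 680
```

On these numbers the 7970 GHz has roughly a third more raw throughput than the GTX 680, which is the gap the post is pointing at.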


----------



## newtekie1 (Mar 7, 2013)

buildzoid said:


> Nvidia would never let AMD run PhysX without some sort of forced slowdown, because AMD cards have a lot more compute power than Nvidia cards, so they would run PhysX better.
> GTX 680 ~ 3 TFLOPS
> HD 7970 GHz ~ 4 TFLOPS
> GTX Titan ~ 4.5 TFLOPS
> So if PhysX were equally well optimized for both AMD and Nvidia GPUs, then the AMD ones would run it better. The only Nvidia card that has more compute power than the 7970 GHz is the GTX Titan, which is too expensive to compete here.



The tiny amount of compute power PhysX needs to run would make the difference between the GPUs moot.  I mean, we're talking about a technology that can be maxed out on a GTX 250, which is only capable of ~450 GFLOPS.  The PhysX calculations are relatively easy for a GPU to compute; it's all the extra graphical crap it generates that has to be rendered on screen that really causes the performance issues.


----------



## buildzoid (Mar 7, 2013)

Yeah, because so far its implementations are limited to aesthetics. If it were used for highly detailed environment destruction (Red Faction style but much bigger, something like leveling a skyscraper or several) or something similar, then the requirements would start stacking up.


----------



## TheoneandonlyMrK (Mar 7, 2013)

buildzoid said:


> Yeah, because so far its implementations are limited to aesthetics. If it were used for highly detailed environment destruction (Red Faction style but much bigger, something like leveling a skyscraper or several) or something similar, then the requirements would start stacking up.



Bring exactly that.... someone please xD


----------



## Phusius (Mar 7, 2013)

Do we even want PhysX?  When I tried out a GTX 680, Arkham City generally ran fine, but when PhysX was enabled it still dipped into the 20s and 30s FPS at times... and that was a 680...

You pretty much have to have a separate card for PhysX to enable it on High in games like Arkham City... so I dunno.


----------



## Bunchies (Mar 8, 2013)

isn't the PS4 using AMD hardware? wait wtf why do i even care lol

just makes no sense


----------



## newtekie1 (Mar 8, 2013)

buildzoid said:


> Yeah, because so far its implementations are limited to aesthetics. If it were used for highly detailed environment destruction (Red Faction style but much bigger, something like leveling a skyscraper or several) or something similar, then the requirements would start stacking up.



Yeah, but you're still not even coming close to approaching the 3 TFLOPS the GTX 680 is capable of pumping out, and in that scenario the amount of crap on screen that would need to be rendered would bring any graphics setup to its knees, so the PhysX calculations would still be a minor part.




Phusius said:


> Do we even want PhysX?  When I tried out a GTX 680, Arkham City generally ran fine, but when PhysX was enabled it still dipped into the 20s and 30s FPS at times... and that was a 680...
> 
> You pretty much have to have a separate card for PhysX to enable it on High in games like Arkham City... so I dunno.




I'd love to see both major next-gen consoles supporting it; perhaps then we'd see developers actually start to use it for more than just useless smoke effects.

But in the end we're only going to see CPU-driven PhysX, so meh...

And the performance issues, at least on my GTX 670, were graphical, not actually caused by PhysX.  There is just so much extra shit on the screen that has to be rendered that it bogs down the card.  Dropping a dedicated GTX 470 GPU into my system for PhysX didn't help.


----------



## Fiendish (Mar 8, 2013)

It seems some people need to go back and learn the actual history behind PhysX; it might help clarify things. http://physxinfo.com/wiki/Main_Page

It should also be remembered that when Nvidia created CUDA, their competitor ATI had already created and was pushing their own proprietary standard, Close To Metal.


----------



## [H]@RD5TUFF (Mar 8, 2013)

Doesn't matter, the PS4 will still blow fat donkey dicks.


----------



## NeoXF (Mar 8, 2013)

Holy crap, I find it amazing that you people still refuse to call the PS4 hardware what it is... an *APU*... meaning whatever the CPU does, the GPU can muscle it up and accelerate it. There is no "CPU and GPU", there is "APU".

There is no spoon


Edit: God this thread suuuuuuuuuuuuucks...


----------



## oNyX (Mar 8, 2013)

You mean to tell me that the Asus ARES II cannot render Mafia 2's attic dust and pre-rendered rocks that fall on the floor when you shoot a wall, just because it doesn't support PhysX? 

I have two Nvidia cards and not once have I really needed PhysX. I still prefer CPU-based physics and/or engines like Havok. When you're high on Monster energy drinks at a LAN or playing online with your console, nobody cares about PhysX. Unless some kid high on Rockstar energy drinks happens to mention it. You'll probably get a few guys high on SCORE energy drinks ignorantly screaming "they chose Nvidia because it has PhysX" or "I chose Xbox 360 because it's got GOW3." Well, congrats to you, buddy. 

By the way, TressFX isn't physics like Nvidia's PhysX. It's just real-time hair and foliage rendering; the rest could be Havok-powered. More and more games are falling under the AMD banner, including some that used to be Nvidia's, so it's clear that Nvidia is turning its attention away from the gaming industry and more towards those floor tiles known as tablets.

I've been an Nvidia user, but not for long. My next upgrade WILL be AMD Radeon. I have reasons other than going PhysX-free, fanboyism or ignorance for choosing AMD.


----------



## Rebel333 (Mar 8, 2013)

What does Nvidia want with PhysX on the PS4? If I recall correctly, there isn't a single piece of Nvidia hardware in it.


----------



## arnoo1 (Mar 8, 2013)

Wtf physx on amd hardware

I don't wanna live on this planet anymore xd


----------



## Prima.Vera (Mar 8, 2013)

arnoo1 said:


> Wtf physx on amd hardware
> 
> I don't wanna live on this planet anymore xd



On the contrary. Life is getting more and more interesting!


----------



## TheLaughingMan (Mar 8, 2013)

NeoXF said:


> Holy crap, I find it amazing that you people still refuse to call the PS4 hardware what it is... an *APU*... meaning whatever the CPU does, the GPU can muscle it up and accelerate it. There is no "CPU and GPU", there is "APU".
> 
> There is no spoon
> 
> Edit: God this thread suuuuuuuuuuuuucks...



If your only comment is that the thread sucks, then why are you reading it? And that is not even remotely close to how GPU acceleration works. While AMD does want to move toward a heterogeneous processing system, we are far, far away from that being a possibility. So while a single chip houses a GPU and CPU, they are still separate in function.



Fiendish said:


> It seems some people need to go back and learn the actual history behind PhysX; it might help clarify things. http://physxinfo.com/wiki/Main_Page
> 
> It should also be remembered that when Nvidia created CUDA, their competitor ATI had already created and was pushing their own proprietary standard, Close To Metal.



They were working on their own proprietary standard for physics calculations, a project that was never completed, as AMD decided instead to support work on a universal close-to-the-metal physics system through open standards such as OpenCL. A noble move that to this day still has not produced universal anything, which is why everyone is still using Havok and PhysX where needed.



Bunchies said:


> isn't the PS4 using AMD hardware? wait wtf why do i even care lol
> 
> just makes no sense



You care because this is now an x86-64 based system like your computer, using AMD-designed hardware and DirectX 11+. This means what runs on the PS4 runs on the PC with minor tweaks to expand CPU/GPU hardware support. This will make porting games between console and PC far easier for developers, leaving them no reason not to do so, with brand exclusives being the exception to that rule.

You care because this will result in our PC games (which are mostly ported from console) looking better, being less buggy, and giving us more titles.


----------



## newtekie1 (Mar 8, 2013)

NeoXF said:


> Holy crap, I find it amazing that you people still refuse to call the PS4 hardware what it is... an *APU*... meaning whatever the CPU does, the GPU can muscle it up and accelerate it. There is no "CPU and GPU", there is "APU".
> 
> There is no spoon
> 
> ...



I find it amazing that people have no clue how an APU works.  Just because they put the GPU on the same die as the CPU doesn't mean that the GPU can magically start doing the same work as a CPU.  They are still two very different pieces of hardware that operate in two very different ways, even if they are on the same piece of silicon.


----------



## TheoneandonlyMrK (Mar 8, 2013)

newtekie1 said:


> I find it amazing that people have no clue how an APU works.  Just because they put the GPU on the same die as the CPU doesn't mean that the GPU can magically start doing the same work as a CPU.  They are still two very different pieces of hardware that operate in two very different ways, even if they are on the same piece of silicon.


Whilst I appreciate what you're saying, and agree in principle when applied to PC APUs, I don't think it's a sound statement when this is a next-gen APU with a universal IMC and memory. You can't say with certainty how Sony and devs will use it; this could well be the true beginning of the HSA revolution. After all, Sony isn't tied to using the hardware the same way PCs, or PC OSes and APIs, do. It's a bit early to say how much the GPU might generally be used. You might note that Sony is an HSA contributor, and there is still a founder spot free... or maybe it isn't.


----------



## tokyoduong (Mar 8, 2013)

theoneandonlymrk said:


> Whilst I appreciate what you're saying, and agree in principle when applied to PC APUs, I don't think it's a sound statement when this is a next-gen APU with a universal IMC and memory. You can't say with certainty how Sony and devs will use it; this could well be the true beginning of the HSA revolution. After all, Sony isn't tied to using the hardware the same way PCs, or PC OSes and APIs, do. It's a bit early to say how much the GPU might generally be used. You might note that Sony is an HSA contributor, and there is still a founder spot free... or maybe it isn't.



GPU is always better at massively parallel work, and CPU is better at low-latency, highly random, out-of-order instructions. You cannot change the physical design of a chip with software layers. Sony can dramatically optimize their code and compilers for this one specific design, but you will see that it will work the same way as it currently works on the PC. The only definite conclusion we can draw right now is that the console version will be much more efficient, since it's a fixed spec.


----------



## TheoneandonlyMrK (Mar 8, 2013)

tokyoduong said:


> GPU is always better at massively parallel work, and CPU is better at low-latency, highly random, out-of-order instructions. You cannot change the physical design of a chip with software layers. Sony can dramatically optimize their code and compilers for this one specific design, but you will see that it will work the same way as it currently works on the PC. The only definite conclusion we can draw right now is that the console version will be much more efficient, since it's a fixed spec.



HSA is all about the integrated use of what's there; the key is unified memory, and the whole point is to use what's best at the job, which means exactly that and not moving all the work to the GPU, just what it's good at, easily possible with what they have.
Oh, and this one specific design is not so dissimilar from the phone you might own soon; some of the biggest players in mobile chips are into HSA as much as Sony and AMD. Don't be blind to what agwan


----------



## TheHunter (Mar 8, 2013)

Why are people still confused by this announcement?
*It is only for the PS4 CPU SDK.*


----------



## TheoneandonlyMrK (Mar 8, 2013)

TheHunter said:


> Why are people still confused by this announcement?
> *It is only for the PS4 CPU SDK.*


No, it's PR spin and "remember-us"-manship.


----------



## newtekie1 (Mar 8, 2013)

theoneandonlymrk said:


> Whilst I appreciate what you're saying, and agree in principle when applied to PC APUs, I don't think it's a sound statement when this is a next-gen APU with a universal IMC and memory. You can't say with certainty how Sony and devs will use it; this could well be the true beginning of the HSA revolution. After all, Sony isn't tied to using the hardware the same way PCs, or PC OSes and APIs, do. It's a bit early to say how much the GPU might generally be used. You might note that Sony is an HSA contributor, and there is still a founder spot free... or maybe it isn't.



Next-gen APU with universal IMC and memory?  That is how the current APUs work.  There is one IMC that controls the system memory, and the GPU and CPU access what they need.  The only thing new here is that the IMC is a GDDR5 controller instead of DDR3.

There is nothing revolutionary here that will suddenly allow a GPU to do work coded for a CPU.  When the CPU load gets too high, the GPU can't just step in and help out like NeoXF is claiming.


----------



## Slizzo (Mar 8, 2013)

People need to remember that PhysX is a software AND hardware solution. When running on an AMD system, the game is still using a PhysX physics engine. Much like any game that uses Havok, it's a software physics engine.


----------



## Prima.Vera (Mar 9, 2013)

Slizzo said:


> People need to remember that PhysX is a software AND hardware solution. When running on an AMD system, the game is still using a PhysX physics engine. Much like any game that uses Havok, it's a software physics engine.



While I agree with your statement, let's not forget that if you run PhysX in software mode, you have to be prepared for a serious FPS drop. Sure, you can run it without any hardware acceleration, but something tells me there's more to it than that with the PS4.


----------



## Fluffmeister (Mar 9, 2013)

Slizzo said:


> People need to remember that PhysX is a software AND hardware solution. When running on an AMD system, the game is still using a PhysX physics engine. Much like any game that uses Havok, it's a software physics engine.



Exactly, at least someone gets it.


----------



## newtekie1 (Mar 9, 2013)

Prima.Vera said:


> While I agree with your statement, let's not forget that if you run PhysX in software mode, you have to be prepared for a serious FPS drop. Sure, you can run it without any hardware acceleration, but something tells me there's more to it than that with the PS4.



No.  You are still getting things confused.  PhysX offers both a software and a hardware solution to developers.  The software solution offers a lower level of effects; it basically brings PhysX down to what Havok can do.  When developers use the software-only version, it runs on the CPU just fine.  All games can use this mode, even ones that have the hardware-accelerated solution: when a game coded with the hardware solution in mind doesn't detect an nVidia PhysX-capable GPU, it drops down to software mode.  That is why hardware-accelerated PhysX games are still completely playable on AMD hardware; the game just uses software mode.  There is no FPS impact here.  There are about 390 games that use the PhysX SDK, and most of them use the software version only.

What you are talking about is the hardware solution being forced to run on the CPU, which nVidia gives you the option to do in their control panel; in that case performance is crap.  But that isn't what software PhysX is.
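The fallback being described can be sketched as a simple selection at startup; a minimal hypothetical illustration of the logic, not the real PhysX SDK API (the function and field names here are made up):

```python
# Hypothetical sketch of the runtime fallback described above: a game
# built with hardware-accelerated effects in mind checks for a
# CUDA-capable NVIDIA GPU at startup, and otherwise falls back to the
# baseline software (CPU) effect set that every platform can run.
# None of these names come from the real PhysX SDK.

def choose_physics_backend(gpus):
    """gpus: list of (vendor, cuda_capable) pairs describing installed GPUs."""
    for vendor, cuda_capable in gpus:
        if vendor == "NVIDIA" and cuda_capable:
            # Extended effect set: particles, cloth, debris simulated on the GPU.
            return {"backend": "gpu", "effects": "extended"}
    # No eligible GPU found: same engine, reduced effect set, run on the CPU.
    return {"backend": "cpu", "effects": "baseline"}

# An AMD-only system silently gets the software path, with no FPS penalty
# relative to a game that only ever shipped the software solution.
print(choose_physics_backend([("AMD", False)]))  # prints {'backend': 'cpu', 'effects': 'baseline'}
```

The point of the sketch is that the selection happens per machine at runtime, which is why the same title is playable on AMD hardware with the reduced effect set.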


----------



## TheoneandonlyMrK (Mar 9, 2013)

newtekie1 said:


> Next-gen APU with universal IMC and memory?  That is how the current APUs work.  There is one IMC that controls the system memory, and the GPU and CPU access what they need.  The only thing new here is that the IMC is a GDDR5 controller instead of DDR3.
> 
> There is nothing revolutionary here that will suddenly allow a GPU to do work coded for a CPU.  When the CPU load gets too high, the GPU can't just step in and help out like NeoXF is claiming.



A CPU does what it does, a GPU can do what it does; this we agree on, and it needs no further mention.
I'm saying the OS and software that Sony chooses to build/use might well use the GPU (where appropriate) more than is presently done. How the heck can you say that's wrong? Do you work for Sony? If not, you can't know.
And the revolution is via software and new instruction sets and languages, made possible by slight but important hardware changes that allow them. Again, something that's too new to dismiss.
I don't need a lesson on PhysX or its use, as I've investigated it fully despite running crossfired AMD graphics, just as I've also looked into HSA etc.
Many chips are made with circuits, functions and whole instruction sets in them that are fused off; they do this to allow multiple end uses and to pre-test yield effects from new tech. Who's to say what's in there? Sony still hasn't disclosed every spec, IMHO.


----------



## Bjorn_Of_Iceland (Mar 9, 2013)

cadaveca said:


> The funny thing about that response, to me, is that Sony highlighted porting physics calculation off of the CPU and onto the GPU in its premiere (1 million physics objects, or something). What Nvidia is offering, seemingly, is the exact opposite.
> 
> Puzzling.
> 
> I smell a $3-$5 per-copy licensing fee for devs that use it.


Or, 2 cents for every in game pebble.



Phusius said:


> Do we even want PhysX?  When I tried out a GTX 680, Arkham City generally ran fine, but when PhysX was enabled it still dipped into the 20s and 30s FPS at times... and that was a 680...
> 
> You pretty much have to have a separate card for PhysX to enable it on High in games like Arkham City... so I dunno.


TressFX hogs the frame rate as well, *even* on an AMD GPU... not to mention that the performance penalty is very big given that it is only Lara's hair being rendered. I'd say we are pretty much stuck with those numbers in terms of frame-rate degradation for now (well... if they wanted to have persistent pebbles and floaty hair). Well, at least nVidia gives you the option to have a dedicated card to offload the computing... for PhysX, that is.


----------



## newtekie1 (Mar 9, 2013)

theoneandonlymrk said:


> A cpu does what it does , a gpu can do what it does this we agree and needs no further mention.
> Im saying the os,  and software that sony chooses to build/use might well use the gpu (where appropriate) more than is presently done  .  ....... how the heck can you say that's wrong do you work for sony if not you cant know.
> And the revolution is via software and new instruction sets and languages made possible with slight but important hardware changes that allow them. Again something that's too new to dissmiss.
> I don't need a lesson on physx or its use as ive investigated it fully despite xfire amd gfx as I have also looked into hsa etc.
> Any and most chips that are made have circuits functions and whole instruction sets in them for use but that are fused off , they do this to allow multiple end uses and to pre test yield effects from new tech whos to say whats in there sony still haven't disclosed every spec imho.



Ok, I'll say it again: the GPU can't just magically do work designed for a CPU.  That is what NeoXF is claiming.  

Yes, Sony could program the OS to be more GPU hardware accelerated.  And the game developers could leverage the GPU to do more work than just graphics rendering (but that would be stupid).  However, the software would have to be designed to use the GPU architecture, and once it is designed to use the GPU architecture it likely wouldn't be able to use the CPU anymore.  It is extremely difficult to design software that can use both an x86 architecture and a GPU architecture to do the same work at the same time.  It might even be impossible; I've certainly never seen it.

And I'm not lecturing you on PhysX; I was talking to someone else.  For someone who claims to know so much and be so smart, I'd think you'd be able to grasp the concept that when I quote someone and then put a response under it, I'm talking to that person and not you.  Every response in the thread isn't a response to you.  I hate to burst your bubble, but you aren't the center of the universe.


----------



## nt300 (Mar 10, 2013)

Good luck NV, it ain't happening. AMD has its own solutions with its own partners.


Bunchies said:


> *isn't the PS4 using AMD hardware?* wait wtf why do i even care lol
> 
> just makes no sense


Because AMD will have control and a strong say in gaming from now on; they run all the new consoles now.


----------



## Slizzo (Mar 10, 2013)

nt300 said:


> Good luck NV, it ain't happening. AMD has its own solutions with its own partners.
> 
> Because AMD will have control and a strong say in gaming from now on; they run all the new consoles now.



Developers still have the choice of which physics engine they want to use. I can see many using PhysX for PS4 titles if they are developing PC titles in conjunction with them. nVidia still has a leg in this game (though of course not a large one, unless they allow the PS4's PhysX implementation to use AMD hardware for acceleration).


----------



## Xzibit (Mar 15, 2013)

Gamespot article.
PS4 not worth the cost, says Nvidia



> Chip-maker Nvidia didn't want to work with Sony "at the price those guys were willing to pay."



Looks like Titan wasn't the only thing overpriced.


----------

