# AMD TressFX Technology Detailed



## btarunr (Feb 26, 2013)

AMD has unveiled the TressFX technology it teased earlier this week. The technology, as predicted, works to create realistic hair rendering and physics, but we imagine it could be applied to foliage and, hopefully, furry donuts as well. It will first be implemented in the 2013 reboot of the Tomb Raider franchise, in which Lara Croft has finally parted with her braid. TressFX helps accurately render Croft's hair, drawing finer locks than the pre-rendered hair textures plastered onto large hair polygons that tend to look unnatural. The free and fluid nature of these locks can then be used to accurately draw the effects of wind and water on the hair. Below are a few before-and-after instances of TressFX.

Technically, TressFX is a toolset co-developed by AMD and Crystal Dynamics, which taps into DirectCompute to unlock the number-crunching prowess of the GPU (specifically Graphics Core Next ones) to render individual strands of hair. It is built on the foundation laid by AMD's work on Order Independent Transparency (OIT), and uses Per-Pixel Linked-List (PPLL) data structures to manage rendering complexity and memory usage. DirectCompute is additionally used to process the physics of these strands of hair, which are affected by the character's motion and by elements such as wind and water/rain. TressFX will be implemented at least in the PC version of the upcoming Tomb Raider.
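To give a rough idea of how a per-pixel linked list handles order-independent transparency, here is a minimal CPU-side sketch in Python: each transparent fragment is pushed onto its pixel's list (on the GPU this is done with atomic operations into a node buffer), and at resolve time the list is sorted by depth and alpha-blended back-to-front. The class and method names here are purely illustrative, not part of AMD's actual implementation.

```python
# Toy CPU model of the Per-Pixel Linked-List (PPLL) idea behind OIT:
# fragments are appended unordered, then sorted and blended per pixel.

class PPLLBuffer:
    def __init__(self, width, height):
        self.heads = [[-1] * width for _ in range(height)]  # head node index per pixel
        self.nodes = []  # flat node pool: (color, alpha, depth, next_index)

    def append_fragment(self, x, y, color, alpha, depth):
        # Push a node onto the pixel's list (akin to an atomic exchange on GPU).
        self.nodes.append((color, alpha, depth, self.heads[y][x]))
        self.heads[y][x] = len(self.nodes) - 1

    def resolve(self, x, y, background):
        # Walk the list, sort fragments far-to-near, then alpha-blend over background.
        frags, idx = [], self.heads[y][x]
        while idx != -1:
            color, alpha, depth, idx = self.nodes[idx]
            frags.append((depth, color, alpha))
        frags.sort(reverse=True)  # farthest fragment first
        result = background
        for depth, color, alpha in frags:
            result = alpha * color + (1.0 - alpha) * result
        return result
```

For example, a half-transparent white fragment in front of a half-transparent black one over a white background resolves correctly no matter which order they were appended in.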



 



*View at TechPowerUp Main Site*


----------



## theJesus (Feb 26, 2013)

So is this something that will only work on AMD cards?


----------



## okidna (Feb 26, 2013)

btarunr said:


> It will be first implemented in *the 2013 reboot title of the Tomb Raider franchise, in which Lara Croft finally parted with her braid*.



She had already parted with her braid in "Tomb Raider: Legend", was back with the braid in "Tomb Raider: Anniversary", and was without it again in "Tomb Raider: Underworld" and "Lara Croft and the Guardian of Light".

Call me a braid maniac 

On topic, it looks nice, and as usual I'm hoping for a low performance hit when this hair-thingy is enabled 



theJesus said:


> So is this something that will only work on AMD cards?



AMD didn't say anything about compatibility, but here's a hint:



> DirectCompute is additionally utilized to perform the real-time physics simulations for TressFX Hair. This physics system treats each strand of hair as a chain with dozens of links, permitting for forces like gravity, wind and movement of the head to move and curl Lara’s hair in a realistic fashion. Further, collision detection is performed to ensure that strands do not pass through one another, or other solid surfaces such as Lara’s head, clothing and body. Finally, hair styles are simulated by gradually pulling the strands back towards their original shape after they have moved in response to an external force.
> 
> *Graphics cards featuring the Graphics Core Next architecture, like select AMD Radeon™ HD 7000 Series, are particularly well-equipped to handle these types of tasks, with their combination of fast on-chip shared memory and massive processing throughput on the order of trillions of operations per second.*



http://blogs.amd.com/play/tressfx/
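The "chain with dozens of links" model in that quote is easy to picture with a toy sketch. Here's a minimal 2D Python version (purely illustrative; the function name, constants, and setup are my own guesses, not the actual TressFX solver): Verlet integration moves the links under gravity, a stiffness term pulls them back toward the rest pose (the "hair style"), and a few relaxation passes keep the link lengths fixed.

```python
# Toy strand simulation: Verlet integration + shape restoration + distance
# constraints, as described in AMD's blog post. Illustrative only.

def step_strand(pos, prev_pos, rest_pos, link_len, gravity=-9.8, dt=1/60,
                shape_stiffness=0.05, iterations=4):
    n = len(pos)
    # Verlet integration with gravity acting on the y-coordinate.
    new_pos = []
    for i in range(n):
        x, y = pos[i]
        px, py = prev_pos[i]
        new_pos.append([2*x - px, 2*y - py + gravity*dt*dt])
    new_pos[0] = list(pos[0])  # root link is pinned to the scalp

    # Gradually pull links back toward the rest pose (hair-style memory).
    for i in range(1, n):
        new_pos[i][0] += shape_stiffness * (rest_pos[i][0] - new_pos[i][0])
        new_pos[i][1] += shape_stiffness * (rest_pos[i][1] - new_pos[i][1])

    # Enforce fixed link lengths with a few constraint-relaxation passes.
    for _ in range(iterations):
        for i in range(1, n):
            dx = new_pos[i][0] - new_pos[i-1][0]
            dy = new_pos[i][1] - new_pos[i-1][1]
            dist = (dx*dx + dy*dy) ** 0.5 or 1e-9
            corr = (dist - link_len) / dist
            if i == 1:
                # Previous link is the pinned root; move only this link.
                new_pos[i][0] -= corr * dx
                new_pos[i][1] -= corr * dy
            else:
                # Split the correction between the two links.
                new_pos[i-1][0] += 0.5 * corr * dx
                new_pos[i-1][1] += 0.5 * corr * dy
                new_pos[i][0] -= 0.5 * corr * dx
                new_pos[i][1] -= 0.5 * corr * dy
    return [tuple(p) for p in new_pos], pos  # new positions, new "previous" frame
```

Run per frame per strand; on the GPU, thousands of these run in parallel, which is why DirectCompute throughput matters so much here.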


----------



## AphexDreamer (Feb 26, 2013)

Can this be used to help Crysis 3's Ropy Physics? Maybe make it more hairy?


----------



## Ferrum Master (Feb 26, 2013)

AMD has a crush on Lara... no doubt...

They should make a flurry cat demo


----------



## theubersmurf (Feb 26, 2013)

TMM's hairy doughnut exposed? Yikes.


----------



## dj-electric (Feb 26, 2013)

If this includes realistic boob movement i will sell my two 680s and get two 7970s


----------



## Ferrum Master (Feb 26, 2013)

Dj-ElectriC said:


> If this includes realistic boob movement i will sell my two 680s and get two 7970s



Btw it made me think that old Lara had BIGGER boobs - damn it...

A PATCH, we demand a game PATCH


----------



## AsRock (Feb 26, 2013)

Dj-ElectriC said:


> If this includes realistic boob movement i will sell my two 680s and get two 7970s




I was thinking that when the first news hit TPU.   

BoobFX for the win


----------



## LAN_deRf_HA (Feb 26, 2013)

I was very disappointed to see her boobs are inanimate in the gameplay I've seen. And I don't even mean that in a har har way it just feels like we're way past time for having more fluid bodies in video games. Enough of this everything is moving rocks stuff.

Also this http://www.ouidad.com/TressFX-Curl-Styling-Gel


----------



## TRWOV (Feb 26, 2013)

theJesus said:


> So is this something that will only work on AMD cards?



As it is based on DirectCompute it should work on every card that supports it, but it will undoubtedly run better on HD 7000 cards due to their tremendous DC performance compared to other cards.


----------



## alienstorexxx (Feb 26, 2013)

TRWOV said:


> As it is based on DirectCompute it should work in every card that supports it but will run undoubtedly better on HD7000 cards due to their tremendous DC performance compared to other cards.



i agree


----------



## renz496 (Feb 26, 2013)

I wonder if this will give AMD cards a definite advantage in this game. Just a few days ago they were claiming Crysis 3 was very optimized for their cards, but any performance review I can find shows that the GTX 680 is pretty much equal to the 7970 GHz Edition in terms of performance.


----------



## RejZoR (Feb 26, 2013)

All this fuss ONLY about hair? Are they fuckin shitting me? Not even a full fledged physics engine. Hair. Let me say it again. HAIR. Dafuck?


----------



## badtaylorx (Feb 26, 2013)

RejZoR said:


> All this fuss ONLY about hair? Are they fuckin shitting me? Not even a full fledged physics engine. Hair. Let me say it again. HAIR. Dafuck?




this is worth the effort... bad hair is just another immersion hindrance.  the realer the better imo. 

wonder how long it will take em to get a woman's "wet hair" look down???

try that with chains!!! lol


----------



## RejZoR (Feb 26, 2013)

So hair is more important than a fully destructible world with physical environmental weather effects? I am amazed how they can waste so much potential on something as unimportant as hair.
Sure, it's nice if it's realistic, but c'mon!? Lara had physics-affected hair in Tomb Raider back in 1998.
It was basic and all, but they did it on the shit CPUs of that time and it looked pretty good, Lara's hair tail swinging around...


----------



## SaltyFish (Feb 26, 2013)

RejZoR said:


> So hair is more important than a fully destructible world with physical environmental weather effects. I am amazed how can they waste so much potential on something as unimportant as hair.
> Sure it's nice if it's realistic but c'mon!? Lara had physics affected hair in Tomb Raider, back in 1998 ?
> It was basic and all but they did it on shit CPU's from that time and it sort of looked pretty good, Lara's hair tail swinging around...



Rendering good flow in CGI has always been difficult. Even today, take a look at characters with loose hair past their necks in a CGI cartoon or a video game. They don't do realistic flow, if any. That's why you often see ponytails and braids and really short hair in such works.

Personally, I think the hair flow is a bigger achievement than an engine with fully destructible environments and/or environmental effects. Fully destructible environments are limited by game developers not wanting to deal with the possibilities. How deep should a hole in the ground be allowed to go? Should the player be allowed to blow down a wall to enter any building? You can see how that's less of a hardware limitation (even though it would be memory-intensive). As for environmental effects, you can easily fake wet telephone poles with texture changes when it starts raining. Yeah, hair flow is not as "felt" on the gameplay side. But it's still revolutionary when it comes to immersion, and it's something future developers can easily use without affecting the more mechanical parts of their games.

If it makes you feel better, the hair flow effects can theoretically be applied to other things such as tall grass and animal fur.

Also, anyone have a video of TressFX in action rather than static screenshots?


----------



## GSquadron (Feb 26, 2013)

This is exceptionally good, since it will be implemented on Tomb Raider for the first time! 

http://blogs.amd.com/play/tressfx/

Looks like it's only between Crystal Dynamics and AMD


----------



## omnimodis78 (Feb 26, 2013)

Isn't this exactly the same thing as Dawn's hair (nVidia's "A New Dawn" demo)? It looks amazing. Here's nVidia's description of that aspect of the demo (pay special attention to the very last sentence...)

"Another area of dramatic improvement is Dawn’s hair. The original Dawn had individual hair strands, but they were few and far between. A mere 1,700 adorned her head and the shader only modeled for specular reflections. The original Dawn also used a rock hard hairspray to ensure her hair never budged a millimeter; all the GPU's horsepower was directed at rendering her character as realistically as possible. New Dawn’s hair is a giant leap forward. Thanks to DirectX 11 tessellation, she has gone from a scant 1,700 strands to 40,000 soft locks of hair. Advanced shading allows her beautiful hair to move out of the jet-black color scheme. While still a brunette, you’ll see her hair gently flowing in the wind, reflecting and transmitting light from the environment.

Because hair is so thin, aliasing is a major problem. Traditional antialiasing doesn't work well here, as a strand is often smaller than a pixel and may not be picked up by any of the four-or-so sample points. To alleviate this problem, A New Dawn has a special hair smoothing shader that inspects each strand and blurs them in the combing direction. The final result looks soft and silky, as if she just jumped out of the shower after an extensive conditioner routine."

source: http://www.geforce.com/games-applications/pc-games/a-new-dawn/description
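The "blurs them in the combing direction" trick from that last paragraph is essentially a directional filter. Here's a toy Python sketch of the idea (illustrative only; the function name and parameters are mine, not Nvidia's shader code): each pixel is averaged with a few samples taken along a given direction, which softens sub-pixel strands that ordinary MSAA sample points would miss.

```python
# Toy directional blur: average each pixel with `taps` samples stepped
# along `direction` (dx, dy), clamping at the image borders.

def directional_blur(img, direction, taps=3):
    h, w = len(img), len(img[0])
    dx, dy = direction
    out = [[0.0] * w for _ in range(h)]
    offsets = range(-(taps // 2), taps // 2 + 1)
    for y in range(h):
        for x in range(w):
            acc = n = 0
            for t in offsets:
                sx, sy = x + t * dx, y + t * dy
                if 0 <= sx < w and 0 <= sy < h:  # skip out-of-bounds samples
                    acc += img[sy][sx]
                    n += 1
            out[y][x] = acc / n
    return out
```

A real hair shader would take the per-strand tangent as the blur direction per pixel; here a single direction is used for the whole image to keep the sketch short.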


----------



## Solidstate89 (Feb 26, 2013)

It uses DirectCompute as its underlying API. Any DX11 card should be able to utilize this, and that includes nVidia's cards. It doesn't use a proprietary implementation like nVidia's PhysX hair demos that they put out years ago.


----------



## Xzibit (Feb 26, 2013)

Nvidia's looks more like FiberFX from LightWave. Not to mention they never brought it into a game in all those years of their "THE WAY IT'S MEANT TO BE PLAYED" campaign.

Got to at least hand it to AMD for bringing it to a game less than a year into Gaming Evolved. The next step is to implement it in other games and improve on it with time.


----------



## okidna (Feb 26, 2013)

From Bit-Tech :



> As we suspected, AMD's press release has been very carefully worded. *'TressFX is not exclusive to AMD,'* a spokesperson for the company has told us. 'It works on any DirectX11 card, similar to some other AMD-built technologies - for example Order-Independent Transparency (OIT) or High Definition Ambient Occlusion (HDAO).' Thus is the truth revealed: any DirectX11-capable graphics hardware, including those from rival Nvidia, will be able to make use of AMD's hair-rendering know-how.
> 
> Devon Nekechuck, product manager for high-end discrete desktop graphics at AMD, offers a bit more detail - and a sneaky plug for his company's GCN-based Radeon HD products: 'TressFX will definitely work on any DirectCompute-enabled device. This has roots in the core of Gaming Evolved, where we want to enable technology for all gamers, and not create proprietary features that lock out gamers that use our competitor's products. That said, TressFX is very computationally intensive, and hence games that use TressFX will really be able to benefit from high DirectCompute performance. Because of that, you will see Graphics Core Next-based GPUs excel when it's enabled.'



Source : http://www.bit-tech.net/news/hardware/2013/02/26/amd-tressfx/1


----------



## natr0n (Feb 26, 2013)




----------



## Fluffmeister (Feb 26, 2013)

The improvements when enabled are really rather stunning:


----------



## RejZoR (Feb 26, 2013)

So they stuck some hair on a baldy who was designed to be bald to begin with. What's next, making Lara hairy where she wasn't intended to be? Pathetic, like they are competing with NVIDIA over who will make more retarded post-processing physics effects and lock them down to one platform.
F**k that.

It looks pathetic as well. This guy looks like those eggs where you stuff them with cotton and plant some wheat seeds in them so they grow hair. An egg with a wheat comb-over. Nice...
The guy looks better with a bald head. Period.

If anyone bothered to work with some art, ANYONE could make some rather realistic hair using the CPU alone. You wouldn't have billions of hair strands, but if anyone would even bother to make clusters of hair that move based on head movement, freakin' CPU Havok could do that. But instead, no one even bothered. Instead pretty much all games used 100% static hair.
So why 100% static or 100% super-duper HW accelerated? Like no one knows how to make something in the middle. They always have to go to one extreme or the other...


----------



## okidna (Feb 26, 2013)

RejZoR said:


> *So they sticked some hair on a baldy who was designed to be bald to begin with.* Whats next, making Lara hairy where she wasn't intended to be? Pathetic, like they are competing with NVIDIA who will make more retarded post processing physics effects and lock them down to one platform.
> F**k that.



Umm.. I think "the bald guy" was supposed to be a joke? 

Anyway, check this out : http://forums.eidosgames.com/showpost.php?p=1866348&postcount=256


----------



## Solidstate89 (Feb 26, 2013)

RejZoR said:


> I hate fun.



It's a joke image, calm the hell down.


----------



## Scrizz (Feb 26, 2013)

RejZoR said:


> So they sticked some hair on a baldy ...... making Lara hairy ....
> F**k that.
> 
> .....looks like those eggs...freakin CPU Havok... fuckin....



wow dude, chillax xD


----------



## badtaylorx (Feb 26, 2013)

RejZoR said:


> So they sticked some hair on a baldy who was designed to be bald to begin with. Whats next, making Lara hairy where she wasn't intended to be? Pathetic, like they are competing with NVIDIA who will make more retarded post processing physics effects and lock them down to one platform.
> F**k that.
> 
> It looks patehtic as well. This guy looks like those eggs where you stuff them with cotton and plant some wheat seeds in them so they grow hair. An egg with wheat come over. Nice...
> ...



is your trolling addiction getting out of control??

call 1-800-ST0P-DA-H8 to reach our trollers anonymous hotline

we're all here for ya' bro


----------



## MilkyWay (Feb 26, 2013)

Is it just me or does her TressFX hair look better than most of the textures in those pics?
Me: "that's nice" and moves along.


----------



## TheoneandonlyMrK (Feb 26, 2013)

I do appreciate AMD's efforts and like the sound of this, but I hope their next PR blurb is more wide-reaching than this presently is. People are getting all uptight about one hot bot's hair in a single game... madness.


----------



## Shihab (Feb 26, 2013)

The best hair motion I've seen is usually in Japanese games' CGI cutscenes. 
Now this TressFX thingy looks great, but not something to celebrate independently. If it had come along as an addition to a better physics engine or fluid simulator, fine, but this is simply overkill for something so mundane. Still, love it.

I'd rather be seeing advancement in the textures department though. Heck, throw in some uber-high-res textures and use that DirectCompute to decompress/manipulate them like Civ V does, and use that memory bandwidth you're so proud of to carry them around. Make those games look real! "_Come on man. Lie to me, Jerry! *LIE TO ME*"_



RejZoR said:


> Whats next, making Lara hairy where she wasn't intended to be?



They wanted to, but Platinum Games beat them to the concept 



Spoiler

Sidenote: Don't be hatin' on baldies mate, we didn't choose to be so...


----------



## Prima.Vera (Feb 26, 2013)

I hope this will work on older-generation cards, not only the 7xxx series


----------



## Lionheart (Feb 26, 2013)

Fluffmeister said:


> The improvements when enabled are really rather stunning:
> http://images.gameru.net/image/direct/e70a931127.png



Lmfao you made me spit out my organic apple juice


----------



## Solidstate89 (Feb 26, 2013)

Prima.Vera said:


> I hope this will work on older generation cards, not only on 7xxxx



Anything that supports DirectX11 will support the necessary DirectCompute APIs.


----------



## Steevo (Feb 26, 2013)

RejZoR said:


> So hair is more important than a fully destructible world with physical environmental weather effects. I am amazed how can they waste so much potential on something as unimportant as hair.
> Sure it's nice if it's realistic but c'mon!? Lara had physics affected hair in Tomb Raider, back in 1998 ?
> It was basic and all but they did it on shit CPU's from that time and it sort of looked pretty good, Lara's hair tail swinging around...



http://www.nvidia.com/object/physx_knowledge_base.html


"Cooking"


Cooking by the book means the load times between levels, or the size of games, is due to the pre-cooking of many "real-time PhysX" effects that everyone is so crazy about in... all... 40 titles with hardware-accelerated effects, most of which only use a partial library implementation.


----------



## Crap Daddy (Feb 26, 2013)

So, it's a full war going on: some DirectCompute there, some hairy stuff here, all to make sure AMD cards look better than Nvidia's. Same old story as with TWIMTBP some time ago and PhysX. Who gives a shit about people who just want to play games; let's screw the competition, and you suckers, if you want to play OUR game then buy OUR card. Well, my next card will probably be a console. I'm done with this.


----------



## Calin Banc (Feb 26, 2013)

I don't see a problem as long as it runs on nVIDIA cards and so it does.


----------



## SIGSEGV (Feb 26, 2013)

If it were an exclusive feature for AMD cards only, I believe they'd be ranting and moaning even more. The sanitarium is gonna be full with these people.

The good thing is this tech can run on all DX11-capable GPUs 
Well done AMD


----------



## LAN_deRf_HA (Feb 26, 2013)

Glad actually to finally see a multi-platform gpu physics option. This is what DX11 was supposed to bring from the get go. Maybe having AMD in next gen consoles will help make that more standard and physx can finally die.


----------



## Crap Daddy (Feb 26, 2013)

Calin Banc said:


> I don't see a problem as long as it runs on nVIDIA cards and so it does.



Of course it will. But NV cards will take a performance hit. There are a few games out there which use the direct compute ability of GCN not because it's the only way to make the game look good but because they are Gaming Evolved titles.


----------



## Fluffmeister (Feb 26, 2013)

SIGSEGV said:


> if only it was exclusive feature only for AMD cards, i believe they'd go more ranting and moaning. Sanitarium gonna full with these people.
> 
> the good thing is this tech can be ran into all gpus which capable dx 11
> well done AMD



It doesn't really matter either way; if it runs decently enough on all cards without a major performance hit, then great. If on the other hand it tanks performance 20-30% or more only on GeForce cards, it will just get turned off. It's not like it affects the gameplay, after all.


----------



## Mussels (Feb 26, 2013)

whether people care about her improved hair or not, this tech CAN be used for other things - anything with lots of strands (think animal fur, carpet, or cloth in general)


this is a new graphical tech that can be used for far more than Lara Croft's armpit hair, so people need to stop hating on it just because it's not a new kind of antialiasing or whatever the hell floats their boats


----------



## Xzibit (Feb 26, 2013)

Crap Daddy said:


> Of course it will. But NV cards will take a performance hit. There are a few games out there which use the direct compute ability of GCN not because it's the only way to make the game look good but because they are Gaming Evolved titles.



You could say the same for "TWIMTBP" titles from 2003-2012. The difference is that this isn't proprietary. Nvidia went CUDA PhysX, stagnating the industry. Look what happened: even at the ORNL Cray Titan presentation it was pointed out that OpenMAA was just as fast as CUDA, so you didn't need to buy into Nvidia's proprietary scheme to make full use of Cray Titan.

If Nvidia/ATI at the time had used the DC API, it would have been maturing over the years and would be widely used by now. Nvidia was in the first Xbox and decided to go PhysX to get an upper hand on the competition, and I bet it didn't make Microsoft all too happy.

AMD consolidating all 3 consoles might be the best thing to happen to PC gaming and gaming in general, if it keeps on this track of willingness to support these features in a non-proprietary way.


----------



## Mussels (Feb 27, 2013)

Xzibit said:


> AMD consolidating all 3 consoles might be the best thing to happen for PC gaming and gaming if it keeps on this track of willing to support these features in a non-proprietary way.



ooooh i didnt realise, the modern consoles being all AMD will make this a standard console/game feature now.


----------



## Widjaja (Feb 27, 2013)

Appears to be just a small impression of what can be done.

I will be interested in how taxing this will be on the cards compared to standard hair.


----------



## SaltyFish (Feb 27, 2013)

LAN_deRf_HA said:


> Glad actually to finally see a multi-platform gpu physics option. This is what DX11 was supposed to bring from the get go. Maybe having AMD in next gen consoles will help make that more standard and physx can finally die.



There is OpenGL for multi-platform... but devs are too lazy to use it.


----------



## Fluffmeister (Feb 27, 2013)

Mussels said:


> ooooh i didnt realise, the modern consoles being all AMD will make this a standard console/game feature now.



Not really; multiple middleware packages will be available to game devs. Take this list for the PS4, for example:

http://www.scei.co.jp/ps4_tm/index.html

Assuming this tech will take off because the hardware is AMD exclusive is nonsense.


----------



## Mussels (Feb 27, 2013)

Fluffmeister said:


> Not really, multiple middleware's will be available to game devs, take this list for the PS4 for example:
> 
> http://www.scei.co.jp/ps4_tm/index.html
> 
> Assuming this tech will take off because the hardware is AMD exclusive is nonsense.



if this is the fastest one, and fully supported by the company they bought the hardware from, it's going to have a pretty big advantage.


----------



## Fluffmeister (Feb 27, 2013)

Mussels said:


> if this is the fastest one, and fully supported by the company they bought the hardware from, its going to have a pretty big advantage.



The first of many "if"s. It's purely up to the devs what licensed tech they use; again, just because AMD is supplying the underlying technology doesn't mean their own technology will benefit.


----------



## Xzibit (Feb 27, 2013)

Fluffmeister said:


> The first of many "if's", it's purely up to the devs what licensed tech they use, again just because AMD is supply the underlying technology it doesn't mean their own technology will benefit.



It's part of DirectX. There is this company called Microsoft. It makes OSes for PCs, phones, tablets, and has a gaming console. It helps not just AMD but Microsoft. It can cross-promote games with features that are further enhanced on Games for Windows, even if it's not embraced by the PS4 or Wii U.


----------



## Fluffmeister (Feb 27, 2013)

Xzibit said:


> Its part of DirectX.  There is this company called Microsoft.  Makes OS for PCs, Phones, Tablets and has a gaming console. It not just helps AMD but Microsoft. It can cross promote games with features that are further enhanced on Games for Windows.  Even if its not embrased by PS4 or Wii/U.



Equally I seriously doubt Sony gives a toss about DirectX and who embraces it. Again, there is a wide range of technologies licensed on both sides which developers are free to use, they don't have to be mutually exclusive.


----------



## Xzibit (Feb 27, 2013)

Fluffmeister said:


> Equally I seriously doubt Sony gives a toss about DirectX and who embraces it. Again, there is a wide range of technologies licensed on both sides which developers are free to use, they don't have to be mutually exclusive.



Sony might not, but developers will have an instantly bigger pool of potential buyers for their products, and they will want to one-up each other.

GameCube = ATi Flipper / PS2 = GS / X-Box = Nvidia NV2A

Wii = ATi Hollywood / PS3 = Nvidia SCEI RSX / X-Box 360 = ATi Xenos

Wii U / PS4 / X-Box 720 = AMD Radeon GPUs

Just in numbers alone it opens things up, and you're not hindering yourself as a developer in any way if you choose to go PC, unlike with a proprietary API which depends on the end user having a certain hardware type.

This might just be in the PC version as of now, but the potential for it getting adopted and used is much greater than before, and it has a higher chance given what we currently know of what's going to be in the next-gen consoles.


----------



## Bjorn_Of_Iceland (Feb 27, 2013)

Dj-ElectriC said:


> If this includes realistic boob movement i will sell my two 680s and get two 7970s


Nice. One for each boob, I reckon?

Anyways... didn't Alice: Madness Returns have similar hair effects?


----------



## seronx (Feb 27, 2013)

Bjorn_Of_Iceland said:


> Anyways.. didn't Alice Madness Returns had similar hair effects?


Alice: Madness Returns' hair is basically the same thing, but it isn't GPU-accelerated or as accurate as Tomb Raider's.


----------



## Nordic (Feb 27, 2013)

RejZoR said:


> So they sticked some hair on a baldy who was designed to be bald to begin with. Whats next, making Lara hairy where she wasn't intended to be? Pathetic, like they are competing with NVIDIA who will make more retarded post processing physics effects and lock them down to one platform.
> F**k that.
> 
> It looks patehtic as well. This guy looks like those eggs where you stuff them with cotton and plant some wheat seeds in them so they grow hair. An egg with wheat come over. Nice...
> ...



I am pretty sure the hair on the bald guy was not done by AMD's TressFX.

At least this is a step in the right direction. Of course more physics all around would be better, but isn't that the developers' fault for not doing it, not AMD's or Nvidia's?


----------



## TheGuruStud (Feb 27, 2013)

RejZoR said:


> So they sticked some hair on a baldy who was designed to be bald to begin with. Whats next, making Lara hairy where she wasn't intended to be? Pathetic, like they are competing with NVIDIA who will make more retarded post processing physics effects and lock them down to one platform.
> F**k that.
> 
> It looks patehtic as well. This guy looks like those eggs where you stuff them with cotton and plant some wheat seeds in them so they grow hair. An egg with wheat come over. Nice...
> ...



How dare you allege that physics can be done on the CPU, thereby defying Nvidia's foolproof (retarded) plan of forcing their standard to run only on their cards, even though it works perfectly on the CPU!


----------



## TRWOV (Feb 27, 2013)

Fluffmeister said:


> The improvements when enabled are really rather stunning:



Are they going to sell the bottled version too? I could use some.


----------



## Calin Banc (Feb 27, 2013)

Crap Daddy said:


> Of course it will. But NV cards will take a performance hit. There are a few games out there which use the direct compute ability of GCN not because it's the only way to make the game look good but because they are Gaming Evolved titles.



I'm not a programmer, so I don't really know if it is or isn't the only way, or the most efficient way, of accomplishing the goal, but the results are there. It's not an overly tessellated object with 10k polys instead of a few hundred or a thousand, it's not some invisible ocean underneath the map, and it is most definitely not AMD-locked. 

nVIDIA will most likely catch up with its future generation of hardware, just like AMD did with tessellation. After all, these are all part of DX11, so using them in games means using the features this API has, regardless of the capabilities of each side (hopefully as efficiently as possible). nVIDIA made a bet and perhaps it lost. The future will tell. At this point, if nVIDIA had made its 6xx series parts better at compute, this would not be such a big fuss. But then again, perhaps the 680 and the rest would not be as fast as the 7970 and their direct contestants. 

As far as I'm aware, the Dirt Showdown developers put that DC usage into their game long before everyone knew nVIDIA was weak in this department.


----------



## HumanSmoke (Feb 27, 2013)

TheGuruStud said:


> How dare you allege that physics can be done on the CPU, thereby, defying nvidias fullproof (retarded) plan of forcing their standard to run only on their cards, even though it works perfectly on cpu!


Cool story.
PhysX uses both the GPU and CPU... and pretty much always has... if it didn't, why (as an example) would the 3.0 SDK include multithreaded CPU support?

As for the whole she-done-me-wrong wailing, some of us remember the posturing from both sides:
June 2006: ATI start posturing against Ageia
September 2007: Intel cuts AMD off at the knees by acquiring Havok
November 2007: AMD look into acquiring Ageia
Three months later Nvidia acquires Ageia
June 2008: AMD bigs up Havok FX  (Number of Havok FX games produced : *0* )
Around the same time, Nvidia offered PhysX licence to AMD. AMD not interested and immediately begin a public thrashdown of PhysX via messrs Huddy, Cheng and Robison while promoting OpenCL Havok FX. Nvidia reply by locking AMD/ATi out of PhysX...AMD reply by telling world+dog that PhysX is irrelevant.

Personally, having watched the whole thing unfold, I'd say it's a case of handbags at ten paces: AMD dithered twice on entering the GPU physics market and lost out... twice, then came on strong with the "we didn't want it anyway" gambit, while Nvidia played hardball in response to trash talk. Neither company covers itself in glory, but strangely enough, a whole bunch of people forget that the bitch-slapping involved both camps.


Calin Banc said:


> As far as I'm aware, Dirt Showdown developers put that DC usage into their game long before everyone new nVIDIA was weak in this department.


From the timelines, this must have been the case. There is no way, short of an educated guess, that AMD would have expected Nvidia to pare down GK104 with respect to compute- especially given Nvidia's previous GPUs since G80. Having said that, I doubt that AMD wouldn't capitalize on the fact now that they are aware of it. From this PoV, Titan's entry into the market makes a convenient backstop (for Nvidia) should AMD look to code endless compute loops into future titles. It might have crippled the performance of pre-GK110 Nvidia (and VLIW4/5 AMD) at the expense of GCN, but Titan's appearance would probably convince AMD that massive compute additions might not be the PR boon they could have been.


----------



## Prima.Vera (Feb 27, 2013)

Lionheart said:


> Lmfao you made me spit out my organic apple juice



What's _*organic*_ apple juice? Can there be inorganic apple juice??


----------



## GLD (Feb 27, 2013)

Guys, when referring to Lara Croft, they are breasts, not boobs.


----------



## Mussels (Feb 27, 2013)

HumanSmoke said:


> Cool story.
> PhysX uses both GPU and CPU...and pretty much always has done....if it didn't (as example), why would the 3.0 SDK include multithreaded CPU support?



physX has always disabled most of its 'features' when run on a CPU. it ran on the CPU for the consoles, but on PC it would always turn off or drastically reduce effects for CPU use (yes, even when sufficient CPU power was provided). it was deliberately sabotaged to make it 'better' when GPU accelerated.


----------



## Prima.Vera (Feb 27, 2013)

SaltyFish said:


> There is Open*C*L for multi-platform... but devs are too lazy to use it.



There. Fixed.


----------



## Depth (Feb 27, 2013)

I hope this realistic hair trend doesn't spiral into a hairy Lara Croft

With nude mods portraying her realistic and glistening pubic hair


----------



## Mussels (Feb 27, 2013)

Depth said:


> I hope this realistic hair trend doesn't spiral into a hairy Lara Croft
> 
> With nude mods portraying her realistic and glistening pubic hair



larry croft and the chest hair of doom


----------



## H82LUZ73 (Feb 27, 2013)

Imagine, the nude patch for Skyrim will be downloaded one trillion times just to see the fully bushy ladies.


----------



## Prima.Vera (Feb 27, 2013)

perverts!


----------



## xvi (Feb 27, 2013)

HumanSmoke said:


> TheGuruStud said:
> 
> 
> > How dare you allege that physics can be done on the CPU, thereby, defying nvidias fullproof (retarded) plan of forcing their standard to run only on their cards, even though it works perfectly on cpu!
> ...



I think you missed the







RejZoR said:


> So they sticked some hair on a baldy who was designed to be bald to begin with. Whats next, making Lara hairy where she wasn't intended to be? Pathetic, like they are competing with NVIDIA who will make more retarded post processing physics effects and lock them down to one platform.
> F**k that.


----------



## Widjaja (Feb 27, 2013)

Depth said:


> I hope this realistic hair trend doesn't spiral into a hairy Lara Croft
> 
> With nude mods portraying her realistic and glistening pubic hair



There will indeed be a demand.
Although I would expect Lara would have to be mighty hairy down there to get any sort of hair movement going.

It may be a good idea for Square Enix to make some DLC hairstyles for Lara, or add them to new DLC to promote TressFX, if they want more money.


----------



## xvi (Feb 27, 2013)

Widjaja said:


> There will indeed be a demand.
> Although I would expect Lara would have to be mighty hairy down there to get any sort of hair movement going.
> 
> It maybe a good idea for Square Enix to make some DLC hairs for Lara or add it to new DLC to promote TressFX if they want more money.



Leisure Suit Laura?


----------



## XNine (Feb 27, 2013)

Ferrum Master said:


> Btw it made me thought, that old Lara had BIGGER boobs - damn it...
> 
> A PATCH, we demand a game PATCH



But they were triangular, which was pretty disgusting.


----------



## Xzibit (Feb 27, 2013)

HumanSmoke said:


> Cool story.
> PhysX uses both GPU and CPU...and pretty much always has done....if it didn't (as example), why would the 3.0 SDK include multithreaded CPU support?
> 
> As for the whole she-done-me-wrong wailing, some of us remember the posturing from both sides:
> ...



You kind of left the most important part out. Ageia developed it for CPUs (remember the short-lived PPU craze), and when Nvidia bought them they pretty much only developed it for their GPUs. It wasn't until after pressure that the "crippled" CPU instruction code was addressed, code Nvidia didn't care about because it wouldn't distinguish them from Havok.
It was more a case of: I've got the shiny ball, I'll make it work with my hardware, and I don't care as much whether it works with yours, so buy our products.

Nvidia was its own worst enemy, and that was well documented in articles. It took Nvidia three years to recognize their selfish mistake, but by then it was too late.
Not to mention they are doing it again with Tegra: Tegra-optimized games sold at the Tegra Zone.

If you were to believe Nvidia's statements, they were tossing PhysX at developers and the developers were rejecting it. Not to mention that PhysX seemed to be a PS3 priority with PCs taking a back seat, so they were content to only update PhysX for the GPU and leave the CPU behind.
After all, why update it if developers aren't asking for it and you can use it to your advantage to sell your GPUs as an added feature? Buying Ageia would be pointless if you're not.

That's why there are only, what, how many titles supporting PhysX in five years?

The number of games using Havok doesn't look like 0.
Havok FX was cancelled for the very same reason PhysX has been failing: it is hardware dependent and limits a developer's potential customer base. Intel probably had the sense to decide that not every PC will have the same GPU, or an Nvidia/AMD GPU at all, but most will have a CPU, and that would be the best way forward.


----------



## Ferrum Master (Feb 27, 2013)

XNine said:


> But they were triangular, which was pretty disgusting.



Nope, they weren't. They allowed more precious polygons to be spent on her, for a saintly cause of course 

Even though it was the PSX, and that did tax a lot...


----------



## Prima.Vera (Feb 27, 2013)

Xzibit said:


> You kind of left the most important part out. Aegia developed it for CPUs (remember the short lived PPU craze) and when Nvidia bought them they pretty much only developed it for there GPU.



Ageia had two PCIe add-on cards with big PPUs dedicated only to physics. Only after Nvidia bought them and developed it for CUDA could PhysX be run in software mode...


----------



## Xzibit (Feb 27, 2013)

Prima.Vera said:


> Ageia had two PCIe add-on cards with big PPUs dedicated only to physics. Only after Nvidia bought them and developed it for CUDA could PhysX be run in software mode...



Nope.  Ageia ran NovodeX (PhysX) in software mode as far back as 2005 in all 3 consoles just not accelerated through GPUs.


----------



## TRWOV (Feb 27, 2013)

Ageia PhysX was CPU based from the get-go. The PPUs were a tie-in but not required.

Still, nVidia's gimping has been documented several times. Basically, PhysX on the CPU runs in x87, an instruction set that no modern software has used in the last ten years. x87 is the lowest common denominator, but I'd say that whoever wanted to play a PhysX-enabled game would have a CPU with at least SSE2 support.


----------



## HumanSmoke (Feb 27, 2013)

Xzibit said:


> You kind of left the most important part out.


Nope. Nvidia is a GPU company, it was always intended that Nvidia PhysX would be GPU-centric. The facts of the matter are that AMD *could* have bought Ageia, *didn't* buy Ageia, and they *didn't* want to licence PhysX.


Xzibit said:


> Games using Havok doesnt look like 0


Except that 1. I wasn't referring to Havok, and 2. I was referring to OpenCL Havok FX which AMD was promoting. See virtually every link posted including the one above. The fact remains that ATi/AMD championing OpenCL physics predates Nvidia's involvement with the tech...one provided a tangible product, the other produced nothing.


Xzibit said:


> Not to mention that PhysX seamed to be a PS3 priority and PCs were taking a backseat so they were content to only update PhysX for the GPU and leave CPU behind


No (again). For some reason, you seem fixated on only one area of the tech. PhysX is also used in the professional arena, as middleware, and even in the PlayStation 4 (note the physics section)


Mussels said:


> physX has always disabled most of its 'features' if ran on a CPU. it ran on CPU for the consoles, but on PC it would always turn off or reduce effects drastically for CPU use (yes, even when sufficient CPU power was provided). it was deliberately sabotaged to make it 'better' when GPU accelerated.


I don't think I made the point of stating otherwise. You think it strange that a GPU company would buy a technology and then integrate it into their product stack rather than optimize it for their competitors ?


TRWOV said:


> Still nVidia's gimping has been documented several times. Basically physx on CPU runs in x87, a set of instructions that no modern software has used since 10 years ago. x87 is the lowest common denominator but I'd say that whomever wanted to play a physx enabled game would have a CPU with SSE2 support at least.


PhysX has used SSE2 compile for almost three years.


----------



## TheoneandonlyMrK (Feb 27, 2013)

TRWOV said:


> Ageia Physx was CPU based from the get go. The PPUs were a tie in but not required.
> 
> Still nVidia's gimping has been documented several times. Basically physx on CPU runs in x87, a set of instructions that no modern software has used since 10 years ago. x87 is the lowest common denominator but I'd say that whomever wanted to play a physx enabled game would have a CPU with SSE2 support at least.



I gotta say that for me it's Nvidia's entire handling of Ageia and PhysX that's riled me most in recent years, as well as their blatant money grabbing, obviously. Either way, I'm passing them minimal cash these days, and it's their fault.

OT: Mussels, I appreciate your feedback on the way this tech might be used; I had not considered grass etc. I like it even more now, and I think AMD should be applauded for bringing us an open, multi-platform physics feature which will no doubt be expanded upon.
And it's open enough to run on enemy radars from the get-go... oh no they didn't. ;o


----------



## HumanSmoke (Feb 27, 2013)

james888 said:


> I am pretty sure that hair on the bald guy was not done by amd's tressfx.


Correct, AMD's patented StressFX causes that effect


----------



## TRWOV (Feb 27, 2013)

HumanSmoke said:


> PhysX has used SSE2 compile for almost three years.



Oh, I'll shut my hole then. In my defense PPUs support up to 2.8.1 so I wasn't aware of the change.


----------



## Lionheart (Feb 28, 2013)

Prima.Vera said:


> What's _*organic*_ apple juice? Can there be inorganic apple juice??



Lolz, organic as in grown naturally: no pesticides, no herbicides, no GMOs, no preservatives, not reconstituted, 100% healthiness


----------



## Vertrucio (Feb 28, 2013)

Not really; "organic" is a sham label.

They still use pesticides, they just happen to use silly stuff that seems natural, oh you know, stuff like concentrated urine. Look it up.

Back to this. Just think of TressFX as a stepping stone to a full, non-proprietary GPU physics API.

I mean, let's be honest here: with AMD doing all the CPU and GPU chips for both next-gen consoles, and with the PS4 already shown to have quite capable GPU physics in its demos, it's pretty clear that AMD has already created many technologies that make PhysX's proprietary, Nvidia-only GPU physics modes obsolete.

Now, as these consoles get closer to release, more and more developers will be working with these new non-proprietary APIs, allowing (and forcing) all developers to use these physics tools.


----------



## Roph (Feb 28, 2013)

Hopefully this can be another nail in PhysX's coffin


----------



## TheHunter (Feb 28, 2013)

heh, so here is another Nvidia PhysX bull vs AMD TressFX DirectCompute physics debate? I've seen this on other forums too.. xD

NV PhysX 3 HW is still crippled, even with improved SSE2; PhysX 2.8.x is much worse in the HW layer. The SW side is OK but nothing special compared to any other physics engine.



Btw, Nvidia's hair tech demo from 2010 doesn't use PhysX, but tessellation and DirectCompute physics calculations, just like this TressFX.


Imo AMD nailed it with TressFX; apparently it's used in the upcoming Luminous engine as well, for hair rendering.


----------



## Wshlist (Feb 28, 2013)

*On ageia*



HumanSmoke said:


> Nope. Nvidia is a GPU company, it was always intended that Nvidia PhysX would be GPU-centric. The facts of the matter are that AMD *could* have bought Ageia, *didn't* buy Ageia, and they *didn't* want to licence PhysX.



Actually, Ageia's system was designed to be quite different, and its cards had a task-specific design. What Nvidia did was basically kill the original concept and then make their own system, which was a poor substitute for the original concept since they have to do it on a graphics card. It would have been an enormous shame, except that I expect Ageia was not going to make it anyway.


----------



## HumanSmoke (Feb 28, 2013)

Wshlist said:


> Actually ageia's system was designed to be quite different and its cards had a task-specific design. What nvidia did was basically kill the original concept, and then made their own system which was a poor substitute for the original concept since they have to do it on a graphics card. And it would have been an enormous shame except that ageia was not going to make it anyway I expect.


Don't.Care.What.Ageia.Intended.

My post stated:


> Nope. *Nvidia* is a GPU company, it was always intended that *Nvidia PhysX *would be GPU-centric.


To put this into context:
Nvidia make GPUs - they don't make CPUs
Why would Nvidia buy IP, then optimize it to not only benefit their competition (who make CPUs), but to make their own GPUs irrelevant in using the technology? <<<Not rhetorical. If you can tell me why Nvidia would buy Ageia to benefit AMD and Intel and marginalize their own products I'd be much interested to know.



TheHunter said:


> NV physx3 HW is still crippled, even with improved SSE2, physx 2.8.x is much worse - in HW layer, SW is ok but nothing special compared to any other physics engine


That's probably more about keeping backward compatibility (esp. with consoles), and the fact that game devs are a pretty lazy bunch that need spoon-feeding. I think you'll find that the 3.x SDK is also available.

As far as popularity goes, it waxes and wanes based on individual titles. With many game engines now incorporating their own physics engines (e.g. Frostbite 2), PhysX is just one of a whole raft of game-oriented physics engines...but then again, just like Lagoa and other pro/scientific physics engines, PhysX has uses outside of eye candy.


----------



## Wshlist (Feb 28, 2013)

Well, I'm not in the whole extensive discussion and only remarked on the history part, but the big difference between the Ageia version and the Nvidia version is that the Ageia version was usable as a scientific thing, whereas the Nvidia incarnation is really just for games.

And that's the part that is a pity, really.
But of course that's all semi-superseded by developments over the last few years with OpenCL and DirectCompute and even CUDA and such. Changes in the hardware also helped, so we are at a different point than when Ageia first released their idea.

Still, we should not rewrite history either: Ageia was much more focused on true parallelism in calculation and data access.


----------



## Rowsol (Feb 28, 2013)

Looks good.


----------



## Lazzer408 (Feb 28, 2013)

I found this on avforums and thought it was funny.


----------



## partyboy75 (Mar 1, 2013)

As I can see, Lara Croft's ass still consists of polygons.


----------



## Fluffmeister (Mar 4, 2013)

TombRaider 2013 Hair TressFx First look - YouTube

Tomb Raider - Scavenger Den Gameplay [tressFX] - Y...

Meh.


----------



## Prima.Vera (Mar 4, 2013)

*Nice*. Too bad it's only the hair. We want moar physics from AMD.


----------



## Kaynar (Mar 6, 2013)

Roph said:


> Hopefully this can be another nail in PhysX's coffin



I think it's better to say "hopefully AMD will continue to develop their physics-dedicated software", so that we end up having more games with physics features supported by both PhysX and AMD's similar tech. While you can currently run PhysX through the CPU (software) if you've got an AMD GPU, it's very "heavy" for the system, so if AMD were to develop an equivalent that takes advantage of their GPUs' tech, then everyone would benefit from it (both GPU companies, developers and customers as well).


----------



## Xzibit (Mar 6, 2013)

Kaynar said:


> I think its better to say "hopefully AMD will continue to develop their physics dedicated software" so that we end up having more games with physics features that will be supported by PhysX and AMD's similar tech. While you can currently run PhysX through CPU (software) if you got AMD gpu its very "heavy" for the system , so if AMD would develop an equilalent software that takes advantage of their gpu's tech then everyone would benefit from it (both gpu companies, developers and customers as well)



It's a DirectX API, so it's not beholden to either company. As long as a GPU or APU is DirectX 11 compliant or above, it should be fine.


----------



## GSquadron (Mar 6, 2013)

It just needs optimization


----------



## Xzibit (Mar 7, 2013)

There is an interesting article that BSN put up on the Tomb Raider ordeal.

Tomb Raider: AMD Touts TressFX Hair as Nvidia Apologizes for Poor Experience.



> Furthermore, in off-the-record discussions with the developers, we learned that Nvidia no longer invests as much in PC gaming developer teams as it used to, as Tegra is viewed as the main growth driver in the company. Naturally, this is purely one sided view, but a view coming from several game development companies which combined shipped over 100 million units.





> Will the recent turn of events cause Nvidia to invest more money into PC game developer support program? Only time will tell but for now the tide seems to be turning towards AMD in a pretty big way. CryTek (Crysis 3), Irrational Games (BioShock Infinite), Ubi Soft (Far Cry 3), Maxis (Sim City), Electronic Arts, Square Enix, Ubi Soft... this can no longer be dismissed as exceptions, they're a tidal wave of game developers and publishers shifting towards the red color. *They obviously decided to never settle for second best*, pun intended.



Funny pun


----------



## AphexDreamer (Mar 7, 2013)

TressFX, more like StressFX. 

The look doesn't justify the lowered performance in my case.


----------



## Mussels (Mar 8, 2013)

AphexDreamer said:


> TressFX, more like StressFX.
> 
> The look don't justify the lowered performance in my case.



what lowered performance? did i miss a performance analysis somewhere?


----------



## AphexDreamer (Mar 8, 2013)

Mussels said:


> what lowered performance? did i miss a performance analysis somewhere?



Drops my frames to the teens in cut scenes (especially when it's really focused on her head) and sometimes 20s just in gameplay. My 5870 just can't cut it.


----------



## Mussels (Mar 8, 2013)

AphexDreamer said:


> Drops my frames to the teens in cut scenes (especially when it's really focused on her head) and sometimes 20s just in gameplay. My 5870 just can't cut it.



to be fair, as a fellow 5870 owner... they ARE several generations old. it's nice we have the feature at all, and that it's not exclusive to the new hardware.


----------

