
AMD TressFX Technology Detailed

It's part of DirectX. There is this company called Microsoft. Makes OSes for PCs, phones, and tablets, and has a gaming console. It doesn't just help AMD, it helps Microsoft. It can cross-promote games with features that are further enhanced on Games for Windows, even if it's not embraced by the PS4 or Wii U.

Equally I seriously doubt Sony gives a toss about DirectX and who embraces it. Again, there is a wide range of technologies licensed on both sides which developers are free to use, they don't have to be mutually exclusive.
 
Equally I seriously doubt Sony gives a toss about DirectX and who embraces it. Again, there is a wide range of technologies licensed on both sides which developers are free to use, they don't have to be mutually exclusive.

Sony might not, but developers will have an instantly bigger pool of potential buyers for their products, and they will want to one-up each other.

GameCube = ATi Flipper / PS2 = GS / X-Box = Nvidia NV2A

Wii = ATi Hollywood / PS3 = Nvidia SCEI RSX / X-Box 360 = ATi Xenos

Wii U / PS4 / X-Box 720 = AMD Radeon GPUs

In numbers alone it opens things up, and you're not hindering yourself as a developer in any way if you choose to go PC, unlike with a proprietary API, which depends on the end user having a certain hardware type.

This might just be in the PC version as of now, but the potential for it to get adopted and used is much greater than before, especially given what we currently know about what's going into the next-gen consoles.
 
If this includes realistic boob movement I will sell my two 680s and get two 7970s
Nice. One for each boob, I reckon?

Anyways... didn't Alice: Madness Returns have similar hair effects?
 
Anyways... didn't Alice: Madness Returns have similar hair effects?
Alice: Madness Returns' hair is basically the same thing, but it isn't GPU accelerated or as accurate as Tomb Raider's.
 
So they stuck some hair on a baldy who was designed to be bald to begin with. What's next, making Lara hairy where she wasn't intended to be? Pathetic, as if they are competing with NVIDIA over who can make more pointless post-processing physics effects and lock them down to one platform.
F**k that.

It looks pathetic as well. This guy looks like those eggs you stuff with cotton and plant wheat seeds in so they grow hair. An egg with a wheat comb-over. Nice...
The guy looks better with a bald head. Period.

If anyone bothered to put in some art effort, ANYONE could make rather realistic hair using the CPU alone. You wouldn't have 3 billion hair strands, but if anyone even bothered to make clusters of hair that move based on head movement, freakin' CPU Havok could do that. But instead, no one even bothered; pretty much all games used 100% static hair.
So why 100% static or 100% super-duper HW accelerated? It's like no one knows how to make something in the middle. It always has to be one extreme or the other...
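
To be fair, the middle ground really is cheap: a handful of hair clusters, each simulated as a chain of point masses with distance constraints, costs almost nothing on a modern CPU. A minimal sketch of the idea in C++ (purely illustrative; this is not TressFX or Havok code, and every name in it is invented):

```cpp
#include <cmath>
#include <vector>

// One hair "cluster": a chain of point masses dragged around by the head,
// integrated with Verlet and kept at fixed segment lengths by a few
// constraint-relaxation passes. Illustrative only.
struct Vec3 { float x, y, z; };

static Vec3  add(Vec3 a, Vec3 b)  { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  sub(Vec3 a, Vec3 b)  { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float len(Vec3 a)          { return std::sqrt(a.x*a.x + a.y*a.y + a.z*a.z); }

struct HairCluster {
    std::vector<Vec3> pos, prev;   // current and previous particle positions
    float segment = 0.05f;         // rest length between neighbouring particles

    void step(Vec3 root, float dt) {
        pos[0] = root;                              // root follows the head bone
        const Vec3 gravity{0.0f, -9.8f, 0.0f};
        for (size_t i = 1; i < pos.size(); ++i) {   // Verlet: keep inertia, add gravity
            Vec3 vel = sub(pos[i], prev[i]);
            prev[i] = pos[i];
            pos[i] = add(add(pos[i], mul(vel, 0.98f)), mul(gravity, dt * dt));
        }
        for (int pass = 0; pass < 4; ++pass)        // pull segments back to rest length
            for (size_t i = 1; i < pos.size(); ++i) {
                Vec3 d = sub(pos[i], pos[i - 1]);
                float l = len(d);
                if (l > 1e-6f)
                    pos[i] = sub(pos[i], mul(d, (l - segment) / l));
            }
    }
};
```

A dozen such clusters stepped once per frame would be a rounding error next to everything else a game's CPU does. What TressFX adds on top is per-strand density, collision and self-shadowing, which is where the GPU earns its keep.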

I am pretty sure that hair on the bald guy was not done by AMD's TressFX.

At least this is a step in the right direction. Of course, more physics all around would be better. Isn't that the developers' fault for not doing it, rather than AMD's or NVIDIA's?
 
So they stuck some hair on a baldy who was designed to be bald to begin with. What's next, making Lara hairy where she wasn't intended to be? Pathetic, as if they are competing with NVIDIA over who can make more pointless post-processing physics effects and lock them down to one platform.
F**k that.

It looks pathetic as well. This guy looks like those eggs you stuff with cotton and plant wheat seeds in so they grow hair. An egg with a wheat comb-over. Nice...
The guy looks better with a bald head. Period.

If anyone bothered to put in some art effort, ANYONE could make rather realistic hair using the CPU alone. You wouldn't have 3 billion hair strands, but if anyone even bothered to make clusters of hair that move based on head movement, freakin' CPU Havok could do that. But instead, no one even bothered; pretty much all games used 100% static hair.
So why 100% static or 100% super-duper HW accelerated? It's like no one knows how to make something in the middle. It always has to be one extreme or the other...

How dare you allege that physics can be done on the CPU, thereby defying NVIDIA's foolproof (ridiculous) plan of forcing their standard to run only on their cards, even though it works perfectly on a CPU!
 
The improvements when enabled are really rather stunning:
[screenshot]

Are they going to sell the bottled version too? I could use some.
 
Of course it will. But NV cards will take a performance hit. There are a few games out there which use the DirectCompute ability of GCN not because it's the only way to make the game look good, but because they are Gaming Evolved titles.

I'm not a programmer, so I don't really know whether it is the only way or the most efficient way of accomplishing the goal, but the results are there. It's not an overly tessellated object with 10k polys instead of a few hundred or a thousand, it's not some invisible ocean underneath the map, and it is most definitely not AMD locked.

NVIDIA will most likely catch up with its future generation of hardware, just like AMD did with tessellation. After all, these are all DX11 features, so using them in games means using what the API offers regardless of each side's capabilities (hopefully as efficiently as possible). NVIDIA made a bet and perhaps lost it. The future will tell. If NVIDIA had built better compute into its 6xx-series parts, this would not be such a big fuss. But then again, perhaps the 680 and the rest would not have been as fast as the 7970 and their direct competitors.

As far as I'm aware, the Dirt Showdown developers put that DirectCompute usage into their game long before everyone knew NVIDIA was weak in this department.
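
For what it's worth, "using DirectCompute" here just means dispatching a compute shader through the same Direct3D 11 API every DX11 GPU exposes; nothing about the mechanism is vendor locked. A rough sketch of the host side in C++ (assuming the device context, compiled shader and buffer views already exist; the names are mine, not from any shipping title):

```cpp
#include <d3d11.h>

// Host-side dispatch of a D3D11 compute shader -- the same vendor-neutral
// mechanism TressFX-style effects and Dirt Showdown's lighting go through.
// Assumes the context, compiled shader, and views were created elsewhere.
void RunComputePass(ID3D11DeviceContext* ctx,
                    ID3D11ComputeShader* shader,
                    ID3D11ShaderResourceView* input,
                    ID3D11UnorderedAccessView* output,
                    UINT elementCount)
{
    ctx->CSSetShader(shader, nullptr, 0);
    ctx->CSSetShaderResources(0, 1, &input);
    ctx->CSSetUnorderedAccessViews(0, 1, &output, nullptr);

    // One thread per element, 64 threads per group -- must match the
    // [numthreads(64, 1, 1)] attribute in the HLSL.
    UINT groups = (elementCount + 63) / 64;
    ctx->Dispatch(groups, 1, 1);

    // Unbind so the UAV can be consumed as an SRV by the next pass.
    ID3D11UnorderedAccessView* nullUAV = nullptr;
    ctx->CSSetUnorderedAccessViews(0, 1, &nullUAV, nullptr);
    ctx->CSSetShader(nullptr, nullptr, 0);
}
```

Whether that runs fast is then purely down to how good each architecture is at compute, which is the whole point of the argument above.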
 
How dare you allege that physics can be done on the CPU, thereby defying NVIDIA's foolproof (ridiculous) plan of forcing their standard to run only on their cards, even though it works perfectly on a CPU!
Cool story.
PhysX uses both GPU and CPU... and pretty much always has. If it didn't, why (for example) would the 3.0 SDK include multithreaded CPU support?
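
For reference, CPU threading in PhysX 3.x is an ordinary part of scene setup: you hand the scene a CPU dispatcher with however many worker threads you want. A sketch from memory of the public 3.x SDK (exact macro and helper names vary between SDK releases, so treat this as approximate):

```cpp
#include <PxPhysicsAPI.h>
using namespace physx;

// Minimal PhysX 3.x scene with a multithreaded CPU dispatcher.
// Written from memory of the public SDK headers -- verify names and the
// version macro against the SDK release you actually have.
static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

PxScene* createCpuScene()
{
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc desc(physics->getTolerancesScale());
    desc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4); // 4 CPU worker threads
    desc.filterShader  = PxDefaultSimulationFilterShader;

    return physics->createScene(desc);
}
```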

As for the whole she-done-me-wrong wailing, some of us remember the posturing from both sides:
June 2006: ATI starts posturing against Ageia
September 2007: Intel cuts AMD off at the knees by acquiring Havok
November 2007: AMD looks into acquiring Ageia
Three months later, Nvidia acquires Ageia
June 2008: AMD bigs up Havok FX (number of Havok FX games produced: 0)
Around the same time, Nvidia offered a PhysX licence to AMD. AMD wasn't interested and immediately began a public thrashdown of PhysX via Messrs Huddy, Cheng and Robison while promoting OpenCL Havok FX. Nvidia replied by locking AMD/ATI out of PhysX... AMD replied by telling world+dog that PhysX was irrelevant.

Personally, having watched the whole thing unfold, I'd say it's a case of handbags at ten paces: AMD dithered twice on entering the GPU physics market and lost out... twice, then came on strong with the "we didn't want it anyway" gambit; Nvidia played hardball in response to trash talk. Neither company covers itself in glory, but strangely enough, a whole bunch of people forget that the bitch-slapping involved both camps.
As far as I'm aware, the Dirt Showdown developers put that DirectCompute usage into their game long before everyone knew NVIDIA was weak in this department.
From the timelines, this must have been the case. There is no way, short of an educated guess, that AMD could have expected Nvidia to pare down GK104 with respect to compute, especially given Nvidia's previous GPUs since G80. Having said that, I doubt AMD won't capitalize on the fact now that they are aware of it. From this PoV, Titan's entry into the market makes a convenient backstop (for Nvidia) should AMD look to code endless compute loops into future titles. That might have crippled the performance of pre-GK110 Nvidia (and VLIW4/5 AMD) cards to GCN's benefit, but Titan's appearance would probably convince AMD that massive compute additions might not be the PR boon they could have been.
 
Guys, when referring to Lara Croft, they are breasts, not boobs.
 
Cool story.
PhysX uses both GPU and CPU... and pretty much always has. If it didn't, why (for example) would the 3.0 SDK include multithreaded CPU support?

PhysX has always disabled most of its 'features' when run on a CPU. It ran on the CPU for the consoles, but on PC it would always turn off or drastically reduce effects for CPU use (yes, even when sufficient CPU power was available). It was deliberately sabotaged to make GPU acceleration look 'better'.
 
I hope this realistic hair trend doesn't spiral into a hairy Lara Croft

With nude mods portraying her realistic and glistening pubic hair
 
I hope this realistic hair trend doesn't spiral into a hairy Lara Croft

With nude mods portraying her realistic and glistening pubic hair

Larry Croft and the Chest Hair of Doom
 
Imagine: the nude patch for Skyrim will be downloaded a trillion times just to see the full bushy ladies.
 
How dare you allege that physics can be done on the CPU, thereby defying NVIDIA's foolproof (ridiculous) plan of forcing their standard to run only on their cards, even though it works perfectly on a CPU!
Cool story.
PhysX uses both GPU and CPU... and pretty much always has. If it didn't, why (for example) would the 3.0 SDK include multithreaded CPU support?

As for the whole she-done-me-wrong wailing, some of us remember the posturing from both sides:
June 2006: ATI starts posturing against Ageia
September 2007: Intel cuts AMD off at the knees by acquiring Havok
November 2007: AMD looks into acquiring Ageia
Three months later, Nvidia acquires Ageia
June 2008: AMD bigs up Havok FX (number of Havok FX games produced: 0)
Around the same time, Nvidia offered a PhysX licence to AMD. AMD wasn't interested and immediately began a public thrashdown of PhysX via Messrs Huddy, Cheng and Robison while promoting OpenCL Havok FX. Nvidia replied by locking AMD/ATI out of PhysX... AMD replied by telling world+dog that PhysX was irrelevant.

Personally, having watched the whole thing unfold, I'd say it's a case of handbags at ten paces: AMD dithered twice on entering the GPU physics market and lost out... twice, then came on strong with the "we didn't want it anyway" gambit; Nvidia played hardball in response to trash talk. Neither company covers itself in glory, but strangely enough, a whole bunch of people forget that the bitch-slapping involved both camps.

From the timelines, this must have been the case. There is no way, short of an educated guess, that AMD could have expected Nvidia to pare down GK104 with respect to compute, especially given Nvidia's previous GPUs since G80. Having said that, I doubt AMD won't capitalize on the fact now that they are aware of it. From this PoV, Titan's entry into the market makes a convenient backstop (for Nvidia) should AMD look to code endless compute loops into future titles. That might have crippled the performance of pre-GK110 Nvidia (and VLIW4/5 AMD) cards to GCN's benefit, but Titan's appearance would probably convince AMD that massive compute additions might not be the PR boon they could have been.

I think you missed the [image: sarcasm meter]


So they stuck some hair on a baldy who was designed to be bald to begin with. What's next, making Lara hairy where she wasn't intended to be? Pathetic, as if they are competing with NVIDIA over who can make more pointless post-processing physics effects and lock them down to one platform.
F**k that.

[image: "Shopped"]
 
I hope this realistic hair trend doesn't spiral into a hairy Lara Croft

With nude mods portraying her realistic and glistening pubic hair

There will indeed be a demand.
Although I would expect Lara would have to be mighty hairy down there to get any sort of hair movement going.

It may be a good idea for Square Enix to make some DLC hairstyles for Lara, or add them to new DLC to promote TressFX, if they want more money.
 
There will indeed be a demand.
Although I would expect Lara would have to be mighty hairy down there to get any sort of hair movement going.

It may be a good idea for Square Enix to make some DLC hairstyles for Lara, or add them to new DLC to promote TressFX, if they want more money.

Leisure Suit Lara?
 
Btw, it made me realize that old Lara had BIGGER boobs - damn it...

A PATCH, we demand a game PATCH :roll:

But they were triangular, which was pretty disgusting.
 
Cool story.
PhysX uses both GPU and CPU... and pretty much always has. If it didn't, why (for example) would the 3.0 SDK include multithreaded CPU support?

As for the whole she-done-me-wrong wailing, some of us remember the posturing from both sides:
June 2006: ATI starts posturing against Ageia
September 2007: Intel cuts AMD off at the knees by acquiring Havok
November 2007: AMD looks into acquiring Ageia
Three months later, Nvidia acquires Ageia
June 2008: AMD bigs up Havok FX (number of Havok FX games produced: 0)
Around the same time, Nvidia offered a PhysX licence to AMD. AMD wasn't interested and immediately began a public thrashdown of PhysX via Messrs Huddy, Cheng and Robison while promoting OpenCL Havok FX. Nvidia replied by locking AMD/ATI out of PhysX... AMD replied by telling world+dog that PhysX was irrelevant.

You kind of left the most important part out. Ageia developed it for CPUs (remember the short-lived PPU craze?) and when Nvidia bought them, they pretty much only developed it for their GPUs. It was only after pressure that the 'crippled' CPU instruction code was addressed; Nvidia didn't care for it because it wouldn't distinguish them from Havok.
It was more of an 'I got the shiny ball and I'll make it work for my hardware, and I don't care as much if it works with yours, so buy our products' attitude.

Nvidia was its own worst enemy, and it was well documented in articles. It took Nvidia three years to recognize their selfish mistake, but by that time it was too late.
Not to mention they are doing it again with Tegra: Tegra-optimized games sold through the Tegra Zone.

If you were to believe Nvidia's statements, they were tossing PhysX at developers and the developers were rejecting it. Not to mention that PhysX seemed to be a PS3 priority while PCs were taking a back seat, so they were content to only update PhysX for the GPU and leave the CPU behind.
After all, why update it if developers aren't asking for it and you can use it to your advantage to sell your GPUs as an added feature? Buying Ageia would have been pointless if you weren't.

That's why there are only a handful of titles that support PhysX after 5 years. :confused:

The list of games using Havok doesn't look like 0. :ohwell:
Havok FX was cancelled for the very same reason PhysX has been failing: being hardware dependent limits a developer's potential customer base. Intel probably had the sense to decide that not every PC will have the same GPU, or even an Nvidia/AMD GPU, but most will have a capable CPU, and that would be the best bet going forward.
 
But they were triangular, which was pretty disgusting.

Nope, they weren't; they allowed more precious polygons to be spent on her, for a saintly cause of course :D

Even though it was the PSX and that did tax it a lot... :D
 
You kind of left the most important part out. Ageia developed it for CPUs (remember the short-lived PPU craze?) and when Nvidia bought them, they pretty much only developed it for their GPUs.

Ageia had 2 PCI-e add-on cards with big PPUs dedicated only to physics. Only after Nvidia bought them and developed it for CUDA could PhysX be run in software mode... :eek:
 
Ageia had 2 PCI-e add-on cards with big PPUs dedicated only to physics. Only after Nvidia bought them and developed it for CUDA could PhysX be run in software mode... :eek:

Nope. Ageia ran NovodeX (PhysX) in software mode as far back as 2005 on all three consoles, just not accelerated through GPUs.
 