Thursday, July 17th 2008

Three New NVIDIA Tools Help Devs Quickly Debug and Speed Up Games

Today's top video games use complex programming and rendering techniques that can take months to create and tune in order to get the image quality and silky-smooth frame rates that gamers demand. Thousands of developers worldwide, including members of Blizzard Entertainment, Crytek, Epic Games, and Rockstar Games, rely on NVIDIA development tools to create console and PC video games. Today, NVIDIA has expanded its award-winning development suite with three new tools that vastly speed up this development process, keeping projects on track and costs under control.

The new tools, which are available now, include:
  • PerfHUD 6: a graphics debugging and performance analysis tool for DirectX 9 and 10 applications.
  • FX Composer 2.5: an integrated development environment for fast creation of real-time visual effects.
  • Shader Debugger: a tool for debugging and optimizing shaders written in HLSL, CgFX, and COLLADA FX Cg in DirectX and OpenGL.

"These new tools reinforce our deep and longstanding commitment to help game developers fulfill their vision," said Tony Tamasi, vice president of technical marketing for NVIDIA. "Creating a state-of-the-art video game is an incredibly challenging task technologically, which is why we invest heavily in creating powerful, easy-to-use video game optimization and debugging tools for creating console and PC games."

More Details on the New Tools

PerfHUD 6 is a new and improved version of NVIDIA's graphics debugging and performance analysis tool for DirectX 9 and 10 applications. PerfHUD is widely used by the world's leading game developers to debug and optimize their games. This new version includes comprehensive support for optimizing games for multiple GPUs using NVIDIA SLI technology, powerful new texture visualization and override capabilities, an API call list, dependency views, and much more. In a recent survey, more than 300 PerfHUD 5 users reported an average speedup of 37% after using PerfHUD to tune their applications.
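For context on what integrating PerfHUD involves: in a Direct3D 9 title, the usual pattern (per NVIDIA's PerfHUD documentation of that era) is to create the device on the special adapter the tool exposes while it is running. The sketch below is a minimal, hedged illustration of that pattern; the helper function name and surrounding application code are hypothetical, and DirectX 10 applications use an equivalent DXGI adapter selection instead.

    // Minimal sketch: create a D3D9 device on the "NVIDIA PerfHUD" adapter when present.
    // Assumption: this follows the commonly documented PerfHUD integration steps; the
    // helper name CreateDeviceForPerfHUD is hypothetical.
    #include <windows.h>
    #include <d3d9.h>
    #include <string.h>

    IDirect3DDevice9* CreateDeviceForPerfHUD(IDirect3D9* d3d, HWND hwnd,
                                             D3DPRESENT_PARAMETERS* pp)
    {
        UINT adapter = D3DADAPTER_DEFAULT;
        D3DDEVTYPE devType = D3DDEVTYPE_HAL;

        // Look for the adapter PerfHUD exposes while the tool is active.
        for (UINT i = 0; i < d3d->GetAdapterCount(); ++i) {
            D3DADAPTER_IDENTIFIER9 id;
            if (SUCCEEDED(d3d->GetAdapterIdentifier(i, 0, &id)) &&
                strstr(id.Description, "PerfHUD") != NULL) {
                adapter = i;
                devType = D3DDEVTYPE_REF;  // PerfHUD intercepts the REF type on its adapter
                break;
            }
        }

        IDirect3DDevice9* device = NULL;
        d3d->CreateDevice(adapter, devType, hwnd,
                          D3DCREATE_HARDWARE_VERTEXPROCESSING, pp, &device);
        return device;
    }

When no PerfHUD adapter is found, the sketch simply falls back to the default adapter, so the same code path works with and without the tool attached.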

"Spore relies on a host of graphical systems that support a complex and evolving universe. NVIDIA PerfHUD provides a unique and essential tool for in-game performance analysis," said Alec Miller, Graphics Engineer at Maxis. "The ability to overlay live GPU timings and state helps us rapidly diagnose, fix, and then verify optimizations. As a result, we can simulate rich worlds alongside interactive gameplay. I highly recommend PerfHUD because it is so simple to integrate and to use."

FX Composer 2.5 is an integrated development environment for fast creation of real-time visual effects. FX Composer 2.5 can be used to create shaders for HLSL, CgFX, and COLLADA FX Cg in DirectX and OpenGL. This new release features an improved user interface, DirectX 10 support, ShaderPerf with GeForce 8 and 9 Series support, visual models and styles, and particle systems.
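Effects authored in FX Composer are saved as effect files (for example HLSL .fx or CgFX) that a game then loads at runtime. As a rough, hedged illustration of how a Direct3D 9 application might consume such a file through the D3DX effect framework (the file name "scene.fx" and the parameter "WorldViewProj" are hypothetical, not taken from the release):

    // Minimal sketch: load an .fx effect (e.g. authored in FX Composer) and render with it.
    // Assumptions: D3D9 plus d3dx9; "scene.fx" and "WorldViewProj" are placeholder names.
    #include <windows.h>
    #include <d3dx9.h>

    ID3DXEffect* LoadEffect(IDirect3DDevice9* device)
    {
        ID3DXEffect* effect = NULL;
        ID3DXBuffer* errors = NULL;
        if (FAILED(D3DXCreateEffectFromFileA(device, "scene.fx", NULL, NULL,
                                             0, NULL, &effect, &errors))) {
            if (errors) OutputDebugStringA((const char*)errors->GetBufferPointer());
            return NULL;
        }
        return effect;
    }

    void DrawWithEffect(ID3DXEffect* effect, const D3DXMATRIX& wvp)
    {
        effect->SetMatrix("WorldViewProj", &wvp);   // parameter exposed by the effect file
        UINT passes = 0;
        effect->Begin(&passes, 0);
        for (UINT p = 0; p < passes; ++p) {
            effect->BeginPass(p);
            // ... issue draw calls for the geometry that uses this effect ...
            effect->EndPass();
        }
        effect->End();
    }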

As longer, more complex shaders become pervasive, debugging shaders has become more of a challenge for developers. To assist developers with this task, NVIDIA introduces the brand-new NVIDIA Shader Debugger, a plug-in for FX Composer 2.5 that enables developers to inspect their code while seeing shader variables applied in real time on their geometry. The Shader Debugger can be used to debug HLSL, CgFX, and COLLADA FX Cg shaders in both DirectX and OpenGL.

The NVIDIA Shader Debugger is the first product in the NVIDIA Professional Developer Tools lineup. These are new tools directed at professional developers who need more industrial-strength capabilities and support. For example, the NVIDIA Shader Debugger will run on leading GPUs from all vendors.

In addition to the free versions available for non-commercial use, some of the new tools are subject to a license fee, but are priced to be accessible to developers. Existing free tools (such as FX Composer, PerfHUD, Texture Tools, and SDKs) will not be affected; they will continue to be available to all developers at no cost. Shader Debugger pricing information is available at www.shaderdebugger.com.

NVIDIA encourages developers to visit its developer web site and its developer tools forums.
Source: NVIDIA

19 Comments on Three New NVIDIA Tools Help Devs Quickly Debug and Speed Up Games

#1
PCpraiser100
Any room for the red team?

And the chances of optimized games for ATI would be......?
Posted on Reply
#2
Nkd
This is one reason most games run better on NVIDIA hardware from the beginning and tend to have better minimum fps; it takes ATI at least a few driver updates to squeeze the full performance out of a game, or sometimes a year, just like Cat 8.6 and COD4. I have an HD 4870 right now, but NVIDIA seems to have 10 times better developer relations than ATI; ATI always relies on their driver team. AMD needs to step up the game. That is why NVIDIA seems to end up getting more customers: their developer relations.
Posted on Reply
#3
btarunr
Editor & Senior Moderator
PCpraiser100: And the chances of optimized games for ATI would be......?
Microsoft DirectX 10.1

....unless NV comes up with its DX10.1 fleet.
Posted on Reply
#4
phanbuey
Nkd: This is one reason most games run better on NVIDIA hardware from the beginning and tend to have better minimum fps; it takes ATI at least a few driver updates to squeeze the full performance out of a game, or sometimes a year, just like Cat 8.6 and COD4. I have an HD 4870 right now, but NVIDIA seems to have 10 times better developer relations than ATI; ATI always relies on their driver team. AMD needs to step up the game. That is why NVIDIA seems to end up getting more customers: their developer relations.
Yep... you hit that nail on the head.
Posted on Reply
#5
Nkd
Well, DX10.1 won't make much of a difference. Even with the DX10.1 patch, the GTX 280 performs just like the HD 4870; DX10.1 is only good for AA performance. Well, that has been the case for Assassin's Creed, at least.
Posted on Reply
#6
btarunr
Editor & Senior Moderator
You'll soon know. DX10.1 addresses several AA-related issues with DX10. If a DX10.1 game engine churns out more fps than DX10, NV's developer relations would be useless. The increments are hypothetically huge.
Posted on Reply
#7
PCpraiser100
Nkd: This is one reason most games run better on NVIDIA hardware from the beginning and tend to have better minimum fps; it takes ATI at least a few driver updates to squeeze the full performance out of a game, or sometimes a year, just like Cat 8.6 and COD4. I have an HD 4870 right now, but NVIDIA seems to have 10 times better developer relations than ATI; ATI always relies on their driver team. AMD needs to step up the game. That is why NVIDIA seems to end up getting more customers: their developer relations.
Ya really, luckily the HL2 series is one of my favorite games, as the Source engine has scalable performance optimized for both sides this generation:toast:. Both companies have their unique strengths and weaknesses which set them apart:slap:, so why can't they just compete fairly with mediocre game engines that don't give a $h** for any side, so that we all know who is the best on the block? I know for a fact that ATI will bust this whole graphics propaganda bull$h**, especially given the position they are in. So by the looks of it, because Microsoft french kisses Nvidia, they will replace the DirectX ATI is really getting good at and then start with DirectX 11 announcements. Nvidia, you are a snitch SOB who thinks about living a lie because you can't handle the truth that you just bribe your way out of situations.
Posted on Reply
#8
phanbuey
PCpraiser100: why can't they just compete fairly with mediocre game engines that don't give a $h** for any side so that we all know who is the best on the block....Nvidia, you are a snitch SOB who thinks about living a lie because you can't handle the truth that you just bribe your way out of situations.
It's not unfair at all... Nvidia pours tons of money and support into game development. ATI doesn't have money, so they pour it into designing better hardware. ATI cards still run great - look at Crysis. It's optimized for Nvidia, but it runs amazingly on the 4870.

DX10 is not that special; I'm glad Microshaft is moving to DX11...

Speaking of... @ bta: Is DX11 going to be backwards compatible with DX10? Does it fall back to DX10 if the card doesn't support it, like DX10.1 does?

EDIT: although the crap they pulled with the Assassin's Creed patch makes them deserve the "snitch SOB" label. :laugh:
Posted on Reply
#9
PCpraiser100
phanbuey: It's not unfair at all... Nvidia pours tons of money and support into game development. ATI doesn't have money, so they pour it into designing better hardware. ATI cards still run great - look at Crysis. It's optimized for Nvidia, but it runs amazingly on the 4870.

DX10 is not that special; I'm glad Microshaft is moving to DX11...

Speaking of... @ bta: Is DX11 going to be backwards compatible with DX10? Does it fall back to DX10 if the card doesn't support it, like DX10.1 does?
In some cases you're right, phanbuey, but there are some ways Nvidia uses to cut corners. My friend used to own a 6600GT and decided to play BF2 on full settings (BTW, BF2 is an Nvidia game), and when we played it at full settings a big freeze suddenly occurred, and then we saw that all the textures were less pixelated. Huge bust for Nvidia. And did you know that Crysis could optimize the 8800GT just by renaming a certain config file? Another bust. And then I played HL2: Episode One and CS: Source on my friend's borrowed computer, which had a 7900GTX, and in those games the HDR bloom was on steroids. But when I looked closely at the lit-up props, the reflections and textures were not that pixelated compared to other props around me (full settings at 1600x1200, by the way). Another bust for Nvidia, as when I moved away from the HDR areas my framerates dropped by 30 fps.
Now onto DX11: I am kinda glad they are developing it, but from a performance point of view it worries me that, because Nvidia is the closer partner, they will be watching the API's development closely. And since DX11 is one or two years away, Nvidia has got themselves a bit of a head start in getting their GTX-whatever-the-heck series ready for DX11 games.
Posted on Reply
#10
mamisano
Nkd: This is one reason most games run better on NVIDIA hardware from the beginning and tend to have better minimum fps; it takes ATI at least a few driver updates to squeeze the full performance out of a game, or sometimes a year, just like Cat 8.6 and COD4. I have an HD 4870 right now, but NVIDIA seems to have 10 times better developer relations than ATI; ATI always relies on their driver team. AMD needs to step up the game. That is why NVIDIA seems to end up getting more customers: their developer relations.
And where are you getting your facts from?
Here is a list of AMD/ATI development tools for Radeon products:

  • AMD Tootle (Triangle Order Optimization Tool)
  • ATI Compress
  • CubeMapGen
  • GPU Mesh Mapper
  • GPU PerfStudio
  • GPU ShaderAnalyzer
  • Normal Mapper
  • OpenGL ES 2.0 Emulator
  • RenderMonkey™
  • The Compressonator
  • AMD Stream™
  • HLSL2GLSL

More info on each can be found here: developer.amd.com/gpu/Pages/default.aspx
Posted on Reply
#11
panchoman
Sold my stars!
Wonder what PerfHUD will tell Crytek :p
Posted on Reply
#12
PCpraiser100
mamisano: And where are you getting your facts from?
Here is a list of AMD/ATI development tools for Radeon products:

  • AMD Tootle (Triangle Order Optimization Tool)
  • ATI Compress
  • CubeMapGen
  • GPU Mesh Mapper
  • GPU PerfStudio
  • GPU ShaderAnalyzer
  • Normal Mapper
  • OpenGL ES 2.0 Emulator
  • RenderMonkey™
  • The Compressonator
  • AMD Stream™
  • HLSL2GLSL

More info on each can be found here: developer.amd.com/gpu/Pages/default.aspx
I wonder if Valve is using any of these tools?

I heard Postal 3 is going to be powered by the Source engine, and it has wide environments, so these tools might help keep ATI on the playing field.
Posted on Reply
#13
newconroer
PCpraiser100: In some cases you're right, phanbuey, but there are some ways Nvidia uses to cut corners. My friend used to own a 6600GT and decided to play BF2 on full settings (BTW, BF2 is an Nvidia game), and when we played it at full settings a big freeze suddenly occurred, and then we saw that all the textures were less pixelated. Huge bust for Nvidia. And did you know that Crysis could optimize the 8800GT just by renaming a certain config file? Another bust. And then I played HL2: Episode One and CS: Source on my friend's borrowed computer, which had a 7900GTX, and in those games the HDR bloom was on steroids. But when I looked closely at the lit-up props, the reflections and textures were not that pixelated compared to other props around me (full settings at 1600x1200, by the way). Another bust for Nvidia, as when I moved away from the HDR areas my framerates dropped by 30 fps.
Now onto DX11: I am kinda glad they are developing it, but from a performance point of view it worries me that, because Nvidia is the closer partner, they will be watching the API's development closely. And since DX11 is one or two years away, Nvidia has got themselves a bit of a head start in getting their GTX-whatever-the-heck series ready for DX11 games.
Seriously, hush and take it elsewhere..



I don't see the point of moving on to DX11 until they sort out DX10. Over half of the 'issues' we face with DX10 are due to lack of development exposure. If DX11 is just going to be a rehash of the core architecture introduced with DX10, then ok, but if it storms in with new pipes, that's just going to screw things up.
Posted on Reply
#15
Darkrealms
Nkd: This is one reason most games run better on NVIDIA hardware from the beginning and tend to have better minimum fps; it takes ATI at least a few driver updates to squeeze the full performance out of a game, or sometimes a year, just like Cat 8.6 and COD4. I have an HD 4870 right now, but NVIDIA seems to have 10 times better developer relations than ATI; ATI always relies on their driver team. AMD needs to step up the game. That is why NVIDIA seems to end up getting more customers: their developer relations.
Very True!
PCpraiser100: Ya really, luckily the HL2 series is one of my favorite games, as the Source engine has scalable performance optimized for both sides this generation:toast:. Both companies have their unique strengths and weaknesses which set them apart:slap:, so why can't they just compete fairly with mediocre game engines that don't give a $h** for any side, so that we all know who is the best on the block? I know for a fact that ATI will bust this whole graphics propaganda bull$h**, especially given the position they are in. So by the looks of it, because Microsoft french kisses Nvidia, they will replace the DirectX ATI is really getting good at and then start with DirectX 11 announcements. Nvidia, you are a snitch SOB who thinks about living a lie because you can't handle the truth that you just bribe your way out of situations.
LoL, you complain about Nvidia for working with MS on DX11, but ATI was the one working with MS on DX10.1. Difference being?
If Nvidia is smart enough to help developers out, then they deserve to have companies willing to work with them...
PCpraiser100: In some cases you're right, phanbuey, but there are some ways Nvidia uses to cut corners. My friend used to own a 6600GT and decided to play BF2 on full settings (BTW, BF2 is an Nvidia game), and when we played it at full settings a big freeze suddenly occurred, and then we saw that all the textures were less pixelated. Huge bust for Nvidia. And did you know that Crysis could optimize the 8800GT just by renaming a certain config file? Another bust. And then I played HL2: Episode One and CS: Source on my friend's borrowed computer, which had a 7900GTX, and in those games the HDR bloom was on steroids. But when I looked closely at the lit-up props, the reflections and textures were not that pixelated compared to other props around me (full settings at 1600x1200, by the way). Another bust for Nvidia, as when I moved away from the HDR areas my framerates dropped by 30 fps.
Now onto DX11: I am kinda glad they are developing it, but from a performance point of view it worries me that, because Nvidia is the closer partner, they will be watching the API's development closely. And since DX11 is one or two years away, Nvidia has got themselves a bit of a head start in getting their GTX-whatever-the-heck series ready for DX11 games.
Perhaps ATI should start working with MS on this as well then?
Posted on Reply
#16
Solaris17
Super Dainty Moderator
I can't wait to play with this stuff!!!
Posted on Reply
#17
phanbuey
Solaris17: I can't wait to play with this stuff!!!
LOL... me either.
Posted on Reply
#18
zithe
I bet that as Nvidia finally moves on to DX10.1, RV800 will be announced as DX11. XD!!
I'm holding out for the 5000 series. If the 5950 doesn't outperform the 4870 X2 (sounds hard to beat), then I'll get a cheap 4870 X2 next year.
Posted on Reply
#19
candle_86
Don't you remember, the 5900 did beat the 4800 :P
Posted on Reply