Monday, August 18th 2008
NVIDIA Demonstrates Real-time Interactive Ray-tracing
Ray-tracing is the buzzword in consumer and professional graphics these days. It's a technique for generating 3D computer graphics in which light and its behaviour are represented accurately, in adherence with the laws of physics.
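At its core, a ray-tracer fires a ray through every pixel and tests it against the scene geometry, then spawns further rays for shadows, reflections and refractions. A minimal sketch of that core intersection test (the function name and the single-sphere scene are illustrative, not taken from any actual NVIDIA code):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t (a quadratic in t).
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a) # nearest of the two roots
    return t if t > 0 else None          # hits behind the origin don't count

# A ray fired down the z-axis toward a unit sphere centered 5 units away:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

A real-time tracer performs millions of such tests per frame, which is why acceleration structures and GPU parallelism are essential.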
NVIDIA took ray-tracing to an interactive level with its work on a real-time ray-tracing application. NVIDIA currently holds a larger stash of intellectual property in the field of ray-tracing than other players such as AMD or Intel, thanks to its acquisition of mental images, the company behind the mental ray renderer that is pretty much a standard in Hollywood. At the SIGGRAPH 2008 event, NVIDIA demonstrated a fully interactive GPU-based ray-tracer, which delivered real-time ray-tracing at 30 frames per second (fps) at a resolution of 1920 x 1080 pixels. The demo saw NVIDIA flex its muscles, using almost every element of ray-tracing for which technology has been developed so far: a two-million-polygon scene, an image-based paint shader, and ray-traced shadows, reflections and refractions.
To maintain those 30 fps at such a high display resolution, NVIDIA used four Quadro FX 5800 graphics cards working in tandem. These next-gen Quadro boards are based on the GT200(b) GPU, and each comes with 240 shader processors and 4 GB of GDDR3 memory (for a total of 960 shader processors and 16 GB of GDDR3 memory).
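To put those numbers in perspective, a back-of-the-envelope count of the primary rays per second the demo implies (secondary rays for shadows, reflections and refractions come on top of this):

```python
# Frame size and rate quoted in the article
width, height, fps = 1920, 1080, 30

pixels_per_frame = width * height            # 2,073,600 pixels per frame
primary_rays_per_second = pixels_per_frame * fps
print(primary_rays_per_second)               # 62208000 primary rays per second
```

Over 62 million primary rays per second, before any secondary bounces, goes some way toward explaining the four-card setup.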
Source:
TG Daily
93 Comments on NVIDIA Demonstrates Real-time Interactive Ray-tracing
If so... WOW I never thought I would hear something like that.
Of course there are also many that don't remember as far back as 3dfx either : ( Nice example.
It might be something to be used in the future, but as long as programmers can "fool" the player into thinking the image looks "real" using less resources than described here, they won't give a rat's ass about ray-tracing...
The real advantage is in render farms for movies and such, where studios will be able to render entire movies in a fraction of the time. nVidia is one step ahead if it manages to turn mental ray into a hardware rendering engine, but Intel's x86-based Larrabee might mean that all software rendering engines get a significant performance boost. AMD is probably working on something similar with its Cinema 2.0 initiative as well.
So the competition is getting fierce in this area too, which could mean real-time ray-tracing for games might come sooner after all :)
But besides that, I think you are misinterpreting the realism I was talking about, the kind that ray-tracing will bring. The best example I can give here is Shrek. Shrek 3 is the last one, I think, and it's very advanced in terms of the realism of its lighting/shadowing and the shaders used, but you have to agree that the fantasy feeling is not gone because of that. Team Fortress 2 is another example: technologically it is the most advanced Source game, and in that sense the most realistic one, yet it has that unique cartoonish feeling. One thing does not negate the other, IMO.
If ray-tracing ends up in the hands of the devs, this could level the playing field for console players. I barely notice the textures and such if I am extremely focused on the gameplay. If I really care about realism, I can play the crap out of Crysis and then go play Half-Life 2. Valve became a $70M company this way, as they took next-generation graphics and replaced them with stereotypes, making casual players spontaneously combust over a smaller packaged game.

Ray-tracing was mostly replaced in the first Source engines with semi-triggered texture-mapping that instantly switched to a different texture pattern to mimic ray-tracing without the performance regrets, which came into play if you had the option to ignite fire-traps in the game. As for lighting, dynamic lighting was replaced with HDR rendering, which had optimization tricks up its sleeve. HDR light trails from crescent windows were just transparent 3D texture-tubing effects compared to dynamic lighting with a mixture of shadow and atmospheric effects. As for the blooming, that was the only performance complaint Source really had, since the blooming was mirrored by the water detail, which was one of the only ray-tracing-like effects in the Source engine. In 2007, dynamic shadow effects were introduced to Source, another ray-tracing-like effect that was supposed to scale with the NPCs, which were one of the only things in the game to have 100% reactive shaders.

So that's all I wanted to say, to tell you that this is good enough for me.
But no, seriously. I laugh every time I hear parents complain about the "realism" of the games their kids are playing. Like what happened with COD4. Your kid shouldn't be playing that game to begin with.
EDIT: We have different views. I like realism. I will tell you when it's going to be enough for me :D. When I am able to see an enemy around the corner reflected in the helmet or in the fallen golden Desert Eagle of an enemy I just shot down, then it's going to be enough for me. :roll:
Nah, I lied, that would probably not be enough yet. :p
I meant that I don't want games to get so graphically realistic that I might as well play on the PC as look outside. Well, it would look damn better than my street, but I love the cartoonish graphics of Crysis and Team Fortress 2 (don't say Crysis isn't cartoonish; it is, just in a less humorous and exuberant way). Games like COD4 are trying to be realistic, which is cool, but it takes out the fun.
It's a prime example of what can happen when developers make a game that can last forever without any apparent ending or goal. Graphics can get better and better, but if the game has an ending then people will stop playing it at some point or another...
That's the thing about Blizzard, though. They want any machine to be able to play their games. You have to admit the graphics have improved quite a bit from where they started in 2004.
Did I just date myself . . .
Check this:
www.pcgameshardware.com/aid,655742/News/Ruby_20_Screenshots_and_video_of_the_new_Radeon_tech_demo/
I like this one much much better :)
First of all, the Ruby demo was at 720p while this one is at 1920x1200. You need 2.5x more power to render at 1920x1200 than at 720p.
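The 2.5x figure checks out if the Ruby demo ran at standard 720p (1280x720, an assumption for this comparison):

```python
p720 = 1280 * 720      # 921,600 pixels per frame at 720p
p1200 = 1920 * 1200    # 2,304,000 pixels per frame at 1920x1200
print(p1200 / p720)    # 2.5 -- exactly 2.5x the pixels per frame
```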
Then, I'm sure the Ruby demo didn't have as many as 2 million polys.
Also, they did it as a showcase of how beautiful a ray-traced scene can get, rather than as a real-time tech demonstration like this one.
And last but not least, that demo was not interactive like this one. That makes a big difference.
All in all, both have demonstrated that they have a very valid real-time ray-tracer, and that is what matters.
As for the hardware complaints. Try to take a moment to wrap your brain around the tremendous processing power needed to perform the task of converting real-world lighting physics into a mathematical equation across an infinite number of surface variations and performing all of those calculations in real-time. Now add on to this the additional processing requirements of the program itself (the game), the physics, the other graphics, the shading and texture, etc. etc. etc. and you might be able to understand why so much hardware was needed for this.
It's impressive when you look past your simple gamer mentality (which I admit I also had initially) of expecting max frames per second with gorgeous texture and particle effects, and try to focus only on what the demonstration is showing off.
When ray-tracing becomes like that wineglass picture and runs on affordable hardware, I'll be impressed.
It should be about the graphics as a whole, not just lighting etc.
The Ruby 2.0 video someone posted is UBER, MUCH MORE IMPRESSIVE!!!
Also, keep in mind the demo was at 1920x1080 and the pics there are 800x500. I can guarantee you the original image would have looked a LOT better. Besides that, those whining about the "poor graphics" are missing the entire point of what this demo is about.
As said by others above, we have no idea what that ATi video is running on (3 or 4 x 4800 X2s?). It's very pretty, but a lot of it seems to be the depth-of-field/field-of-view/camera-blur applied to the scene (in the background), not the ray-tracing. Epic fail. That's what we're saying. Ray-tracing is like that; everyone should know that. That's not what Nvidia is showing off, they're just showing they got the tracing down in real time. They can throw in all the stuff we ALREADY KNOW IS OUT THERE whenever they feel like it. It's not that hard to drop a 2024x2068 texture over the damned car.
This is a TECH demo, not a game. Keep focused on what it's showing you.
Now the question is, how many RV770s are needed for this Ruby demo: 2, 3 or 4? Either way, it will cost much less.
Which one looks better? Definitely Ruby. Tessellation.
About the hardware: we don't know what ATi used, but those Quadros are nothing more than GTX cards. So in reality they just used four GTX 280s. Sure, they are more expensive than RV770 derivatives, but not by that much, and they are really doing a lot more work than in the Ruby demo.
And again, as many, many others have said, it doesn't matter how it looks. What looks better, real games or 3DMark?
Hint: 3DMark is not interactive.
And about tessellation: yeah, they used tessellation and that is good, as it increases the detail level with minimal performance penalty (compared to actual polygons). But tessellation, on the other hand, hardly adds detail (and can hardly be animated); it just makes things smoother. And that's a very different thing. The Nvidia demo demonstrates that they can use 2 million actual polygons that can be transformed and animated. Even if the Ruby demo had 10 million polys after tessellation was applied, that doesn't mean anything special, as the 2 million of the Nvidia demo are far more interesting from a real performance point of view.