Monday, August 18th 2008
![NVIDIA](https://tpucdn.com/images/news/nvidia-v1739475473466.png)
NVIDIA Demonstrates Real-time Interactive Ray-tracing
Ray-tracing is the buzzword in consumer and professional graphics these days. It is a rendering technique that models the behaviour of light according to the laws of physics, producing physically accurate 3D computer graphics.
NVIDIA took ray-tracing to an interactive level with its work on a real-time ray-tracing application. NVIDIA currently holds a larger stash of intellectual property in the field of ray-tracing than other players such as AMD or Intel, thanks to its acquisition of Mental Images, the company behind the mental ray renderer that is pretty much a standard in Hollywood. At the SIGGRAPH 2008 event, NVIDIA demonstrated a fully interactive GPU-based ray-tracer, which delivered real-time ray-tracing at 30 frames per second (fps) at a resolution of 1920 x 1080 pixels. The demo saw NVIDIA flex its muscles, using almost every ray-tracing element for which technology has been developed so far: a two-million-polygon scene, an image-based paint shader, and ray-traced shadows, reflections, and refractions.
To maintain those 30 fps at such a high display resolution, NVIDIA used four Quadro FX 5800 graphics cards working in tandem. These next-gen Quadro boards are based on GT200(b) GPUs and come with 240 shader processors and 4 GB of GDDR3 memory each (for a total of 960 shader processors and 16 GB of GDDR3 memory).
Source: TG Daily
93 Comments on NVIDIA Demonstrates Real-time Interactive Ray-tracing
yesss. I used the phrase "I think" three times above, and I don't think anyone cares what I think.
Also, on the hardware side, I should have learnt from what people said about the UT3 engine when one of its developers was asked what graphics card he tested it on and how it ran. He said a Quadro, at decent fps. I can't remember which model, but I do remember its price at the time: $2,000+, and it was essentially a 7900 with more RAM. Guess what people said. ;)
#1: It took four video cards to do this, instead of TONS OF COMPUTERS IN A SERVER FARM. It's a LOT faster now and possible on ONE machine, as opposed to many.
#2: The reference to HDR. Everyone disliked it at first too, and it was a performance killer. Now it's almost free and it's tuned the way people like it. Shadows in games are the same - now we're getting reflections. Imagine 100% perfect shadows added to your favourite game now... remember, this is going to be ADDED onto existing things; it's not taking anything else away!
#3: It was INTERACTIVE. That's a huge step from a simple non-interactive movie.
#4: darkmatter's post right above this one. The UT3 engine was developed on a $2,000 card... that was a 7900. Give it two years and this amount of power will be nothing! Look at a GTX 280 now compared to a 7900 back then.
You do need a bit of background on what ray-tracing is to understand this point, though. I found this link, "Ray-tracing for the masses," which explains how it works, for those who don't know and want to learn:
www.cs.unc.edu/~rademach/xroads-RT/RTarticle.html
After reading it, what you have to understand about the demo is this (there's a short code sketch after the list that illustrates each point):
1- All the ray-tracing work (which involves intersection and reflection/refraction testing) is already being done in this Nvidia demo. It's by far the most demanding task a ray-tracer must perform.
2- At every surface found by the rays, a COLOR has already been picked through a texture load. Whether that texture is of good quality or not (or whether it is modified by a shader) matters very little; the performance is almost the same, because the task of fetching the texture already happened in the demo.
3- Many of the effects you could add are not as costly as the geometry surfaces. For example, fog has no reflection, or its effect is usually small enough that testing for it would be a waste of resources. That means no ray splitting has to be done; it's just a matter of applying transparency (with refraction, of course, this is ray-tracing after all :)).
Or blurring, if you prefer: blurring in ray-tracing just means perturbing the direction of the rays, which is what actually happens in reality.
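To make those three points concrete, here is a minimal CPU ray-tracer sketch in Python. Every name in it (the Sphere class, fetch_texel, the fog constant) is made up for illustration; Nvidia hasn't published the demo's internals, so this only mirrors the textbook algorithm from the article linked above.

```python
import math

class Sphere:
    """A hypothetical scene object: just enough state for the three points."""
    def __init__(self, center, radius, tex, reflective=False):
        self.center, self.radius = center, radius
        self.tex = tex                      # 2D grid of colors, the "texture"
        self.reflective = reflective

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def intersect(origin, direction, sphere):
    # Point 1: ray/surface intersection testing, the most demanding task.
    # Solves |o + t*d - c|^2 = r^2 for the nearest t > 0 (d normalized).
    oc = sub(origin, sphere.center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - sphere.radius ** 2
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-6 else None

def fetch_texel(tex, u, v):
    # Point 2: one indexed load per hit. The cost is the same whether `tex`
    # holds a beautiful image or flat grey.
    return tex[int(v * (len(tex) - 1))][int(u * (len(tex[0]) - 1))]

def trace(origin, direction, spheres, depth=0):
    best_t, best_s = None, None
    for s in spheres:                       # every ray tested against the scene
        t = intersect(origin, direction, s)
        if t is not None and (best_t is None or t < best_t):
            best_t, best_s = t, s
    if best_s is None:
        return (0.2, 0.2, 0.2)              # background color
    color = fetch_texel(best_s.tex, 0.5, 0.5)   # u, v made up for brevity
    # Reflection means SPLITTING the ray: a whole new trace() per bounce.
    if best_s.reflective and depth < 2:
        hit = tuple(origin[i] + best_t * direction[i] for i in range(3))
        n = sub(hit, best_s.center)
        inv = 1.0 / math.sqrt(dot(n, n))
        n = (n[0] * inv, n[1] * inv, n[2] * inv)
        r = tuple(direction[i] - 2.0 * dot(direction, n) * n[i] for i in range(3))
        bounce = trace(hit, r, spheres, depth + 1)
        color = tuple(0.5 * (color[i] + bounce[i]) for i in range(3))
    # Point 3: fog needs NO extra rays; it's just a blend along the distance.
    fog = min(1.0, best_t / 50.0)
    return tuple((1.0 - fog) * color[i] + fog * 0.5 for i in range(3))

grey = [[(0.5, 0.5, 0.5)] * 4 for _ in range(4)]        # a flat grey texture
scene = [Sphere((0.0, 0.0, 5.0), 1.0, grey, reflective=True)]
print(trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), scene))   # one primary ray
```

Notice that trace() only calls itself when a ray actually splits; the fog never does that, it's a plain blend applied at the end.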
And that's it. That's why it doesn't matter if the demo looks pretty or not. It's quite an achievement and should be understood as it is. Hope this helps. :toast:
And over some pictures, how do you "actually" know how much workload is actually on these cards? :laugh: :laugh:
They can claim however many polygons they want, but we should all know how these PR games usually play out.
Also: www.tgdaily.com/content/view/38145/135/
Edit: direct from AMD (www.amd.com/us-en/Corporate/AboutAMD/0,,51_52_15438_15106,00.html?redir=cin01):
Rendered in real-time and interactive, this is a brief video from the first Cinema 2.0 demo, premiered by AMD in San Francisco on June 16, 2008. The interactive demo was rendered by a single PC equipped with two graphics cards codenamed "RV770", powered by an AMD Phenom™ X4 9850 Processor and AMD 790FX Chipset. The full demo shows cinema-quality digital images rendered in real-time with interactivity. Check back later this summer for a video of the full Ruby Cinema 2.0 demo.
Are you propping up Nvidia's demo just because you're a fanboy or something? Because that's what you're coming off as.
The reflections in the Nvidia demo are limited to the few colors used in its environment; there is a reason all the buildings are grey and there is little color on anything else.
The ATI demo is just more robust, even you can see that.
Still, it's good to see both companies moving forward with some very nice technology.
You might want to read a bit before you jump into a thread like this.
People are allowed their opinions and should not have to deal with ass hat responses from a dweeb hiding behind his PC.
Answer others with respect; knowledge is a good thing to share, but you aren't coming off as knowledgeable, just as everything you've accused others of being.
And I could debate the whole subject of light, the way it reflects and is handled in real environments, and how Nvidia did a very, very simple demo, but you have proved that you're not up to that.
The 4870 X2's codename is R700, not RV770.
Of course ATI's will look better - it's DESIGNED to. Nvidia's was PURELY ray tracing.
It's like saying HDR is crap simply because in the earliest tests they didn't include any other shader effects to go along with it - it was plain and boring. Duh, it's early stages - they can leave out whatever they want from tech demos, as it's about the TECH, NOT about 'looking good'.
Nvidia: 30 fps in ray tracing, without extras, with four uber-expensive 4 GB cards.
Ati: 25+ fps in ray tracing, plus extras, with two cheap cards.
Am I correct?
Hmm, so possibly, when I speak about the topic at hand... in a thread about the topic, it's less bias than it is me being on topic, hmm?
I do like ray tracing; it is how our eyes create the environment we see. It IS the future of rendering. That is something people need to understand. I would like them to understand that Nvidia was trying to show ray tracing and that alone, hence the scene was not painted with effects. But if people don't get it, too bad.
People can have all the opinions they want, and they can post them as much as they want, including saying anything they want about the aesthetics of this demo.
I'm sure as hell not getting into a pissing match with you. If you want to delve into what you think of me, by all means continue; you will piss off a moderator pretty quickly by pulling the thread off topic. This would be a good time to shut up or interject something useful into the thread, which sounds like a good idea before you get to your 20th post.

That is more impressive, I must say. I was under the impression the RV770 was the X2 card. I do wonder if Havok has anything to do with ATi's ray tracing...

You just made me feel a lot better. I have felt like I'm talking to a brick wall today and yesterday, with different people redundantly posting the same things that have already been covered in various threads, and then the same responses follow... I'm glad someone knew what I meant.
*It doesn't matter if that texture is a beautiful, highly detailed one or a completely grey one. In both cases the operation "LOAD_TEX (x,y)", or whatever its name is, DOES occur. The "performance hit" is the same in both cases. Understand this for once before continuing with the nonsense and the accusations, FFS.
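If it helps, here's a trivial way to convince yourself of that on a CPU (load_tex below is just my stand-in for that hypothetical LOAD_TEX; a real GPU texture unit has its own caching details, but a single fetch is the same indexed read either way):

```python
import random, timeit

def load_tex(tex, x, y):
    return tex[y][x]    # one indexed read, regardless of what the pixels hold

SIZE = 512
grey     = [[(128, 128, 128)] * SIZE for _ in range(SIZE)]
detailed = [[(random.randrange(256),) * 3 for _ in range(SIZE)]
            for _ in range(SIZE)]

for name, tex in (("grey", grey), ("detailed", detailed)):
    t = timeit.timeit(lambda: load_tex(tex, 200, 300), number=1_000_000)
    print(f"{name:8s}: {t:.3f} s per million fetches")
```

Both textures time out essentially the same, because the fetch never looks at the image's content.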
Another thing to consider BEFORE talking about the hardware used is that this demo ran at 1920x1200, while the other one ran at 720p. And forget about the resolution scalability of raster renderers; ray-tracing is 100% dependent on resolution. You can't compare the two demos because they are very different, and both are impressive. In a way, this one is even more impressive, as I remember reading that Ati couldn't achieve such high resolutions no matter how much hardware they used.
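To put rough numbers on that resolution point, counting primary (eye) rays only, which is the floor (every reflection or refraction ray multiplies these figures; the resolutions and frame rates are the ones quoted in this thread):

```python
def primary_rays_per_sec(width, height, fps):
    # Ray-tracing cost scales linearly with pixel count: one eye ray per pixel.
    return width * height * fps

nvidia = primary_rays_per_sec(1920, 1200, 30)   # this demo, per the post above
ati    = primary_rays_per_sec(1280,  720, 25)   # 720p at 25+ fps, per the thread

print(f"Nvidia demo: {nvidia / 1e6:.1f} M primary rays/s")   # 69.1 M
print(f"Ati demo:    {ati / 1e6:.1f} M primary rays/s")      # 23.0 M
print(f"Ratio:       {nvidia / ati:.1f}x")                   # 3.0x
```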
And finally, YES, Ati hardware NOW is a lot better suited for ray-tracing, because it has tons of shaders and a tessellator. Yet it can't do ray-tracing at a level where it would be useful for games. This is a ray-tracing demo, not a demo to see which hardware can run it. Whenever the renderer software is ready, hardware capable of running it will be released. Or do you honestly believe Nvidia is stupid, or that they can't put as many shaders as Ati on one chip? When ray-tracing is mature enough, Ati, Intel and Nvidia will all have capable hardware; don't worry.
I guess that makes sense; it was a tech demo, so it makes sense that they used two workstation cards vs. an HD-series card.

Very, very true; my monitor looks better at 1680x1050 than at 1920x1200 anyway. +1 for me :) (and a 10%+ performance increase)