Thursday, May 22nd 2008
NVIDIA Acquires RayScale Software
PC Perspective sends word that during a meeting in San Jose, NVIDIA announced it is to acquire a ray tracing software company called RayScale. An interesting acquisition indeed, but little is known at the moment, and no press release has been issued yet. RayScale got its start at the University of Utah, and currently provides interactive ray tracing and photo-realistic rendering solutions. Ray tracing is a technique for rendering three-dimensional graphics with very complex light interactions. More info about RayScale can be found on their web page here.
Source:
PC Perspective
25 Comments on NVIDIA Acquires RayScale Software
Also, someone may know: isn't ray tracing much more CPU-intensive, so more cores = better performance? (in regards to ray tracing)
Ray tracing is pretty cool stuff, though.
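It scales with cores almost perfectly, too: every pixel's primary ray can be traced independently of every other pixel, so splitting the image across cores is trivial. A minimal sketch in plain Python (one made-up sphere and a tiny image, nothing to do with RayScale's actual code) of why more cores help:

from multiprocessing import Pool
import math

WIDTH, HEIGHT = 320, 240
SPHERE_CENTER = (0.0, 0.0, -3.0)   # hypothetical scene: a single sphere
SPHERE_RADIUS = 1.0

def trace_pixel(args):
    x, y = args
    # Build a primary ray from a camera at the origin through this pixel.
    dx = (x / WIDTH) * 2.0 - 1.0
    dy = 1.0 - (y / HEIGHT) * 2.0
    length = math.sqrt(dx * dx + dy * dy + 1.0)
    d = (dx / length, dy / length, -1.0 / length)
    # Ray-sphere intersection: solve the quadratic for the hit distance.
    oc = (-SPHERE_CENTER[0], -SPHERE_CENTER[1], -SPHERE_CENTER[2])
    b = 2.0 * (oc[0] * d[0] + oc[1] * d[1] + oc[2] * d[2])
    c = oc[0] ** 2 + oc[1] ** 2 + oc[2] ** 2 - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c
    return 255 if disc >= 0.0 else 0   # hit -> white, miss -> black

if __name__ == "__main__":
    pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
    # No pixel depends on any other, so the work splits cleanly across cores.
    with Pool() as pool:
        image = pool.map(trace_pixel, pixels)
    print(sum(1 for value in image if value), "of", len(image), "pixels hit the sphere")

Real renderers share the scene's acceleration structure between threads, but that per-ray independence is the reason extra cores translate so directly into ray tracing performance.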
Anyway, ray tracing for second-level reflections is cool. I think (I hope) NVIDIA is not going to aim at completely ray-traced renderers; from what I've heard no game developer wants that, and they are bloody slow renderers.
Since MS isn't likely to move away from x86-64 any time soon, that leaves them stuck with AMD as a CPU supplier, unless they can somehow help VIA make better CPUs :P
By eliminating smaller companies, NVIDIA is trying to reinforce its winnings in the ATI-NVIDIA duopoly... instead of letting RayScale etc. enter the market and become a player (good for the consumer), NVIDIA is buying it out to strengthen itself so that it can beat Intel and AMD and form a virtual monopoly -_-
graphics.cs.uni-sb.de/~sidapohl/egoshooter/
Even a lowly game like Quake 3, when ray traced, looks pretty nice.
They are not competing in the GPU business; they were small players in the small business of offline renderers. They have nothing to do with the ATI-NVIDIA market at all. Just as with Mental Images, also owned by NVIDIA, they will probably continue the business they are doing today, but with more money. And at the same time they will help NVIDIA introduce real-time ray tracing in games the way it has to be introduced, and not the way Intel is trying to do it, badly.
I think the primary reason for this acquisition is a next-gen rendering technology that will unite mental ray, Gelato and RayScale.
Intel: the GPU gives no benefit to non-gamers. You're not wrong that rendering is demanding; Pixar's films can take days to render just a few minutes of a scene, for example something with an explosion or the like.
And Intel's ray tracing thing? Yeah, the future, 5-10 years down the road maybe, MAYBE, but by then CPUs will not look like they do now. I honestly can't see the x86 core as it is now being around that long. A lot of the old x86 design is legacy now, rarely used other than for VERY old apps, hence AMD and Intel are both slowly removing that dedicated processing power and just running it emulated on other parts of the CPU.
I see VIA headed in the right direction. RISC is the future: less complex CPUs that are optimized to run the software that's commonly used, but that are still capable of running other types of software and older software via emulation (sure, there's a perf hit, but most of those apps are so old or low-powered that they still run better than they did when they were made).
I would love to see what the K9 would have been. All I heard about it was that it was based on a lot of concepts from the DEC Alpha (designed by the same team), but somebody in management killed the project, and as I hear it the decision had nothing to do with performance, but with the fact that it wasn't based on the K8. It was a new design using RISC instead of CISC (reduced instruction set computing vs. complex instruction set computing), and somebody in management couldn't understand that this was a good thing: it would make future development of the CPU FASTER since it was less complex, it would make each core cost less and take up less die space, OR it would make more room for new features.
Blah, this move to me is equivalent to the idiot at Intel who chose to go with NetBurst over the P6 core (Pentium Pro/P2/P3/Pentium M/Core/Core Duo/Core 2 are all based on the P6 design). Boneheaded people who can't let go of stupid ideas.
The worst part was that AMD lost their head engineer over it, because he was pissed about his project being cancelled for such a stupid reason...
And NO, the AM2 chips are NOT K9. Some board makers refer to them as such, but they are NOT a K9 chip; they are a K8 with an AM2 memory controller, PERIOD.
Also, they are not talking about making an 8800 or even a GT200 do ray tracing, but the next, or even two or three, generations ahead. Each generation the power of the cards doubles; think three generations ahead and you have a card with 8x the power of an 8800. What do you want that power for? 3000x2000 pixel resolution? It's hard to notice any difference beyond 1920x1200. 16-32x antialiasing? 4x is all you need, especially at 1920x1200 and above. Anisotropic filtering? At higher display resolutions even 16x is not indispensable with the high-resolution textures of the latest games...
In reality that power will have to go into the detail of the picture rendered. But geometry and textures with 8x the detail of today don't make sense either. 2x, maybe. Right now some games use 2048x2048 textures, most use 1024x1024; 4096x4096 (4x) is not even used in production graphics.
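To put rough numbers on that (a quick back-of-the-envelope sketch in Python; the 2x-per-generation figure is just the assumption from the paragraph above, not a measurement):

# Assumes raw GPU throughput doubles each generation, as claimed above.
generations = 3
compute_multiplier = 2 ** generations          # 2x per generation -> 8x after three
print(f"Compute vs. an 8800 after {generations} generations: {compute_multiplier}x")

# Texture memory and bandwidth grow with the square of the per-axis resolution,
# which is why even a 2x bump per axis already costs 4x the texels.
for size in (1024, 2048, 4096):
    texels = size * size
    print(f"{size}x{size}: {texels // (1024 * 1024)}x the texels of a 1024x1024 map")

So the extra compute quickly outruns what higher resolutions, AA levels and texture sizes can visibly absorb, which is the point being made here.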
What's left? Shaders, lighting... Improving rasterizers by 8x is not possible; you would hit a wall in perception, just like the ones I mention in the second paragraph. So adding something new is the only way to go, and that includes ray tracing for reflections, massive amounts of high-quality physics, etc.
That's what NVIDIA is doing, and I love it. I hope AMD is doing the same with the same commitment, even though we haven't heard anything about it.
They acquired ULi, ended up making a decent low-end chipset line. Acquired Ageia and now have big plans with PhysX. Now this.