# Introducing CPUGFX - Using CPU cores to improve the visual experience



## minx (Aug 8, 2014)

Hi!

Currently, we're developing an experimental framework called CPUGFX. CPUGFX is a framework that lets developers use a configurable number of CPU cores in much the same way as a GPU.

An example configuration:





Assume a PC with an octa-core CPU runs a game that puts nearly 100% load on the GPU, but not on the CPU. You can then use the idle CPU cores to calculate additional, non-crucial effects to improve the player's visual experience. CPUGFX's performance (what a surprise) scales with the number of cores available.
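We can't share CPUGFX internals yet, but the scaling idea can be sketched in plain C++ (all names here are illustrative, not the real API): a framebuffer effect is split row-wise across however many cores you assign, so adding workers shortens the wall-clock time of the effect roughly proportionally.

```cpp
#include <algorithm>
#include <cstdint>
#include <thread>
#include <vector>

// Illustrative only: apply a trivial, non-crucial effect (invert) to a
// grayscale framebuffer, splitting the rows across `workers` CPU threads.
// More assigned cores => more rows processed in parallel.
void applyEffect(std::vector<std::uint8_t>& pixels, int width, int height,
                 unsigned workers) {
    std::vector<std::thread> pool;
    const int rowsPer = (height + static_cast<int>(workers) - 1) /
                        static_cast<int>(workers);
    for (unsigned w = 0; w < workers; ++w) {
        const int y0 = static_cast<int>(w) * rowsPer;
        const int y1 = std::min(height, y0 + rowsPer);
        if (y0 >= y1) break;
        pool.emplace_back([&pixels, width, y0, y1] {
            for (int y = y0; y < y1; ++y)
                for (int x = 0; x < width; ++x)
                    pixels[y * width + x] =
                        255 - pixels[y * width + x];  // per-pixel work
        });
    }
    for (auto& t : pool) t.join();
}
```

In practice you would query how many cores are idle (e.g. via `std::thread::hardware_concurrency()` minus the game's own worker count) and pass that in.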

Shaders and effects that should be calculated on the CPU are simply passed to the CPUGFX framework, which takes care of translating the shader source (e.g. GLSL) into CPU instructions. CPUGFX adds no overhead once the application is compiled, because the shaders are precompiled.
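To illustrate what "precompiled" means here (the real translator's output format is not public, so this is a guess at the shape of it): a trivial GLSL fragment shader could end up as an ordinary native function, so evaluating it per pixel is just a function call, with no parsing or translation happening at run time.

```cpp
// GLSL source (hypothetical input to the translator):
//   in vec2 vUV;
//   out vec4 color;
//   void main() { color = vec4(vUV.x, vUV.y, 0.5, 1.0); }
//
// Possible precompiled C++ output: a plain function, callable per fragment.
struct Vec4 { float r, g, b, a; };

Vec4 fragmentMain(float u, float v) {
    return Vec4{u, v, 0.5f, 1.0f};
}
```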

Small, fast shaders that rely on CPU-based physics can be offloaded to CPUGFX in order to eliminate GPU-CPU communication overhead (except for APUs / integrated graphics).
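A sketch of that point (again with made-up names, not the CPUGFX API): when the physics simulation already lives in CPU memory, an effect run on the CPU can read the simulation state in place, instead of uploading it to the GPU and reading results back over the bus.

```cpp
#include <cstddef>
#include <vector>

struct Particle { float x, y, vx, vy; };

// Hypothetical example: one CPU pass advances the smoke particles and
// computes a per-particle brightness effect from the same buffer.
// No GPU upload/readback of the particle data is needed.
void stepAndShade(std::vector<Particle>& ps,
                  std::vector<float>& brightness, float dt) {
    for (std::size_t i = 0; i < ps.size(); ++i) {
        ps[i].x += ps[i].vx * dt;  // physics integration step
        ps[i].y += ps[i].vy * dt;
        // effect reads the just-updated state directly
        brightness[i] = ps[i].y > 0.0f ? 1.0f : 0.2f;
    }
}
```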

Here is a well-known GLSL shader, compiled by CPUGFX, that creates a realistic iris (this is just a snapshot; in-game, the iris reacts to light exposure):




The following is a sandbox test of the CPUGFX iris shader in a DirectX test environment. This test uses the CPU simultaneously for both smoke physics and the iris shader:



(we know that this is not a realistic scene)

The current state of the project: it works in most cases. That means we still have a lot to improve before we can create a serious tech demo, but we are not that far away.


----------



## OneMoar (Aug 8, 2014)

software rendering you aren't serious .... rofl


----------



## minx (Aug 8, 2014)

OneMoar said:


> software rendering you aren't serious .... rofl



We are. Using spare power to improve on performance is a serious topic.


----------



## OneMoar (Aug 8, 2014)

minx said:


> We are. Using spare power to improve on performance is a serious topic.


modern game engines such as Unreal 4 and Frostbite 3 already do this to an extent


----------



## minx (Aug 8, 2014)

I'm aware of that. This is just a new way to do it, and it's easy to include this functionality in all sorts of engines: not only game engines, but any DirectX or OpenGL engine. We're developing at a lower level than I think you think we are. We are developing driver plugins, modifying the underlying mechanisms of every engine.


----------



## Athlon2K15 (Aug 8, 2014)

Dont mind OneMoar he is a troll.


----------



## minx (Aug 8, 2014)

AthlonX2 said:


> Dont mind OneMoar he is a troll.



Well I hope he isn't, judging by his post count  </sarcasm>.

We're looking forward to releasing a public tech demo, but as of now, the running examples have way too many dependencies to run on another machine. Plus, we're coders, not 3D artists, so it takes a bit longer to produce something visually presentable.


----------



## Mussels (Aug 8, 2014)

until this actually works, you're mostly going to get trolls on the internet. too many companies with genius ideas generating hype for investors, then nothing ever comes of it.


----------



## minx (Aug 8, 2014)

Mussels said:


> until this actually works, you're mostly going to get trolls on the internet. too many companies with genius ideas generating hype for investors, then nothing ever comes of it.



As if money were involved in this. It's more of a do-it-because-we-can project, and I don't think you'll ever see it officially supported by anyone or anything. We do this in our free time and we'll open-source the result.


----------



## MxPhenom 216 (Aug 8, 2014)

This sounds like something Nvidia would try to do. Cool.


----------



## Mussels (Aug 8, 2014)

MxPhenom 216 said:


> This sounds like something Nvidia would try to do. Cool.




wasn't PhysX the exact OPPOSITE of this? XD


----------



## Aquinus (Aug 29, 2014)

I know I'm late to the party, but I felt I should comment.

It's an interesting idea, but I'm not sure the added complexity will actually yield tangible benefits. Shaders and GPGPU work well because the execution units are all identical, there are a ton of them, and they're very close together on the die. If you start adding a CPU into the mix, you add coordination overhead, you add frame latency to compose the entire scene, and in general you make the engine bigger (code-wise) and harder to change.

I would personally advocate for a simpler 3D engine, because as it stands right now, programming OpenGL in any language other than C/C++ is a bear, and using XNA complicates Linux support.

As a developer, I would like a 3D library that's simple, fast, and has an API that matches the paradigm of the language it's written in. Personally, I develop in Clojure, a JVM language with access to all the libraries the JVM has to offer in addition to Clojure libs. Unfortunately, there are very few OpenGL wrapper libraries for Java, and the ones that do exist are a one-for-one mapping of OpenGL C functions to Java, which is sub-optimal.

Sometimes making the engine (or any library or piece of software) more complex is only a hindrance to the developer who has to work with it.

With all that said, I hope you get something out of this, but I personally don't think it will yield the results you intend. Good luck.


----------



## Mussels (Aug 30, 2014)

Any updates on this, Minx?


Or is this turning into another vaporware thread...


----------



## minx (Aug 30, 2014)

Nope. I'm in hospital right now because of a serious eye illness, so I can't stare at my screen as long as I'd like to. Sorry for any delays, but yes, we are still working on this.


----------



## Suka (Aug 30, 2014)

Well, get well soon, and I'm looking forward to the final product.


----------



## eidairaman1 (Jan 11, 2015)

Anything?


----------



## Mussels (Jan 11, 2015)

another magical vapourware thread! wooooo


----------



## TRWOV (Jan 11, 2015)

MxPhenom 216 said:


> This sounds like something Nvidia would try to do. Cool.



nVidia would buy the tech, release a new version incompatible with previous hardware, throw all the old users under a truck and then forget about it 


Back on topic, wouldn't this add a little bit of latency to frames?

edit: fuuuuuuuu.... didn't see the date. Forgive the necroposting.


----------

