# Possible to force my Java app to use discrete video adapter?



## grecinos (Jun 22, 2017)

Hey guys,

I have a Java app (with animation).  It runs best on a PC with a discrete GPU.  The problem I'm having is that one of my PCs has both an integrated and a discrete video adapter, and it uses the integrated GPU by default.  Unfortunately, that's also the case when running my Java app, so it runs much slower there.  My other PC has a single discrete video adapter and runs the same app faster, by about a factor of 10.

Any ideas on how to force the app to use the discrete GPU?

TIA,

grecinos


----------



## Sasqui (Jun 22, 2017)

Does this help?

https://android.googlesource.com/pl...oid/server/display/DisplayManagerService.java


----------



## grecinos (Jun 22, 2017)

Sasqui said:


> Does this help?
> 
> https://android.googlesource.com/pl...oid/server/display/DisplayManagerService.java




Thanks for the quick response.  That looks close, but at first glance it appears to be for Android devices. I'm running my Java app on a PC, using NetBeans as my IDE.  Off the top of my head, I was thinking of either a special compiler directive, or creating an executable from the JAR and forcing that to use the Nvidia GPU.

Any thoughts?


----------



## Sasqui (Jun 22, 2017)

grecinos said:


> Thanks for the quick response.  That looks close, but at first glance it appears to be for Android devices. I'm running my Java app on a PC, using NetBeans as my IDE.  Off the top of my head, I was thinking of either a special compiler directive, or creating an executable from the JAR and forcing that to use the Nvidia GPU.
> 
> Any thoughts?



Yeah, duh, I didn't even see Android in the link.

Sorry, no ideas here.  You could be right; it might be set in the compiler.


----------



## grecinos (Jun 27, 2017)

A quick update:

I was unable to force Windows 10 to use my Nvidia GPU for my Java app.

Fortunately, Java is a cross-platform language, so I figured I'd try running the app on Ubuntu Linux.  I created a dual-boot system out of one of my laptops.  After installing all the necessary software, I ran my Java app. To my amazement, it worked!  Apparently, Ubuntu favors the discrete GPU over its integrated counterpart.  My app went from performing 250K calculations per second to over 7 million calculations per second.  That's the difference between using an integrated vs. a discrete GPU.
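For anyone who wants to confirm which GPU Ubuntu is actually rendering with, a quick sanity check is the OpenGL renderer string (this assumes the `mesa-utils` package, which provides `glxinfo`, is installed):

```shell
# Print which GPU the OpenGL stack is using; fall back to a notice if glxinfo is absent
glxinfo 2>/dev/null | grep "OpenGL renderer" || echo "glxinfo not installed"
```

On a hybrid-GPU laptop, the renderer string names the Nvidia card when the discrete GPU is active, or the Intel/AMD iGPU otherwise.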

Now, if only Microsoft would allow its users to select a preferred GPU.  Shame on you!


----------



## Ferrum Master (Jun 27, 2017)

grecinos said:


> A quick update:
> 
> I was unable to force Windows 10 to use my Nvidia GPU for my Java app.
> 
> ...



It usually does, via right-click and selecting the preferred GPU.

Blame the maker of your board, not M$.


----------



## grecinos (Jun 27, 2017)

Ferrum Master said:


> It usually does, via right-click and selecting the preferred GPU.
> 
> Blame the maker of your board, not M$.



Well, I have two laptops (by Asus and Sager), both with a dual-GPU configuration.  Neither has the option to select a preferred GPU, including in the BIOS.  I've tried creating an EXE out of my Java app and selecting the preferred GPU that way, but no dice.  Is that what you were referring to?


----------



## Ferrum Master (Jun 27, 2017)

grecinos said:


> Well, I have two laptops (by Asus and Sager), both with a dual-GPU configuration.  Neither has the option to select a preferred GPU, including in the BIOS.  I've tried creating an EXE out of my Java app and selecting the preferred GPU that way, but no dice.  Is that what you were referring to?



It usually looks like that.


----------



## grecinos (Jun 27, 2017)

Ferrum Master said:


> It usually looks like that.




Yep, I tried that.  When I called Asus tech support, the rep ran me through some steps; that was one of them.  The other was to change the PhysX configuration and Manage 3D Settings in the Nvidia Control Panel.  None of them worked.  He eventually gave up and said it just wasn't possible.  It appears to be very picky about which apps are allowed to use the discrete GPU.  =/


----------



## Ferrum Master (Jun 27, 2017)

So you have the option?

If you activate the Nvidia GPU activity indicator, does it show other apps that use the GPU?


----------



## Vya Domus (Jun 27, 2017)

Can't you just disable the integrated GPU in Device Manager?


----------



## W1zzard (Jun 27, 2017)

https://github.com/LWJGL/lwjgl/pull/132/files?diff=split

With those dllexports your app will use the high-performance GPU. Not sure how to dllexport in Java though.
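For reference, the exports in that PR boil down to something like the following Windows-only C fragment (the two symbol names are the driver-documented hints for NVIDIA Optimus and AMD PowerXpress; how to get them into a Java launcher executable is exactly the open question here):

```c
#ifdef _WIN32
#include <windows.h>

/* Hybrid-GPU drivers scan the process executable for these exported
   globals; a nonzero value requests the high-performance (discrete) GPU. */
__declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;         /* NVIDIA Optimus  */
__declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;   /* AMD PowerXpress */
#endif
```

The catch for Java: the drivers look at the host executable (java.exe/javaw.exe), so exporting these from a DLL loaded later may not be honored, which would match the trouble described in this thread.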


----------



## Ferrum Master (Jun 27, 2017)

W1zzard said:


> https://github.com/LWJGL/lwjgl/pull/132/files?diff=split
> 
> With those dllexports your app will use the high-performance GPU. Not sure how to dllexport in Java though.



I have a feeling the exports just need to be in the Java exe files, not the end application...


----------



## grecinos (Jun 27, 2017)

Vya, I tried disabling the integrated GPU, but Windows didn't hand over control to the discrete GPU.

W1zzard,  I'm not sure how to dllexport in Java, either.

Ferrum,  somehow Java has to be granted permission to use the discrete GPU.  Perhaps a compiler directive, or maybe a special program that converts a Java app into an executable for the sole purpose of using the discrete GPU.

So frustrating...


----------



## Vya Domus (Jun 27, 2017)

grecinos said:


> Vya, I tried disabling the integrated GPU, but Windows didn't hand over control to the discrete GPU
> 
> W1zzard,  I'm not sure how to dllexport in Java, either.
> 
> ...



That's really strange that even disabled it still runs on the integrated GPU.

To be honest, Java is not fit for whatever computations you want to achieve on the GPU in the first place.


----------



## grecinos (Jun 27, 2017)

Vya Domus said:


> That's really strange that even disabled it still runs on the integrated GPU.
> 
> To be honest, Java is not fit for whatever computations you want to achieve on the GPU in the first place.




I agree.  I think Windows should have an option to select which GPU to use by default.

For the moment, Java suits my needs just fine.  It's sufficiently fast and an easy programming language to use.  The fact that it's a cross-platform language is also a bonus.  Although, I've always wondered whether there was such a thing as a library that allows direct access to the GPU.  I guess something like DirectX for Java would be neat.
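On the "DirectX for Java" wish: such libraries do exist — LWJGL (the project behind W1zzard's link) wraps OpenGL and OpenCL, and JOCL/Aparapi expose OpenCL from Java. Stock Java, by contrast, only lets you enumerate the screen devices it can see, with no way to choose the adapter. A minimal sketch of that (the class name `ListAdapters` is my own; on a headless machine it simply reports no devices):

```java
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;
import java.awt.HeadlessException;

public class ListAdapters {
    /** Return the ID strings of all screen devices Java2D can see. */
    public static String[] adapterIds() {
        try {
            GraphicsDevice[] devices =
                GraphicsEnvironment.getLocalGraphicsEnvironment().getScreenDevices();
            String[] ids = new String[devices.length];
            for (int i = 0; i < devices.length; i++) {
                ids[i] = devices[i].getIDstring();
            }
            return ids;
        } catch (HeadlessException e) {
            return new String[0]; // no display attached
        }
    }

    public static void main(String[] args) {
        for (String id : adapterIds()) {
            System.out.println(id);
        }
    }
}
```

Note these are per-monitor screen devices, not GPUs as such, which is part of why selecting an adapter from pure Java is so awkward.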


----------



## Ferrum Master (Jun 27, 2017)

If javapsp exists, there is.

Also, there should be a really good reason why it acts this way.


----------



## Vya Domus (Jun 27, 2017)

I am still baffled as to why you can't disable your integrated GPU.  I am convinced that's the cause of your issue; your application is most likely fine.  You should look more into that.


----------



## grecinos (Jun 28, 2017)

Vya Domus said:


> I am still baffled as to why you can't disable your integrated GPU.  I am convinced that's the cause of your issue; your application is most likely fine.  You should look more into that.



I stand corrected.  After disabling the integrated GPU, the discrete one took control.  I'm not sure why it didn't work before...  Nevertheless, my app now runs nearly as fast as it does in Linux: approximately 5,500,000 calculations per second vs. Linux's 7,000,000.   The only drawback is that when the integrated GPU is disabled, I can't access the Nvidia Control Panel.  Not a big deal, as I will only need to put my computer in this state in rare situations like this.  The display also appears to be stuck at the highest resolution: 3840x2160 on the Sager and 1080p on the Asus.  On the bright side, my app was intended to run at maximum resolution.  Not the most elegant solution, but hey, it works!


----------



## Vya Domus (Jun 28, 2017)

Glad you at least found a way to make it work.


----------



## grecinos (Jun 28, 2017)

Vya Domus said:


> Glad you at least found a way to make it work.






TY


----------

