I'm working on a SlimDX app that runs across multiple displays. The app occupies a selected display, chosen via a command-line argument (an int). I then use System.Windows.Forms.Screen.AllScreens[selection] to find that screen's bounds and display my app "fullscreen" on it.
Now, to optimize performance, I need to select which GPU adapter to use when initializing the Direct3D device. How do I find out which GPU adapter is driving the selected display?
Since each GPU adapter might have one or more displays connected, I can't simply use the display number.
I'm using Direct3D 10, but I wouldn't mind a Direct3D 9 solution either.
The worst case would be to let the user select both display and adapter via the command line, but I'd prefer a foolproof method.
Thanks
Both D3D10 and D3D11 use DXGI for managing details like this. The Factory interface you create lets you enumerate the adapters installed on the system. Each adapter can have one or more outputs, which you can enumerate from the Adapter interface.
This gives you a list of Output interfaces, each of which has a Description property containing, among other things, a Rectangle of the output's desktop bounds as well as an IntPtr handle to the monitor it is driving.
I don't think the Winforms Screen class exposes the underlying native pointer, which is why we provide the SlimDX.Windows.DisplayMonitor class to serve as a replacement for the Screen class. You can use this to determine the particular details of the display and choose the right adapter for your needs.
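As a minimal sketch of the enumeration described above, assuming SlimDX's DXGI wrapper (the exact member names, such as `GetAdapterCount`, `GetOutputCount`, and `DesktopBounds`, are from memory and may differ slightly in your SlimDX version), you could match the WinForms screen bounds against each output's desktop bounds:

```csharp
using System.Windows.Forms;
using SlimDX.DXGI;

static Adapter FindAdapterForScreen(int selection)
{
    // Bounds of the display the user selected on the command line.
    var targetBounds = Screen.AllScreens[selection].Bounds;

    var factory = new Factory();
    for (int a = 0; a < factory.GetAdapterCount(); a++)
    {
        Adapter adapter = factory.GetAdapter(a);
        for (int o = 0; o < adapter.GetOutputCount(); o++)
        {
            Output output = adapter.GetOutput(o);
            // An adapter drives the selected display if one of its
            // outputs covers the same desktop rectangle.
            if (output.Description.DesktopBounds == targetBounds)
                return adapter;
        }
    }
    // Fall back to the default adapter if no output matched.
    return factory.GetAdapter(0);
}
```

Comparing bounds avoids relying on the monitor handle at all; alternatively, you could compare the Description's monitor handle against the one exposed by SlimDX.Windows.DisplayMonitor.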