I want to render VR from a native plugin. Is this possible?
So far, what I have is this:
void OnRenderObject()
{
    GL.IssuePluginEvent(getRenderCallback(), 1);
}
The call reaches my native plugin and renders something, but it produces the mess shown here.
It seems I am rendering to the wrong texture. How do I get the correct texture/framebuffer (or whatever else is needed) to render VR from OpenGL in the native plugin? For example, I need to know which eye's camera is being rendered (left or right) in order to render properly.
Any hints? I am trying with both Gear VR and Google VR, so a hint about either is welcome.
My interpretation is that, instead of Unity's default rendering via MeshRenderer, you want to override it with your own custom OpenGL calls.
Without knowing what you are rendering and what the correct behavior should be, I can't help much more.
I can imagine that if you set up your own state, such as turning particular depth tests or blend modes on or off, it would overwrite whatever state Unity set up for you before your custom function ran. If you don't restore that state afterwards, Unity is not aware of the change and produces incorrect results.
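The save/restore idea can be sketched as a small RAII guard: capture a piece of render state before changing it, and put it back when the scope ends, so Unity's state is untouched afterwards. In this sketch the GL state is stood in for by a plain variable; with real OpenGL you would save via glGet*/glIsEnabled and restore with the matching glEnable/glDisable calls.

```cpp
#include <functional>
#include <utility>

// Saves a state value on construction and restores it on destruction.
class StateGuard {
public:
    StateGuard(std::function<int()> get, std::function<void(int)> set)
        : set_(std::move(set)), saved_(get()) {}
    ~StateGuard() { set_(saved_); }  // restore on scope exit
private:
    std::function<void(int)> set_;
    int saved_;
};

// Stand-in for a GL state value (e.g. whether depth testing is enabled).
static int g_depthTest = 1;

void MyCustomRender()
{
    // Capture the current value before we touch it.
    StateGuard guard([] { return g_depthTest; },
                     [](int v) { g_depthTest = v; });
    g_depthTest = 0;  // our custom pass disables depth testing
    // ... custom draw calls ...
}   // guard's destructor restores g_depthTest here
```

One guard per piece of state you touch keeps the restore automatic even if the render function returns early.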
As a side note, you may want to check out https://github.com/Samsung/GearVRf/ It's Samsung's open-source framework engine for Gear VR. Since it is similar to Unity but open source, you could do something like what you posted while being able to see what happens underneath.