What are the relative geometries of the Google Glass camera and screen?

For developing AR applications on the device, I am wondering whether the relative geometry between the virtual screen on the glass and the camera's view is specified and defined anywhere. That is, is the screen in a fixed, known position relative to the camera's cone of vision?

Asked by James At. There are 2 answers below.

The sensor coordinate system for Google Glass is documented here by Google. Note that the optics pod of Google Glass carries the sensors, and because users may adjust the optics pod, its relationship to where the user is looking may not be precise. Here is the excerpt from the Google Glass documentation:
Here are some tips for using sensors on Glass:
The Glass sensor coordinate system is shown below relative to the Glass display. For more information, see sensor coordinate system.
The accelerometer, gyroscope, and magnetometer are located on the optics pod of the Glass device, which users rotate to align the device with their sight. You cannot measure the angle of the optics pod directly, so be aware of this when using angles from these sensors for applications such as compass heading.
To preserve battery life, only listen to sensors when you need them. For example, if your Glassware uses a Service to render a LiveCard and you only need the sensors when the live card is visible, use the LiveCard surface callback methods to start and stop listening to the sensors.
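The compass-heading caveat above can be made concrete. The math that Android's SensorManager.getRotationMatrix()/getOrientation() perform can be sketched in plain Java; the gravity and geomagnetic vectors below are hypothetical sample readings, not real Glass output, and any unmeasured optics-pod rotation would add an unknown offset to the result.

```java
// Sketch: deriving a compass heading (azimuth) from raw accelerometer and
// magnetometer vectors, mirroring SensorManager.getRotationMatrix() and
// getOrientation(). The input vectors are hypothetical sample readings.
public class HeadingSketch {
    // Cross product a x b.
    static double[] cross(double[] a, double[] b) {
        return new double[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]
        };
    }

    static double[] normalize(double[] v) {
        double n = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        return new double[] { v[0] / n, v[1] / n, v[2] / n };
    }

    // Returns azimuth in degrees: 0 = magnetic north, 90 = east.
    static double azimuthDegrees(double[] gravity, double[] geomagnetic) {
        double[] east = normalize(cross(geomagnetic, gravity)); // E = M x G
        double[] north = normalize(cross(gravity, east));       // N = G x E
        double az = Math.toDegrees(Math.atan2(east[1], north[1]));
        return (az + 360.0) % 360.0;
    }

    public static void main(String[] args) {
        double[] gravity = { 0, 0, 9.81 };           // device held level
        double[] geomagnetic = { 0, 22.0, -42.0 };   // field: north and down
        System.out.println(azimuthDegrees(gravity, geomagnetic)); // prints 0.0
    }
}
```

Because the pod angle cannot be read, a heading computed this way is only as accurate as the user's pod adjustment.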
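The battery tip above amounts to tying sensor registration to the surface lifecycle. Here is a minimal, runnable sketch of that pattern; the SensorHub class is a hypothetical stand-in for android.hardware.SensorManager, and the method names mirror Android's SurfaceHolder.Callback, so this shows the shape of the pattern rather than real Glassware code.

```java
// Sketch: only listen to sensors while the live card's surface is visible.
// SensorHub is a hypothetical stand-in for android.hardware.SensorManager
// so the pattern runs outside Android.
public class LiveCardSensorSketch {
    static class SensorHub {
        boolean listening;
        void register()   { listening = true;  }   // registerListener(...)
        void unregister() { listening = false; }   // unregisterListener(...)
    }

    private final SensorHub hub;

    public LiveCardSensorSketch(SensorHub hub) { this.hub = hub; }

    // Mirrors SurfaceHolder.Callback.surfaceCreated(): card visible,
    // start consuming sensor data.
    public void surfaceCreated()   { hub.register(); }

    // Mirrors surfaceDestroyed(): card hidden, stop draining the battery.
    public void surfaceDestroyed() { hub.unregister(); }

    public static void main(String[] args) {
        SensorHub hub = new SensorHub();
        LiveCardSensorSketch card = new LiveCardSensorSketch(hub);
        card.surfaceCreated();
        System.out.println(hub.listening);  // prints true while visible
        card.surfaceDestroyed();
        System.out.println(hub.listening);  // prints false once hidden
    }
}
```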
Nope, none of this information is specified as part of the official API surface, but I bet with a little experimentation you could figure it out.
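One such experiment: photograph a ruler of known width at a known distance so it exactly spans the frame to recover the camera's horizontal field of view, then note the pixel columns where the virtual screen's edges appear and convert them to angles. A trigonometric sketch under a pinhole-camera assumption; all measurements are hypothetical, not Glass specifications.

```java
// Sketch: recovering camera geometry by experiment (pinhole model).
// All numbers are hypothetical measurements, not Glass specifications.
public class FovSketch {
    // Horizontal FOV in degrees for an object of `width` metres that
    // exactly fills the frame at `distance` metres.
    static double horizontalFovDegrees(double width, double distance) {
        return Math.toDegrees(2.0 * Math.atan((width / 2.0) / distance));
    }

    // Angle in degrees off the optical axis for a pixel column in an image
    // of `imageWidth` pixels, given the FOV recovered above.
    static double pixelToAngleDegrees(double pixelX, double imageWidth, double fovDeg) {
        double focalPx = (imageWidth / 2.0) / Math.tan(Math.toRadians(fovDeg) / 2.0);
        return Math.toDegrees(Math.atan((pixelX - imageWidth / 2.0) / focalPx));
    }

    public static void main(String[] args) {
        double fov = horizontalFovDegrees(1.0, 1.0); // 1 m ruler at 1 m
        System.out.println(fov);                          // ~53.13 degrees
        System.out.println(pixelToAngleDegrees(640, 1280, fov)); // image centre: 0.0
    }
}
```

Mapping the screen's corner pixels through pixelToAngleDegrees gives the screen's angular position inside the camera's cone of vision, modulo the optics-pod adjustment noted in the other answer.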