I am using Raw Input as my input API for my game, and I want timestamps for input presses. Everything I've found is at millisecond precision (GetMessageTime()), which is not good enough for my application: every frame runs in about 9 ms (VR time budget), and I also have to think about speedrunners. That means I cannot accurately order multiple input messages in time when they arrive within a single frame. This is problematic for all sorts of input-based delta time calculations, and especially bad in VR, where you need accurate timestamps to predict future frame timing. So, is there a way to get at least microsecond (nanosecond if possible) precision per input message through Raw Input? Or do I have to do input through some other Win32 API or system?
Thanks in advance!
Here is how I set up Raw Input:
// Register for raw mouse and keyboard input (HID usage page 0x01: Generic Desktop).
// RIDEV_INPUTSINK keeps input coming even when the window is unfocused.
RAWINPUTDEVICE Rid[2];

// Mouse (usage 0x02)
Rid[0].usUsagePage = (USHORT) 0x01;
Rid[0].usUsage    = (USHORT) 0x02;
Rid[0].dwFlags    = RIDEV_INPUTSINK | RIDEV_DEVNOTIFY;
Rid[0].hwndTarget = WindowHandle;

// Keyboard (usage 0x06)
Rid[1].usUsagePage = (USHORT) 0x01;
Rid[1].usUsage    = (USHORT) 0x06;
Rid[1].dwFlags    = RIDEV_INPUTSINK | RIDEV_DEVNOTIFY | RIDEV_NOLEGACY;
Rid[1].hwndTarget = WindowHandle;

if( !RegisterRawInputDevices( Rid, 2, sizeof( Rid[0] ) ) )
{
    // Registration failed; check GetLastError().
}
Then in my message loop:
while( bRunning )
{
    MSG Message;
    while( PeekMessage( &Message, 0, 0, 0, PM_REMOVE ) )
    {
        switch( Message.message )
        {
            case WM_INPUT:
            {
                LONG msTime = GetMessageTime(); // millisecond precision only
                //...do stuff with the time stamp and input

                // Per the WM_INPUT docs, still pass the message to
                // DefWindowProc so the system can perform cleanup.
                DefWindowProc( Message.hwnd, Message.message, Message.wParam, Message.lParam );
                break;
            }
            default:
            {
                TranslateMessage( &Message );
                DispatchMessage( &Message );
                break;
            }
        }
    }
}