I've developed a project in Unity and I'm using the Holographic Remoting Player to run it on HoloLens 2. I would like to use the StreamRecorder app from HoloLens2ForCV to collect eye movement data while my Unity project is running, and export it to a CSV file for research and analysis. I have built the StreamRecorder app's .sln in Visual Studio and deployed it to the HoloLens 2. However, I have encountered two problems and would like to ask how to solve them:
- When I open the Holographic Remoting Player, the StreamRecorder app is shut down, so I can't run my project and collect eye movement data at the same time. If anyone has done this successfully, could you tell me how?
- When I run the StreamRecorder app alone, it collects data that I can access, but the output has no column headers. According to the script, I would expect a little over 100 columns of data, yet the output file has almost 1,000 columns. I can't match them to the data types decoded in the program code, so I can't use them. I haven't found any similar question online; am I the only one experiencing this problem?
Regarding the unlabeled data, I searched the data collection app's code for clues and calculated how many values I could potentially collect, but I was never able to account for the large number of columns I got. I asked ChatGPT and Microsoft Q&A about this but didn't get a helpful answer. As for the StreamRecorder app not running alongside the Holographic Remoting Player, I'm honestly not sure where to start. I'd like to ask researchers who have used HoloLens2ForCV: how did you solve these problems?
The StreamRecorder app is a sample that demonstrates how to capture data. HoloLens 2 runs only one immersive app in the foreground at a time, so StreamRecorder cannot keep recording while another app (such as the Holographic Remoting Player) is running on the device.
For head, hand, and eye tracking data, you can refer to the following links:
- How to get the current gaze target: https://learn.microsoft.com/en-us/windows/mixed-reality/mrtk-unity/mrtk2/features/input/gaze?view=mrtkunity-2022-05#how-get-the-current-gaze-target
- Eye Tracking Eye Gaze Provider - MRTK 2 | Microsoft Learn
- IMixedRealityHand Interface (Microsoft.MixedReality.Toolkit.Input) | Microsoft Learn
You can use these interfaces to access the same data from within your own application.
Regarding the recorded CSV file, unfortunately there is no detailed documentation explaining the meaning of each column. However, the Python scripts in the StreamRecorderConverter folder post-process the recorded files, so you can read them to work out the column layout.
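Since the recorded CSV has no header row, one practical approach is to reconstruct the layout from the converter scripts and sanity-check it arithmetically. The sketch below is a rough, hypothetical example: the joint count, per-joint column count, and eye-gaze column count are assumptions to confirm against the StreamRecorderConverter scripts (e.g. hand_defs.py and project_hand_eye_to_pv.py), but it illustrates how per-joint hand transforms alone can push a row toward the ~1,000 columns you observed.

```python
import csv
import io

# Back-of-envelope check for why the head/hand/eye CSV can have ~1000 columns.
# All layout numbers below are ASSUMPTIONS to verify against the repo's
# StreamRecorderConverter scripts; they are not official documentation.
TIMESTAMP_COLS = 1
HEAD_POSE_COLS = 16            # flattened 4x4 head transform (assumed)
HAND_JOINTS = 26               # joint count per hand, as in MRTK (assumed)
COLS_PER_JOINT = 17            # validity flag + flattened 4x4 joint transform (assumed)
PER_HAND_COLS = 1 + HAND_JOINTS * COLS_PER_JOINT   # presence flag + joints
EYE_GAZE_COLS = 9              # validity + gaze origin + gaze direction (assumed)

def expected_columns():
    """Rough per-row column estimate under the assumed layout."""
    return TIMESTAMP_COLS + HEAD_POSE_COLS + 2 * PER_HAND_COLS + EYE_GAZE_COLS

def column_count(csv_file):
    """Count the columns in the first row of a headerless CSV file object."""
    return len(next(csv.reader(csv_file)))

if __name__ == "__main__":
    # On the order of 900 under these assumptions -- close to the ~1000
    # columns observed, suggesting most columns are hand-joint transforms.
    print(expected_columns())
    # Compare against a recorded file, e.g.:
    # with open("head_hand_eye.csv", newline="") as f:
    #     print(column_count(f))
```

Comparing `expected_columns()` against `column_count()` on your actual file tells you quickly whether a candidate layout is plausible before you invest in labeling every column.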