Osiyo. Dohiju? Hey, welcome back.
Since opening up the source code for people to play around with, I’ve been working on the actual HMD portion. I’ve taken my Vufine AR apart and started mocking up the headset with the cameras and the display. Today I’ll have the streaming cameras up to document the build and refinement process.
I was going to focus on a bunch of the functionality I’ve listed in the Bitbucket README. However, I think what I want to do first is get the headset the way I want it; then I can work with the display as it actually will be. I’ll also work on the gesture and gaze-tracking portions, because those are going to be important. If they’re at least in place in the design, I can iterate on them until they work.
I came across this project (Triton and here) on reddit this morning. It’s infinitely relevant, especially since I’m working on something similar today. Today’s task is to get the headset put back together with a LeapMotion, get hand tracking running via the LeapMotion (I already have OpenCV working), and get the eye-gaze cameras aligned and tracking. Those are my only tasks for this project today.
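For the gaze side of things, the core per-frame job is estimating where the pupil sits in each eye camera’s image. Here’s a minimal sketch of one common approach, finding the centroid of the dark pupil region, using only NumPy. The threshold value and image sizes are made-up placeholders, not tuned numbers from the actual rig:

```python
import numpy as np

def pupil_center(gray, dark_thresh=60):
    """Estimate the pupil center as the centroid of dark pixels.

    `gray` is a single-channel eye-camera frame; `dark_thresh` is an
    assumed intensity cutoff (a real setup would tune this per camera,
    or use something more robust like ellipse fitting).
    Returns (x, y) in pixels, or None if no dark region is found
    (e.g. during a blink).
    """
    ys, xs = np.nonzero(gray < dark_thresh)
    if len(xs) == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic eye frame: bright background with a dark "pupil" disk
# centered at (90, 50), radius 10.
frame = np.full((120, 160), 200, dtype=np.uint8)
yy, xx = np.mgrid[0:120, 0:160]
frame[(xx - 90) ** 2 + (yy - 50) ** 2 <= 10 ** 2] = 20

cx, cy = pupil_center(frame)
print(round(cx), round(cy))  # centroid lands on the disk center: 90 50
```

Once both eye cameras report a stable pupil position, aligning them is mostly a calibration problem: map each camera’s pixel coordinates to a gaze direction by having the wearer fixate on known points on the display.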
Until next time. Dodadagohvi.