Now THAT is amazing, but not out of the question either. Why can't you do eye tracking with an Oculus Rift? How much extra hardware, and how many additional calibration steps, would you need to collect data like this? Seems like a great value-add for folks doing Brain and Cognitive Science research with 3D test rigs. Maybe this will open up a market for the Oculus Rift as a research device.
There’s a lot you can do with eye and gaze tracking when it comes to interface design, so when [Diako] got his hands on an Oculus Rift, there was really only one thing to do.
Like a few other solutions for eye tracking we’ve seen, [Diako] is using a small camera with the IR filter removed to read the shape and location of an eye’s pupil to determine where the user is looking. This did require cutting a small hole near one of the Oculus’ eye cups, but the internal camera works great.
To get a window to the world, as it were, [Diako] slapped another camera onto the front of the Oculus. These two cameras are fed into the same computer, the gaze tracking is overlaid with the image from the front of the headset, and right away the user has a visual indication of where they’re looking.