Meta Eye Tracking
Last week, I shared my initial thoughts on the Apple Vision Pro announcement.
I highlighted the use of eye and hand tracking as a revolutionary human interface for Virtual Reality / Augmented Reality, delivering a fast and intuitive way of navigating the operating system and applications.
This led me to think about the Meta Quest Pro, which also includes the required hardware for eye and hand tracking.
Hand tracking is available as part of the Meta operating system and is enabled in some applications. However, up to now, I have only seen the eye tracking capabilities used for foveated rendering (not navigation) to help manage the performance impact of demanding games.
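For readers unfamiliar with foveated rendering: the idea is to render at full resolution only where the eye is actually looking, dropping quality in the periphery where the eye cannot resolve detail anyway. A minimal sketch of the concept (illustrative thresholds and tier names, not Meta's actual pipeline):

```python
import math

def foveation_level(gaze_dir, pixel_dir):
    """Pick a render-quality tier from a pixel's angular distance to the
    gaze point. Both inputs are unit direction vectors (x, y, z)."""
    # Angle between the gaze direction and the pixel's view direction.
    dot = sum(g * p for g, p in zip(gaze_dir, pixel_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    if angle < 5:       # fovea: full resolution
        return "full"
    elif angle < 20:    # near periphery: half resolution
        return "half"
    return "quarter"    # far periphery: quarter resolution
```

With eye tracking, `gaze_dir` follows the user's eyes each frame; without it, engines fall back to fixed foveation centred on the middle of the lens, which is why eye tracking makes the technique far more effective.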
This feels like a huge miss from Meta, but also a huge opportunity!
It is also a little surprising that Meta has not chosen to showcase eye tracking more prominently across their first-party software, especially as it is highlighted as a unique feature within their documentation.
“If you choose to turn on eye tracking, your headset will analyse images of your eyes to produce a set of numbers that represent estimates of where you’re looking in VR.”
Thankfully, I was not alone in this thinking.
Earlier today, “ThrillSeeker” released a proof of concept application, replicating the Apple visionOS user interface using the Meta Quest Pro, with eye and hand tracking enabled. In addition, he released a video (embedded below) highlighting the development process, accomplished in just two days using Unity.
ThrillSeeker was not the first to build an eye tracking application for the Meta Quest Pro, with examples (Dilmer Valecillios, Fabio914) dating back six months. However, he is the first to replicate the user experience presented by Apple at WWDC 2023, providing a direct comparison.
The application (Thrills Eyetracked Playground) can be downloaded and installed via SideQuest.
This proof of concept, alongside the many more I expect to see in the coming weeks and months, validates my suspicion that the combination of eye and hand tracking could be primed to become the default human interface for Virtual Reality / Augmented Reality.
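The core of that interface is simple: the eyes aim, the hands confirm. A gaze ray from the headset is intersected with the UI, and a pinch gesture commits the selection. A minimal geometric sketch of "look to target, pinch to select" (hypothetical panel layout and names; real engines like Unity expose gaze and pinch through their own APIs):

```python
from dataclasses import dataclass

@dataclass
class Panel:
    """An axis-aligned UI panel floating in the z = depth plane."""
    name: str
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    depth: float

def gaze_hit(origin, direction, panel):
    """Intersect the gaze ray with the panel's plane; True if it lands inside."""
    dx, dy, dz = direction
    if dz == 0:
        return False
    t = (panel.depth - origin[2]) / dz
    if t <= 0:  # panel is behind the viewer
        return False
    x = origin[0] + t * dx
    y = origin[1] + t * dy
    return panel.x_min <= x <= panel.x_max and panel.y_min <= y <= panel.y_max

def select_target(origin, direction, panels, pinching):
    """Return the name of the gazed-at panel, but only while pinching."""
    if not pinching:
        return None
    for panel in panels:
        if gaze_hit(origin, direction, panel):
            return panel.name
    return None
```

For example, with a single "Photos" panel two metres ahead, looking straight at it and pinching returns `"Photos"`, while the same gaze without a pinch returns nothing. That decoupling is what makes the interaction feel effortless: no pointing, no controllers, no head movement.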
Similar to what they achieved with multi-touch on the iPhone, I credit Apple for taking an existing concept and leveraging their vertical integration (hardware + software) to deliver an experience that pushes the market forward.
I hope Meta takes inspiration from Apple and their community, enabling first-party support for eye and hand tracking across the operating system and their first-party applications.
If nothing else, considering the Meta Quest Pro is $999 ($2,500 cheaper than the Apple Vision Pro) and is available immediately, it could offer a nice public relations boost to shift some of the media/consumer attention away from Apple.
Finally, it raises a question about the upcoming Meta Quest 3, a lower-cost ($499) headset that unfortunately does not include the required hardware for eye tracking. In hindsight, this omission may prove to be a mistake, but it opens the door to a possible Meta Quest 3+, released in time for the holiday season (and still before the Apple Vision Pro).
I would certainly be open to spending an additional $100 for eye and hand tracking, assuming tight software integration.