EyeSight is the standout feature of Apple Vision Pro. It creates an artificial transparency on the surface of the headset that allows for eye contact. But how does it work?
People who see Vision Pro for the first time and are not familiar with VR technology may think that the front is made of a transparent piece of glass. In reality, just like any other VR headset, the wearer’s eyes are hidden behind an opaque layer of technology that includes displays, lenses, sensors, and chips.
In a talk show held at WWDC 2023, Mike Rockwell, VP of the Technology Development Group that created Vision Pro, elaborated on the technology behind EyeSight and its outward-facing display.
According to Alex Heath, the idea of a front-facing display goes back to Apple’s former chief designer Jony Ive. However, using an external display to show the user’s eyes is not new. Meta showed such prototypes in 2021, but did not implement the technology in products.
How Eyesight works
The secret behind EyeSight is an outward-facing display that closely mimics the eyes of the headset's wearer. According to Rockwell, it is not just curved, but lenticular. This means that it displays a slightly shifted image of the eyes depending on the viewing angle. A traditional 2D display would make the eyes look unnatural, especially when looking at the wearer from the side.
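The core idea of a lenticular display is that each lens strip steers one of several interleaved sub-images toward a different angle band, so each observer sees the view rendered for their position. The sketch below illustrates that angle-to-view mapping in Python; the view count, field of view, and function name are illustrative assumptions, not Apple's implementation.

```python
# Hypothetical sketch: mapping a horizontal viewing angle to one of N
# interleaved sub-images on a lenticular display. Numbers are illustrative,
# not Apple's actual parameters.

def view_index(angle_deg: float, num_views: int = 9,
               fov_deg: float = 90.0) -> int:
    """Map a viewing angle in [-fov/2, +fov/2] to a sub-image index.

    Each lenticule directs a different interleaved sub-image toward each
    angle band, so observers at different positions see differently
    rendered eyes.
    """
    half = fov_deg / 2.0
    # Clamp to the display's usable field of view.
    angle = max(-half, min(half, angle_deg))
    # Normalize to [0, 1) and pick the corresponding sub-image.
    t = (angle + half) / fov_deg
    return min(int(t * num_views), num_views - 1)
```

With nine views over a 90-degree field, an observer straight ahead would see view 4, while observers at the far left and right would see views 0 and 8.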
The difference can be seen in the following research from Meta. A conventional 2D display is used in the middle, while a lenticular 3D display is used on the right.
“We needed to create a separate view for anybody looking at you from any angle. So, we created a lenticular display, the very first curved lenticular display that’s ever been made. And we actually render separate views of your eyes for every person who’s looking at you,” Rockwell explains.
And where does the data for these views come from? Rockwell’s comments suggest two sources of data that are fused in the process: First, there’s the footage from the four eye-tracking cameras inside the headset. Second, Apple uses the Persona, a digital avatar created in advance using a 3D facial scan of the person wearing the headset.
These data sources are used to create a digital image of the eye and its surroundings with minimal latency, which is then rendered at different viewing angles.
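The fusion described above can be pictured as a small pipeline: live eye-tracking measurements animate the pre-scanned avatar, and the result is rendered once per observer angle. The sketch below is a simplified stand-in, with all type names, fields, and the string "renders" invented for illustration; Apple's actual pipeline is not public.

```python
# Illustrative sketch (not Apple's code): fusing live eye-tracking data
# with a pre-scanned avatar to produce one eye render per observer angle.
from dataclasses import dataclass

@dataclass
class EyeState:          # live data from the internal eye-tracking cameras
    gaze_yaw: float      # degrees, left/right
    openness: float      # 0.0 = closed, 1.0 = fully open

@dataclass
class Persona:           # from the one-time 3D face scan
    eye_texture: str     # stand-in for the scanned eye appearance

def render_views(persona: Persona, state: EyeState,
                 observer_angles: list[float]) -> list[str]:
    """Return one rendered eye image (here just a label) per observer angle."""
    return [
        f"{persona.eye_texture}@yaw{state.gaze_yaw:+.0f}"
        f"_view{angle:+.0f}_open{state.openness:.1f}"
        for angle in observer_angles
    ]
```

For three observers at -30, 0, and +30 degrees, the pipeline produces three distinct renders of the same eye state, which is exactly what the lenticular display needs to interleave.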
Breaking the social isolation of VR
Due to the technical complexity of this feature (and the jarring effects that could result if the technology fails), the press has not yet been able to try EyeSight during the initial hands-ons with Apple Vision Pro. Apple is likely still fine-tuning it.
However, some experts expect EyeSight to have a number of limitations: a rather dim, low-resolution image and a limited number of viewing angles due to the lenticular technology used.
Still, Apple believes the enormous technical effort is worth it. Why is that? Because Apple wants to break the isolation of VR headsets in both directions. Headset users should be able to see the environment, and the environment should be able to see the headset users.
“You can have a natural interaction with anybody as they walk in, and it’s really astounding. I mean, it’s something that I think when folks see it, people will feel like the device is transparent. That just looks like it’s transparent, and it makes it so that you don’t feel disconnected from people, and it was really a fundamental core value of what we were trying to do,” Rockwell says.
Part of EyeSight’s function is to automatically switch between a transparent and opaque view, depending on whether you’re consuming immersive content or interacting with people in your immediate environment. When someone approaches, they automatically come into view. See the video above for both scenarios.