Virtual it may be, rather than real, but it comes across as if you are right in the middle of it. Do your eyes also experience a virtual-reality world as ‘real’? If you see something in the distance through VR glasses, do your eyes also focus on something far away?
With ordinary screens, your eyes focus at one distance. Whether you’re looking at a close-up or a wide landscape, your eyes are set to the distance from the screen to your eye. With VR glasses, your eyes are much more active, even though the screen sits at a fixed distance of a few centimeters from your eye. “The short answer is that such glasses contain lenses that ensure you can see well into the distance,” says René van Rijn, ophthalmologist at Amsterdam UMC. “But there’s a lot more to this question than you might think at first glance.”
Sharp on your retina
First let’s talk about how eyes work. “When you look into the distance, your eyes are more or less parallel. Your lenses are relaxed and the image in the distance is sharp on your retina,” explains Van Rijn. “When you look at something close, your eyes turn slightly toward each other – that’s called convergence – and associated with that is your lenses becoming slightly more convex – that’s called accommodation.”
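Van Rijn’s geometry can be put into rough numbers. A minimal sketch, assuming an average interpupillary distance of 63 millimeters (the helper `vergence_angle_deg` is purely illustrative): the angle between the two lines of sight grows quickly as the fixation point comes closer.

```python
import math

IPD = 0.063  # interpupillary distance in meters, an assumed average value

def vergence_angle_deg(distance_m: float) -> float:
    """Angle between the two lines of sight when fixating at distance_m."""
    # Each eye rotates inward by atan(half the IPD / fixation distance).
    return math.degrees(2 * math.atan((IPD / 2) / distance_m))

for d in (0.3, 0.5, 2.0, 6.0):
    print(f"{d:>4} m -> {vergence_angle_deg(d):.2f} degrees")
```

At half a meter the eyes converge by about seven degrees; at six meters the lines of sight are nearly parallel, which is why distant viewing feels “relaxed.”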
VR glasses have a fixed focus distance. It differs per headset: 1.3 meters, 1.5 meters, or ‘infinity’ are common variants. If you try to focus on something close by, the image becomes blurry. “Your eyes converge, and as a game maker you can respond to that well by shifting the images slightly,” says Van Rijn. “But the lens of your eye also automatically starts to accommodate, and that’s where it goes wrong, because the lens in the glasses, which is optimal for seeing farther away, is still in place.” This is mainly a problem for young people. “If you are over forty, your eyes can no longer accommodate well, which is why almost everyone then needs reading glasses. Older people are therefore better off looking at nearby things in VR.”
Manufacturers of VR glasses naturally want to solve this problem and have serious research departments dedicated to this vergence-accommodation conflict, as it is called in the literature. VR-glasses maker Oculus, part of parent company Meta, for example, has been working for a few years on a new headset with eye-tracking technology and varifocal lenses. Thanks to the eye tracking, the glasses know what the user is looking at; the headset then adjusts the lenses to the appropriate focus distance.
The first prototype contained a lens that changed position with a lot of clicking and sliding. Several versions later, the headset now contains a row of six different lenses, which can be active separately or in combination. The glasses can thus focus at 64 different distances instead of one.
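Why six lenses yield 64 settings is simple combinatorics: each lens is either in or out of the optical path, giving 2^6 = 64 on/off combinations. A minimal sketch, with purely illustrative dioptric powers (not Oculus’s actual values) chosen as binary-weighted steps so every combination produces a distinct total; thin lenses stacked closely together simply add their powers:

```python
from itertools import product

# Illustrative powers for six switchable lens elements, in diopters;
# binary-weighted so every subset sums to a distinct total.
powers = [0.05, 0.10, 0.20, 0.40, 0.80, 1.60]

# Thin lenses in contact: total optical power is the sum of the active ones.
totals = {
    round(sum(p for p, on in zip(powers, state) if on), 3)
    for state in product((0, 1), repeat=len(powers))
}

print(len(totals))  # 64 distinct focus settings: 2**6
```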
There is one more problem to solve before your eyes experience a VR image like the real world. “If you look into the distance in the real world and something close comes into your field of view, you see it blurred,” says Van Rijn. “But in VR, if you’re focused on the distance, the whole image is sharp. That feels unnatural.” Just add a haze with software, you might say. But it’s not that simple, the Oculus researchers show. The haze turns out to be related not only to distance but also to color. Based on that insight, they are now training a machine-learning algorithm that applies haze in a way that comes as close as possible to the real-world experience.
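How much blur is ‘natural’ for an out-of-focus object can be approximated with a standard thin-lens rule of thumb: the angular blur is roughly the pupil diameter times the defocus in diopters. A minimal sketch, assuming a 4-millimeter pupil and ignoring the color dependence the Oculus researchers found:

```python
import math

PUPIL_DIAMETER = 0.004  # meters (~4 mm), an assumed typical value

def blur_angle_deg(focus_distance_m: float, object_distance_m: float) -> float:
    """Approximate angular size of the blur circle for a defocused object."""
    defocus = abs(1 / object_distance_m - 1 / focus_distance_m)  # diopters
    return math.degrees(PUPIL_DIAMETER * defocus)

# Eyes focused at 6 m; an object at half a meter should look clearly blurred:
print(f"{blur_angle_deg(6.0, 0.5):.2f} degrees")
```

An object in the focus plane gives zero blur, which matches the everyday experience that only things far from where you are focusing look hazy.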
The knowledge about varifocal lenses and depth blur is there; the technical details are still being worked out. The new Oculus headset was supposed to be on the market in 2020, but it is not there yet.
A version of this article also appeared in NRC Handelsblad on 15 January 2022