Today’s virtual reality headsets can’t account for differences in users’ vision, which can make VR less enjoyable for some people and even cause headaches or nausea.
Now, writes Vignesh Ramachandran in Stanford News, researchers at the university's Computational Imaging Lab are developing VR headsets that can adapt their displays for factors like eyesight and age that affect how we actually see.
The work is still at the prototype stage, but it shows how VR headsets could one day offer the kind of personalization that users have come to expect from other technologies, such as voice recognition.
“Every person needs a different optical mode to get the best possible experience in VR,” said Gordon Wetzstein, assistant professor of electrical engineering and senior author of research published in Proceedings of the National Academy of Sciences.
Read the full story: