The augmented reality glasses use three-dimensional cameras that detect the structure and position of nearby objects, Euronews.com reports.
Software then blocks out the background and highlights only what is nearest to the user.
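The article describes depth-based segmentation: the cameras produce a depth map, and software keeps only the nearest objects while suppressing everything farther away. As a rough illustration (not the actual Oxford software, whose implementation is not described), the idea can be sketched as a simple threshold on a depth map; the function name and the `near_fraction` parameter are hypothetical:

```python
import numpy as np

def highlight_nearest(depth_map, near_fraction=0.3):
    """Mark the nearest part of the scene for highlighting.

    depth_map: 2-D array of distances in metres (smaller = closer).
    near_fraction: fraction of the scene's depth range treated as "near".
    Returns a boolean mask (True = highlight, False = block out) --
    a simplified stand-in for the segmentation the article describes.
    """
    lo, hi = depth_map.min(), depth_map.max()
    threshold = lo + near_fraction * (hi - lo)
    return depth_map <= threshold

# Toy 3x3 depth map: one close object (0.5 m) in front of a far wall (4 m).
depth = np.array([[4.0, 4.0, 4.0],
                  [4.0, 0.5, 4.0],
                  [4.0, 4.0, 4.0]])
mask = highlight_nearest(depth)
# Only the close object is flagged; the background is masked out.
```

A real system would apply such a mask per video frame, brightening or outlining the near pixels on the glasses' display.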
Dr. Stephen Hicks of the Nuffield Department of Clinical Neurosciences at Oxford University, who is leading the research, says depth perception is a unique facet of the smart-glasses technology.