Today Emily Cooper, from the Psychological and Brain Sciences Department at Dartmouth College, gave a talk on designing and assessing near-eye displays to increase user inclusivity. A near-eye display is a wearable display, for example an augmented reality (AR) or a virtual reality (VR) display.
With most near-eye displays it is not possible, or at least not recommended, to wear glasses. Some displays, such as the HTC Vive, have corrective lenses available to correct the accommodation. The goal is to integrate flexible correction directly into near-eye displays; this can be achieved with a liquid polymer lens whose membrane can be tuned.
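As a rough illustration of what such a correction amounts to (my own sketch, not from the talk), the tunable lens only needs to offset its power by the wearer's spherical prescription; in a thin-lens approximation that ignores vertex distance, the target power is the display's nominal power plus the prescription in diopters:

```python
# Minimal sketch (my illustration, not from the talk): driving a tunable
# membrane lens to compensate a wearer's spherical refractive error.
# Thin-lens approximation; vertex-distance effects are ignored.

def corrected_lens_power(base_power_d: float, spherical_error_d: float) -> float:
    """Power (in diopters) the tunable lens should be set to: the display's
    nominal power offset by the wearer's spherical prescription."""
    return base_power_d + spherical_error_d

# Example: display optics nominally at +20 D, worn by a -2.5 D myope.
print(corrected_lens_power(20.0, -2.5))  # -> 17.5
```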
In her lab, for the refraction self-test, the presenter uses an EyeNetra auto-refractometer, which is controlled with a smartphone.
The correction built into the near-eye display is as good as contact lenses, both in sharpness and in binocular fusion. Therefore, it is not necessary to make users wear their corrective glasses.
Two factors determine the perceived image quality of a near-eye display: accommodation and vergence. When the vergence is incorrect, users get tired after about 20 minutes and their reaction times become slower.
The solution is to use tunable optics to compensate for the user's visual shortcomings.
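To make the vergence problem concrete (a sketch of my own, not code from the talk), the mismatch between where the eyes converge and where they must focus can be expressed in diopters; fixed optics pin the focal distance to the virtual image plane, while tunable optics can set it equal to the vergence distance and drive the conflict to zero. The example distances below are assumptions.

```python
# Sketch (my illustration): the vergence-accommodation mismatch in diopters.

def to_diopters(distance_m: float) -> float:
    """Convert a viewing distance in meters to optical demand in diopters."""
    return 1.0 / distance_m

def vergence_accommodation_conflict(vergence_dist_m: float, focal_dist_m: float) -> float:
    """Absolute mismatch between where the eyes converge and where they focus."""
    return abs(to_diopters(vergence_dist_m) - to_diopters(focal_dist_m))

# Fixed-focus headset with its virtual image at 2 m, content rendered at 0.5 m:
print(vergence_accommodation_conflict(0.5, 2.0))  # 1.5 D of conflict

# Tunable optics refocused to the content distance:
print(vergence_accommodation_conflict(0.5, 0.5))  # 0.0 D
```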
A different problem is presbyopia, a reduction of the accommodation range. For people older than 45 years, an uncorrected stereo display provides better image quality than one that corrects the accommodation. However, tunable optics provide better vergence for older people.
A harder problem is low vision, which affects people regardless of their age. In her lab, Emily Cooper investigated whether consumer-grade augmented reality displays are good enough to help users with low vision.
She used the HoloLens, whose near-infrared (NIR) depth camera is the key feature for addressing this problem. Her proposal is to overlay the depth information as a luminance map on the image, so that near objects appear light and far objects appear dark. This allows users to get by with their residual vision.
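A minimal sketch of this mapping (my own illustration, not HoloLens code) could look like the following; the clipping range and the linear near-to-far mapping are assumptions:

```python
import numpy as np

def depth_to_luminance(depth_m: np.ndarray, near_m: float = 0.5, far_m: float = 5.0) -> np.ndarray:
    """Map metric depth to a luminance overlay in [0, 1]: near -> 1 (light), far -> 0 (dark)."""
    d = np.clip(depth_m, near_m, far_m)
    return (far_m - d) / (far_m - near_m)

# Example: a toy 2x3 depth map in meters.
depth = np.array([[0.6, 1.0, 2.0],
                  [3.0, 4.0, 5.0]])
print(depth_to_luminance(depth))
```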
Instead of a luminance overlay, a color overlay also works: the hue is varied along a gradient from warm to cold colors depending on distance. She also tried to encode depth with flicker, but it does not work well.
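The color variant can be sketched in the same spirit (again my own illustration); the direction of the gradient (near = warm, far = cold) and the HSV hue endpoints are assumptions:

```python
import colorsys
import numpy as np

def depth_to_color(depth_m: np.ndarray, near_m: float = 0.5, far_m: float = 5.0) -> np.ndarray:
    """Return an RGB image (H, W, 3) whose hue runs from red (warm, near) to blue (cold, far)."""
    d = np.clip(depth_m, near_m, far_m)
    t = (d - near_m) / (far_m - near_m)   # 0 at near, 1 at far
    hue = 0.66 * t                        # HSV hue: 0.0 = red, 0.66 = blue
    rgb = [colorsys.hsv_to_rgb(h, 1.0, 1.0) for h in hue.ravel()]
    return np.array(rgb).reshape(depth_m.shape + (3,))

depth = np.array([[0.6, 2.0],
                  [3.5, 5.0]])
print(depth_to_color(depth))
```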
With the HoloLens, it is also possible to integrate OCR into the near-eye display and then read aloud all text in the field of view using the four speakers in the HoloLens, making the sound come from the location where the text is written.
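Structurally, such a pipeline could be sketched as below; all helper names here (capture_frame, recognize_text, pixel_to_world, speak_from) are hypothetical placeholders rather than real HoloLens APIs, and the point is only the flow from OCR to spatialized speech:

```python
def read_scene_text_aloud(capture_frame, recognize_text, pixel_to_world, speak_from):
    """One OCR-and-speak pass over the current field of view (structural sketch;
    all four callables are hypothetical placeholders, not real HoloLens APIs)."""
    color_image, depth_map = capture_frame()            # grab camera image + depth data
    for text, bbox in recognize_text(color_image):      # OCR results: (string, pixel box) pairs
        u, v = bbox_center(bbox)                        # center pixel of the text region
        position_3d = pixel_to_world(u, v, depth_map)   # back-project to 3D using depth
        speak_from(text, position_3d)                   # text-to-speech spatialized at that point

def bbox_center(bbox):
    """Center of an axis-aligned pixel box given as (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = bbox
    return (x_min + x_max) / 2, (y_min + y_max) / 2
```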