Depth Perception

When you create a mixed reality environment, depth cues give the user clues about where they are placed within this reality. In your real environment, your hand can't move through a button. Your sense of touch, together with other senses, allows you to perceive the position of objects in your environment. If these so-called depth cues get mixed up, the user becomes confused and can experience eye strain or headaches. (1)

One depth cue is occlusion, which refers to a depth order in which, for example, one object partially obstructs the view of another. (2)
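To make that ordering concrete, here is a minimal sketch in Python that sorts scene objects by their distance from the camera, so nearer objects are drawn last and partially cover those behind them. The SceneObject type and the positions are illustrative assumptions; a real renderer would resolve occlusion with a depth buffer.

```python
import math
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    position: tuple  # (x, y, z) in meters, camera at the origin

def occlusion_order(objects):
    """Sort far-to-near: nearer objects are drawn last,
    so they occlude (partially cover) the ones behind them."""
    return sorted(objects,
                  key=lambda o: math.dist((0.0, 0.0, 0.0), o.position),
                  reverse=True)

scene = [SceneObject("button", (0.0, 0.0, 0.5)),
         SceneObject("panel", (0.0, 0.0, 2.0))]
for obj in occlusion_order(scene):
    print(obj.name)  # panel first, button last -> the button occludes the panel
```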

Convergence

Another important depth cue for strong spatial vision and eye ergonomics is convergence. As Joe Ludwig describes it: "how far your eyes have to physically rotate to focus on the object in question. For near objects they rotate more than they do for farther away objects, relative to being parallel." (3)
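This rotation can be expressed as the vergence angle between the two lines of sight. A minimal sketch, assuming an average interpupillary distance of 63 mm (an assumed value, not one from the source):

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Total angle the eyes rotate inward from parallel to fixate a point
    at distance_m; ipd_m is an assumed average interpupillary distance."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

for d in (0.25, 0.5, 1.0, 2.0, 10.0):
    print(f"{d:5.2f} m -> {vergence_angle_deg(d):5.2f} deg")
# Near objects require far more eye rotation than distant ones.
```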

[Figures: four screenshots illustrating convergence]

Alex Chu found that in mixed reality, differently sized spheres placed at different distances can appear to have the same size on a single screen. Meanwhile, the images on the left and right displays of an HMD are less horizontally displaced the further the sphere is from the camera. Measuring the displacement between the sphere's center in the left image and its center in the right image, and comparing it to the width of the display, allowed him to define different segments of spatial vision. According to him, UI elements within 0-10 meters of the camera that occupy a large amount of your field of view for a long time can cause eye strain. No elements should be visible within the area of 0-1 m from the camera. (4)

Spatial Vision Segments
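These segments can be approximated with a simple pinhole stereo model. The sketch below is not Chu's measurement setup; the interpupillary distance and field-of-view values are assumptions, and only the 0-1 m and 0-10 m cutoffs come from his findings (4).

```python
import math

def disparity_fraction(depth_m, ipd_m=0.063, hfov_deg=90.0):
    """Horizontal displacement of a point between the left and right
    images, as a fraction of display width (pinhole stereo model).
    ipd_m and hfov_deg are assumed values."""
    focal = 0.5 / math.tan(math.radians(hfov_deg) / 2)  # display width normalized to 1
    return focal * ipd_m / depth_m

def chu_zone(depth_m):
    if depth_m < 1.0:
        return "exclusion zone (0-1 m): no elements should be visible here"
    if depth_m <= 10.0:
        return "main zone (1-10 m): usable, but large, long-lived elements strain the eyes"
    return "far zone (>10 m): stereo displacement barely noticeable"

for d in (0.5, 2.0, 10.0, 20.0):
    print(f"{d:5.1f} m: disparity = {disparity_fraction(d):.3f} of width; {chu_zone(d)}")
```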

Touch Area

Since we are placed in an MR environment, the full 360 degrees around us are available for interaction. To define the range of motion the human head has for interacting with UI while moving, Alex Chu evaluated which rotations from 0° users self-reported as comfortable and which as their maximum angle. (5)

Comfortable Range Test
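A minimal sketch of how such self-reported ranges could be encoded and used to classify a given head yaw. The 30°/55° values are illustrative placeholders, not necessarily the figures measured in the test:

```python
from dataclasses import dataclass

@dataclass
class RotationRange:
    """Symmetric yaw range around 0 deg (looking straight ahead)."""
    comfortable_deg: float
    maximum_deg: float

    def classify(self, yaw_deg):
        a = abs(yaw_deg)
        if a <= self.comfortable_deg:
            return "comfortable"
        if a <= self.maximum_deg:
            return "reachable, but fatiguing over time"
        return "out of range"

# Placeholder angles; substitute the values measured per user in the test.
head_yaw = RotationRange(comfortable_deg=30.0, maximum_deg=55.0)
for yaw in (0, 25, 45, 80):
    print(f"{yaw:3d} deg: {head_yaw.classify(yaw)}")
```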

By subtracting the cross-eyed area from the comfortable range of your head and intersecting it with the motion range of your hands, we can define an individual touch area for each user.
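As a sketch, these areas can be modeled as simple intervals. All numeric values below are illustrative placeholders that would be measured per user:

```python
def interval_intersection(a, b):
    """Intersection of two [lo, hi] intervals, or None if they are disjoint."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo < hi else None

# Placeholder values, per user.
comfortable_yaw = (-30.0, 30.0)   # deg, from the comfortable range test
cross_eyed_zone = (0.0, 0.5)      # m, depths that force cross-eyed viewing
hand_reach = (0.3, 0.7)           # m, measured arm reach band

# Subtract the cross-eyed zone (keep only depths beyond it), then
# intersect with the hand's reach to get the touchable depth band.
touch_depth = interval_intersection((cross_eyed_zone[1], float("inf")), hand_reach)
print("touch area:", comfortable_yaw, "deg yaw x", touch_depth, "m depth")
```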

UI Area

By combining the results from the convergence test and the comfortable range test, the following UI area can be defined: (6)

[Figure: UI area diagram]
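A minimal sketch of such a combined placement check, with the 0-1 m and 0-10 m depth cutoffs taken from (4) and the comfortable yaw value as an illustrative assumption:

```python
def in_ui_area(yaw_deg, depth_m,
               comfortable_yaw_deg=30.0,  # assumed comfortable half-range
               min_depth_m=1.0,           # no UI closer than 1 m (4)
               max_depth_m=10.0):         # weak stereo beyond 10 m (4)
    """True if a placement falls inside the combined UI area: within the
    comfortable head yaw and inside the usable stereo depth band."""
    return abs(yaw_deg) <= comfortable_yaw_deg and min_depth_m <= depth_m <= max_depth_m

print(in_ui_area(10, 2.0))   # True: ahead of the user, good stereo depth
print(in_ui_area(10, 0.5))   # False: inside the 0-1 m exclusion zone
print(in_ui_area(70, 2.0))   # False: outside the comfortable yaw range
```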

Sources

(1) Joe Ludwig, Lessons learned porting Team Fortress 2 to Virtual Reality, Page 19, http://media.steampowered.com/apps/valve/2013/Team_Fortress_in_VR_GDC.pdf