Markerless 3D Interaction in an Unconstrained Handheld Mixed Reality Setup

D. Fritz, A. Mossel, H. Kaufmann:
The International Journal of Virtual Reality (eingeladen),15(2015), 1; S. 25 - 34.


Abstract:


In mobile applications, it is crucial to provide intuitive means for
2D and 3D interaction. A large number of techniques exist to
support a natural user interface (NUI) by detecting the user's hand
posture in RGB+D (depth) data. Depending on the given
interaction scenario and its environmental properties, each
technique has its advantages and disadvantages regarding
accuracy and the robustness of posture detection. While the
interaction environment in a desktop setup can be constrained to
meet certain requirements, a handheld scenario has to deal with
varying environmental conditions. To evaluate the performance of
techniques on a mobile device, a powerful software framework
was developed that is capable of processing and fusing RGB and
depth data directly on a handheld device. Using this framework,
five existing hand posture recognition techniques were integrated
and systematically evaluated by comparing their accuracy under
varying illumination and background conditions. Overall, the results
reveal the best posture recognition rate for combined RGB+D data, at
the expense of update rate. To support users in choosing the
appropriate technique for their specific mobile interaction task, we
derived guidelines based on our study. In a last step, an experimental
study was conducted in which the detected hand postures were used to
perform the canonical 3D interaction tasks of selection and
positioning in a handheld mixed reality setup.
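The fusion of RGB and depth cues mentioned above can be illustrated with a minimal sketch. This is not the authors' framework; the function name, thresholds, and skin heuristic are purely illustrative assumptions, chosen only to show why combining both cues can be more robust than either alone:

```python
import numpy as np

def fuse_rgbd_hand_mask(rgb, depth, depth_near=0.2, depth_far=0.6, min_red=100):
    """Toy RGB+D fusion: keep pixels that are both near the camera
    (within a depth band) and roughly skin-toned (red-dominant).
    All thresholds are illustrative, not taken from the paper."""
    # Depth cue: the interacting hand is assumed to be the nearest object.
    depth_mask = (depth > depth_near) & (depth < depth_far)
    # Color cue: crude red-dominance heuristic on the RGB channels.
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    color_mask = (r > min_red) & (r > g) & (r > b)
    # Fusion: both cues must agree, which suppresses background pixels
    # that pass only one of the two tests.
    return depth_mask & color_mask

# Synthetic 4x4 frame with one "hand" pixel at (1, 1).
rgb = np.zeros((4, 4, 3), dtype=np.uint8)
rgb[1, 1] = (180, 120, 90)      # skin-like, red-dominant color
depth = np.full((4, 4), 2.0)    # background is far away
depth[1, 1] = 0.4               # hand pixel lies inside the near band

mask = fuse_rgbd_hand_mask(rgb, depth)
```

A depth-only mask would also accept near non-hand objects, and a color-only mask would accept skin-toned background regions; requiring agreement between the two is the simplest form of the robustness/update-rate trade-off the abstract describes.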