Design and Evaluation of User-Centric Gestures for Locomotion in HMD-VR Interfaces for a Seated Position
Virtual Reality (VR) systems immerse users within 3D computer-generated virtual environments (VEs) with the aim of creating a natural experience. This thesis investigates body gestures as a medium of interaction for locomotion in multitasking VEs while seated. The research aims to identify and evaluate natural, intuitive gestures for seated locomotion across three different multitasking scenarios. It presents the results of five user studies conducted to develop and evaluate three new controller-less, gesture-based locomotion techniques for multitasking VEs that can be performed while seated. The first study used the gesture elicitation method to generate three sets of user-centric locomotion gestures suited to different multitasking VEs. The second study evaluated these gestures for appropriateness, ease of use, effort, and user preference, and further classified them by hand usage and geometric taxonomy. Three new gesture-based techniques, called the Calling gesture, Deictic Pointing gesture, and Mirror-Leaning gesture, were designed and developed for virtual locomotion. Three comparative studies then compared these gestures with techniques from the literature, such as tapping and teleportation, evaluating them on task completion time, accuracy, intuitiveness, performance, comfort, ease of use, perceived workload, spatial knowledge, presence, simulation sickness, and user preference.
Supervisor: Sorathia, Keyur Babulal
HCI, Virtual Reality, Gesture Based Interaction Techniques