Alexander Schäfer

(TU Kaiserslautern)
hosted by PhD Program in CS @ TU KL

"Improving Interactions for Immersive Environments with Novel Hand Gesture Authoring Tools"

Augmented reality (AR), virtual reality (VR), and mixed reality (MR) are on their way into everyday life. The recent emergence of consumer-friendly hardware to access this technology has greatly benefited the community. Research and application examples for AR, VR, and MR can be found in medicine, sports, cultural heritage, remote work, entertainment, gaming, and many other fields. Although the technology has been around for decades, immersive applications are still in their infancy. As manufacturers make these technologies more accessible by introducing consumer-grade hardware with natural input modalities such as eye gaze or hand tracking, new opportunities arise, but also new problems and challenges. Researchers strive to develop and investigate new techniques for dynamic content creation and novel interaction methods. It has yet to be determined which interactions users find intuitive. A major issue is that the possibilities for easy prototyping and rapid testing of new interaction techniques are limited and largely unexplored. In this thesis, various solutions are proposed with the aim of helping researchers and developers create novel applications with minimal effort. First, a survey is introduced that explores one of the largest and most promising application scenarios for AR, VR, and MR: remote collaboration. Based on the results of this initial survey, the thesis focuses on several important issues to be addressed when developing and creating such applications. At its core, the thesis is about rapid prototyping based on panorama images and the use of hand gestures for essential interactions. To this end, a technique to create immersive applications with panorama-based virtual environments, including hand gestures, is introduced. Next, the potential of hand gestures is investigated with a technique to capture and recognize static gestures and use them for locomotion tasks in VR. Additionally, it is investigated how laypeople adapt to the use of hand-tracking technology in this context. Finally, a framework to rapidly design, prototype, and implement arbitrary one-handed gestures is presented. In multiple user studies, the potential of the framework as well as the efficiency and usability of hand gestures are investigated.


Time: Monday, 20.06.2022, 16:00
Place: https://uni-kl-de.zoom.us/j/61420355262?pwd=NHdOUHdxeEl4RjRhU1YrdkZudnI4QT09

Download the appointment as an iCal file and import it into your calendar.