Interaction in VR

Gaze + Pinch in Action


The accompanying paper, co-written with Ken Pfeuffer, was published in the ACM Digital Library as part of the proceedings of the 5th ACM Symposium on Spatial User Interaction (SUI 2017).


You can find more of my publications on Google Scholar. There is also a full recording of the Gaze+Pinch talk given by my colleague Ken Pfeuffer at SUI 2017.

Basic 3D interaction with gaze + pinch to manipulate cubes in a VR setting.

Eye-Tracking and Gestures in VR


In my bachelor thesis, we proposed gaze + pinch, a new interaction technique that integrates manual input and eye tracking in VR.


Recent research on non-VR 3D interaction has produced many techniques combining manual and gaze-based input, and we propose that these could improve interaction in VR as well.


Manual input appears particularly promising for making interaction intuitive and immersive, but it makes interacting with remote objects difficult, a shortcoming that eye tracking addresses.

Immersive and Intuitive Interaction


The current state of the art in VR interaction is handheld controllers like those of the HTC Vive, shown here. But introducing foreign objects into a VR scene can break immersion, the most important property of any VR application.


This is especially true for applications where the user wouldn’t expect to hold something in their hands, like a social or climbing application.

HTC Vive headset and controllers. Image from ETC-USC on flickr.

A 3D living room scene, showing eye gaze + hand pinching interaction in VR. The image shows a picture-in-picture view of the user in the top right.

In a 3D living room scene, the user can rearrange their furniture by using gaze + pinch.

Expanding the Concept


Thus, we introduced a new concept that combines the advantages of gesture interaction with those of eye tracking.


One example application we built is a living room scene, in which the user can use gaze + pinch to arrange their furniture to their liking.


In this scene, we demonstrate how one can easily use eye gaze for selection (indicated by the grey circle) in combination with gesture interaction for positioning.
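The core of this interaction can be sketched as a small state machine: at the moment the pinch begins, the object currently under the user's gaze is grabbed; while the pinch is held, the object follows the hand's movement (independently of where the eyes look next); releasing the pinch drops the object. The following is a minimal, hypothetical Python sketch of that logic, not the actual thesis implementation; all names and types are illustrative.

```python
from dataclasses import dataclass


@dataclass
class Vec3:
    """Minimal 3D vector, stand-in for a real engine's vector type."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

    def __add__(self, o): return Vec3(self.x + o.x, self.y + o.y, self.z + o.z)
    def __sub__(self, o): return Vec3(self.x - o.x, self.y - o.y, self.z - o.z)


@dataclass
class SceneObject:
    name: str
    position: Vec3


class GazePinchController:
    """Gaze selects, pinch manipulates (illustrative sketch)."""

    def __init__(self):
        self.held = None        # object grabbed at pinch start, if any
        self.last_hand = None   # hand position on the previous frame

    def update(self, gazed_object, pinching, hand_pos):
        """Call once per frame with current gaze target, pinch state, hand position."""
        if pinching and self.held is None and gazed_object is not None:
            # Pinch just started: grab whatever the eyes are pointing at.
            self.held = gazed_object
            self.last_hand = hand_pos
        elif pinching and self.held is not None:
            # Pinch held: the object follows the hand's movement,
            # regardless of where the user looks now.
            delta = hand_pos - self.last_hand
            self.held.position = self.held.position + delta
            self.last_hand = hand_pos
        elif not pinching:
            # Pinch released: drop the object.
            self.held = None
            self.last_hand = None
```

In the living-room example, the user would look at a couch, pinch to grab it, move their hand half a metre to the side, and release: the couch ends up translated by that same hand movement.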