Technical Demos of SAGA 2013

During the workshop, several technical demonstrations of systems for automatic or semi-automatic gaze analysis will be given.

Demonstrations of Research Projects

InSight Out: Automatic object recognition and person detection for mobile eye-tracking data analysis

Presenters: Stijn De Beugher, Geert Brône, Toon Goedemé (ESAT, EAVISE KU LEUVEN)

EyeSee3D – A low-cost approach to gaze analysis in natural environments

EyeSee3D is a solution for gaze analysis based on a proxy-geometry model, an approach that significantly reduces the annotation effort. It can be applied to small setups, such as table-top interactions, as well as to larger installations. EyeSee3D uses Augmented Reality technology to provide gaze analysis in real time on a mobile setup.
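The core idea of such a proxy-geometry approach can be sketched as a ray-geometry intersection: the gaze ray, expressed in the coordinate frame established by the AR markers, is intersected with a simplified stand-in geometry of the environment. A minimal sketch for a table-top plane (names, frames, and numbers are illustrative, not EyeSee3D's actual API):

```python
import numpy as np

def intersect_ray_plane(origin, direction, plane_point, plane_normal):
    """Return the 3D point where the gaze ray hits the proxy plane, or None."""
    denom = np.dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None  # gaze ray runs parallel to the plane
    t = np.dot(plane_point - origin, plane_normal) / denom
    if t < 0:
        return None  # the plane lies behind the viewer
    return origin + t * direction

# Table-top proxy: the z = 0 plane; gaze from 50 cm above, looking straight down.
hit = intersect_ray_plane(np.array([0.1, 0.2, 0.5]),
                          np.array([0.0, 0.0, -1.0]),
                          np.array([0.0, 0.0, 0.0]),
                          np.array([0.0, 0.0, 1.0]))
```

The intersection point can then be looked up against annotated regions of the proxy model, which is what replaces frame-by-frame manual annotation.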

INKA-SUITE: An integrated test-environment for analyzing chat communication

Presenter: Philipp Schlieker-Steens (University of Applied Sciences and Arts Dortmund)
Project Webpage: INKA Homepage

Automatic gaze analysis is a hot topic not only in mobile eye tracking but also in the design of software interfaces in information science. Software interfaces are highly dynamic, so a per-screen analysis is often not adequate. The INKA-Suite is a rapid-prototyping framework for evaluating software solutions for cooperative work; it includes eye tracking on dynamic AOIs that are generated automatically for INKA-GUI elements.
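The dynamic-AOI idea can be illustrated with a few lines of code: AOI rectangles are regenerated from the current bounds of the GUI widgets, and each gaze sample is classified against them. A minimal sketch (the class and widget names are illustrative assumptions, not the actual INKA-Suite API):

```python
from dataclasses import dataclass

@dataclass
class AOI:
    """An area of interest derived from a GUI element's current bounds."""
    name: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, gx, gy):
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h

def classify_sample(gx, gy, aois):
    """Return the name of the AOI under the gaze point, or None."""
    for aoi in aois:
        if aoi.contains(gx, gy):
            return aoi.name
    return None

# AOIs would be regenerated whenever the layout changes, e.g. for a chat window:
aois = [AOI("message_list", 0, 0, 400, 500),
        AOI("input_field", 0, 500, 400, 60)]
classify_sample(120, 530, aois)  # -> "input_field"
```

Because the AOIs follow the widget geometry, a scrolling message list or a resized input field keeps its semantic label without manual re-annotation.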

Location-based Online Identification of Objects in the Centre of Visual Attention using Eye Tracking

Presenters: Kai Harmening, Thies Pfeiffer (SFB 673, Artificial Intelligence Group)

A computer-vision-based approach to fixation classification: the content of the fixated area is matched against a database to identify the object of interest. Information about the location of the tracked person and the topology of the environment is used to make tracking more robust and efficient. The approach is suitable for gaze analysis on shelves, but also supports moving objects.
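The matching step can be sketched as a nearest-neighbour search over image descriptors: the fixated patch is summarised as a feature vector and compared against reference descriptors of known objects. A toy sketch using colour histograms as stand-in descriptors (the descriptor choice and object names are illustrative assumptions, not the presented system):

```python
import numpy as np

def identify_object(fix_descriptor, database):
    """Match a descriptor of the fixated image patch against known objects.

    database: dict mapping object name -> reference descriptor (1-D array).
    Returns the best-matching object name and its distance.
    """
    best_name, best_dist = None, float("inf")
    for name, ref in database.items():
        dist = np.linalg.norm(fix_descriptor - ref)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name, best_dist

# Toy database of colour-histogram descriptors for shelf products:
db = {"cereal": np.array([0.8, 0.1, 0.1]),
      "juice":  np.array([0.1, 0.7, 0.2])}
name, dist = identify_object(np.array([0.75, 0.15, 0.10]), db)  # best: "cereal"
```

In the spirit of the described system, the person's location could be used to restrict the database to objects plausibly in view, which shrinks the search space and reduces false matches.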

Mobile Eye Tracking paired with a mobile EEG solution

Presenter: Andrea Finke (CITEC)

Andrea Finke demonstrates how eye tracking can be combined with the EMOTIV EEG system.

Object recognition and tracking solutions from the CITEC Robotics Group

Presenter: Andre Ückermann (SFB 673, CITEC)

We present a real-time algorithm that segments unstructured and highly cluttered scenes. The algorithm robustly separates objects of unknown shape in congested scenes of stacked and partially occluded objects. Using a depth image from a Kinect camera, the model-free approach finds smooth surface patches, which are subsequently combined into highly probable object hypotheses. The existing deictic gesture selection of objects in the scene can be replaced by gaze-based approaches.
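The patch-finding step can be illustrated with a heavily simplified sketch: neighbouring depth pixels whose values change smoothly are grouped into one surface patch (the real algorithm works on surface normals and merges patches into object hypotheses; this depth-only flood fill is an illustrative reduction, not the CITEC implementation):

```python
import numpy as np
from collections import deque

def segment_depth(depth, max_step=0.02):
    """Group neighbouring pixels whose depth difference is below max_step
    into one surface patch via breadth-first flood fill."""
    labels = np.full(depth.shape, -1, dtype=int)
    next_label = 0
    h, w = depth.shape
    for sy in range(h):
        for sx in range(w):
            if labels[sy, sx] != -1:
                continue  # pixel already belongs to a patch
            labels[sy, sx] = next_label
            queue = deque([(sy, sx)])
            while queue:
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w and labels[ny, nx] == -1
                            and abs(depth[ny, nx] - depth[y, x]) < max_step):
                        labels[ny, nx] = next_label
                        queue.append((ny, nx))
            next_label += 1
    return labels

# Two flat surfaces at different depths yield two patches:
depth = np.array([[1.0, 1.0, 1.5, 1.5],
                  [1.0, 1.0, 1.5, 1.5]])
labels = segment_depth(depth)
```

A gaze-based selection would then amount to reading the patch label under the mapped fixation point, which is what lets gaze replace the deictic pointing gesture.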

Videos of real-time segmentation:
Videos of deictic gesture selection:

Commercial Demonstrations


Presenter: Ingmar Gutberlet of SensoMotoric Instruments (SMI)

SMI presents its gaze-analysis solution BeGaze.

Eye Tracking paired with CAPTIV wireless physiological sensors

Presenter: Stephane Folley of TEA

TEA presents eye tracking combined with wearable sensors for offline analysis, e.g. in the context of ergonomics.

– A Modular Approach Towards Automatic Annotation of Gaze Videos

Presenters: Mediablix IIT GmbH

Mediablix presents a solution for video analysis of mobile eye tracking videos.