Program

The workshop was hosted at the CITEC building (Inspiration 1, 33619 Bielefeld). See the Location page for directions.

Tuesday, September 29th, 2015

08:30 – Foyer – Registration desk opens / Coffee
09:00 – CITEC Auditorium – Welcome
09:15 – CITEC Auditorium – Keynote by Jacob Lund Orquin: What eye tracking researchers (dis)agree about reporting
10:15 – Foyer – Coffee & Cake Break
10:45 – CITEC Auditorium – Session 01: Automatic Analysis

12:00 – Foyer – Lunch Break

13:30 – CITEC Auditorium – Session 02: Human-Computer Interaction
14:30 – Foyer & Labs – Poster & Demo Session + Coffee Break
16:00 – CITEC Auditorium – Keynote by Mark Williams: Visual search behaviour and expertise in high-performance environments
17:00 – Foyer – Coffee Break
17:15 – CITEC Auditorium – Session 03: Visual Expertise & Motor Performance

19:30 – Social Event

Wednesday, September 30th, 2015

09:15 – CITEC Auditorium – Keynote by Maria Staudte: Studying gaze in spoken interaction
10:15 – Foyer – Coffee & Cake Break
10:45 – CITEC Auditorium – Session 04: Methods for Measuring & Quantification

12:00 – Foyer – Lunch Break

13:30 – CITEC Auditorium – Session 05: Language & Memory
14:20 – Foyer – Coffee Break
14:30 – CITEC Auditorium – Session 06: Real World Studies & Applications

14:55 – CITEC Auditorium – Closing Session

20:00 – Optional social event for guests leaving the next day

Sessions

Session 01: Automatic Analysis

Presentation time: 20 minutes plus 5 minutes of discussion

  • GazeVideoAnalyser: A Modular Software Approach Towards Automatic Annotation of Gaze Videos by Kai Essig, Dato Abashidze, Manjunath Prasad and Thomas Schack
  • Semi-automatic annotation of eye-tracking recordings in terms of human torso, face and hands by Stijn De Beugher, Geert Brône and Toon Goedemé
  • Capturing and Visualizing Eye Movements in 3D Environments by Thies Pfeiffer, Cem Memili and Patrick Renner

Session 02: Human-Computer Interaction

Presentation time: 15 minutes plus 5 minutes of discussion

  • Toward implicit human-machine interaction: Single-trial classification of fixation-related potentials by Andrea Finke, Kai Essig, Giuseppe Marchioro and Helge Ritter
  • Online Visual Attention Monitoring for Mobile Assistive Systems by Patrick Renner and Thies Pfeiffer
  • Gaze Tracking for Human Robot Interaction by Oskar Palinko, Francesco Rea, Giulio Sandini and Alessandra Sciutti

Session 03: Visual Expertise & Motor Performance

Presentation time: 20 minutes plus 5 minutes of discussion

  • Maintenance of perceptual-cognitive expertise in volleyball under different time constraints by Lennart Fischer, Joseph Baker, Judith Tirp, Rebecca Rienhoff, Bernd Strauss and Jörg Schorer
  • Into the wild – Musical communication in ensemble playing. Discerning mutual and solitary gaze events in musical duos using mobile eye tracking by Sarah Vandemoortele, Stijn De Beugher, Geert Brône, Kurt Feyaerts, Toon Goedemé, Thomas De Baets and Stijn Vervliet

Session 04: Methods for Measuring & Quantification

Presentation time: 20 minutes plus 5 minutes of discussion

  • Analyzing Patterns of Eye Movements in Social Interactions by Nadine Pfeiffer-Leßmann, Patrick Renner and Thies Pfeiffer
  • A Metric to Quantify Shared Visual Attention in Two-Person Teams by Patrick Gontar and Jeffrey B. Mulligan
  • The impact of image size on eye movement parameters by Ricardo Ramos Gameiro, Kai Kaspar, Sontje Nordholt and Peter König

Session 05: Language & Memory

Presentation time: 20 minutes plus 5 minutes of discussion

  • Language and gaze cues: findings from the real world and the lab by Ross Macdonald, Maria Staudte and Benjamin Tatler
  • Collecting memories: the impact of active object handling on recall and search times by Dejan Draschkow and Melissa L.-H. Võ

Session 06: Real World Studies & Applications

Presentation time: 20 minutes plus 5 minutes of discussion

  • Gaze Analysis in Mobile Pedestrian Navigation: Culture Affects Wayfinding Styles by Lucas Paletta, Stephanie Schwarz, Jan Bobeth, Michael Schwarz and Manfred Tscheligi
  • Effect of Familiarity on Visual Attention and Choice at the Point of Purchase by Kerstin Gidlöf, Martin Lingonblad and Annika Wallin

Posters

The poster exhibition takes place in the corridors around the Virtual Reality Lab and the Robotics Lab.

  • Gaze interaction for multi-display systems using natural light eye-tracking by Onur Ferhat, Arcadi Llaza and Fernando Vilariño
  • Towards Using Eyetracking Data as Basis for Conversation Analysis on Real-World Museum Interactions by Raphaela Gehle, Antje Amrheim, Maximilian Krug and Karola Pitsch
  • Online Calibration of Gaze Tracking on the Computer Screen using Particle Filtering by Marc Halbrügge
  • Automated Comparison of Scanpaths in Dynamic Scenes by Thomas C. Kübler and Enkelejda Kasneci
  • The influence of spatial occlusion on visual search behavior of karate athletes by Simon Salb, Markus Splitt, Nicole Bandow and Kerstin Witte
  • Capturing gaze behavior patterns of surfers during surfboard riding: A pre-study in testing a water housing system for mobile eye tracking technology by Martin Walz, Paul Günther and Guido Ellert
  • Exploring the Relationship Between Motor Imagery, Action Observation, and Action Execution in Motor Skill Learning by Alessio D’Aquino, Cornelia Frank, Kai Essig and Thomas Schack

Technical Demonstrations

  • A Virtual Supermarket for Gaze-based Interaction Studies by Thies Pfeiffer and Patrick Renner – Virtual Reality Lab
  • GazeTk: Gaze-based Interaction with the WWW by Thies Pfeiffer, Dimitri Heil and Patrick Renner – Humanoids Lab
  • EyeSee3D: Real-time Gaze Analysis for Interactions in a 3D World by Thies Pfeiffer and Patrick Renner – Virtual Reality Lab
  • KogniChef – The smart kitchen by Alex Neumann and Andre Ückermann – Robotics Lab
  • The Art of Gaze: Beyond Pointillism by Dennis Wobrock and Andrea Finke – Foyer during Wednesday Coffee Breaks
  • Audible Eyed: Intelligent Chess Tutor by Mohd Atif, Thoren Huppke, Evangelos Mitsis, Christian Witte, Antonis Zaroulas, Kai Essig and Thies Pfeiffer – Humanoids Lab
  • GazeVideoAnalyzer – A Modular Approach Towards Automatic Annotation of Gaze Videos using Various Object Recognizers by Kai Essig – Humanoids Lab