Call for Papers

4th International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI 2014)
in conjunction with UbiComp 2014

You are cordially invited to submit original work to the PETMEI 2014 workshop. The workshop will be held in Seattle on September 13th, 2014.

Location: Seattle, United States
Date: September 13th, 2014

IMPORTANT DATES

  • Abstract Submission: June 3, 2014
  • Paper Submission: June 10, 2014
  • Notification of Acceptance: June 24, 2014
  • Camera-ready due: July 1, 2014
  • Workshop: September 13, 2014

VISION AND GOALS

Despite considerable advances over the last decades, previous work on eye tracking and eye-based human-computer interfaces has mainly considered the use of the eyes in traditional desktop settings. Recent developments in remote and head-mounted eye tracking equipment and automated eye movement analysis point the way toward unobtrusive eye-based human-computer interfaces that will become pervasively usable in everyday life. With the growing interest in smart glasses and low-cost eye trackers, gaze-based techniques for mobile computing have become increasingly important in recent years. We call this new paradigm pervasive eye tracking: continuous eye monitoring and analysis, 24/7.

The potential applications for the ability to track and analyse eye movements anywhere and at any time call for new research to further develop and understand visual behaviour and eye-based interaction in daily-life settings. PETMEI 2014 will focus on pervasive eye tracking as a trailblazer for mobile eye-based interaction and eye-based context-awareness. We provide a forum for researchers from human-computer interaction, context-aware computing, egocentric computer vision, and eye tracking to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, and new applications for pervasive eye tracking in ubiquitous computing. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on pervasive eye tracking.

TOPICS

Topics of interest cover computational methods, new applications and use cases, as well as eye tracking technology for pervasive eye tracking and mobile eye-based interaction. They include, but are not limited to:

Methods

We invite participants to reflect on the specific characteristics of pervasive eye tracking systems and to contrast them with classical methods for eye tracking, eye movement analysis, eye-based interaction, and evaluation. We welcome contributions reporting on methodological advances in all components of mobile eye tracking systems, and the workshop will also cover the latest technological advances in mobile eye tracking equipment.

  • Eye tracking technologies for mobile devices
  • Tools for face, eye detection and tracking
  • Gaze and eye movement analysis methods
  • Integration of pervasive eye tracking and context-aware computing
  • Multi-modal sensor fusion
  • User studies on pervasive eye tracking
  • Devices for portable, wearable and ambient eye tracking

Applications

In addition to contributions reporting on methodological advances, we also want to attract submissions that explore innovative applications of pervasive eye tracking and mobile eye-based interaction. We particularly invite presentations on egocentric vision systems and gaze-related computer vision applications that can potentially extend the possibilities of current mobile gaze interaction.

  • Pervasive eye-based interaction
  • Mobile attentive user interfaces
  • Eye-based activity and context recognition
  • Security and privacy for pervasive eye-tracking systems
  • Eye tracking for specialized application areas
  • Eye-based human-robot and human-agent interaction
  • Cognition-aware systems and user interfaces
  • Human factors in mobile eye-based interaction
  • Egocentric computer-vision systems and applications

SUBMISSION GUIDELINES

Prospective authors should submit papers of 6-12 pages in the SIGCHI non-archival (Extended Abstracts) format. In addition to research papers, we explicitly invite submissions of position papers and papers that describe preliminary results or work in progress. Manuscripts will be reviewed by at least two reviewers. Accepted papers will be published in the UbiComp 2014 adjunct proceedings.

Templates

The format for submissions has changed; please use the SIGCHI Extended Abstracts template:

  • LaTeX: http://www.sigchi.org/publications/chipubform/sigchi-extended-abstracts-latex-template/view
  • Word: http://www.sigchi.org/publications/chipubform/sigchi-extended-abstracts-word-template/view

Submission Website

Please visit our website http://2014.petmei.org/submissions/ for regular updates.

ORGANIZERS

  • Thies Pfeiffer, Center of Excellence Cognitive Interaction Technology, Bielefeld University, Germany
  • Sophie Stellmach, Microsoft Corporation, USA
  • Yusuke Sugano, The University of Tokyo, Japan

PROGRAM COMMITTEE

PETMEI 2014 is supported by the following program committee members:

  • Andreas Bulling, Max Planck Institute for Informatics, DE
  • Andrew T. Duchowski, Clemson University, USA
  • Alireza Fathi, Stanford University, USA
  • Dan Witzner Hansen, IT University of Copenhagen, DK
  • Kris M. Kitani, Carnegie Mellon University, USA
  • Päivi Majaranta, University of Tampere, FI
  • Lucas Paletta, Joanneum, AT
  • Pernilla Qvarfordt, FX Palo Alto Laboratory, US
  • Lech Swirski, University of Cambridge, UK
  • Takumi Toyama, DFKI, DE

CONTACT AND FURTHER INFORMATION

For further information, please visit our website or send us an email.
