7 DECEMBER - 11AM

The recent emergence of eye tracking for XR headsets, especially for VR, opens up new avenues for scientific research & development and for commercial applications & services. This new capability is either an integral part of advanced headsets or is provided via add-on inserts.

This one-of-a-kind Thematic Conference gives you a panorama of current, future, and blue-sky ideas for the use of eye tracking, and it emphasizes that a camera looking at the eyes makes it possible not only to determine the gaze direction, but also to access, to some degree, the user’s brain, emotions, and psyche. Among other topics, this conference discusses the application of eye tracking and analysis to merchandising in stores, and to nonlinear, stretchy storytelling in entertainment, including 3D cinematic VR.

Sponsoring

International companies may find it strategically useful to sponsor or adopt this innovative, straight-to-the-point conference.

If this topic is of interest to you, you can support it by adopting it:
Adopt this thematic conference
For more information, please contact Alain Gallez.

Program

7 DECEMBER - 11AM

With a nominal duration of 1.5 hours, and with presentations and a panel by a handful of world experts, this short & dynamic professional conference will efficiently bring you up to speed on this theme and its opportunities.

Chairpersons: Victor Fajnzylber (University of Chile, Chile), David Grogna (University of Liège, Belgium), & Jacques G. Verly (University of Liège & Stereopsia, Belgium)    

Presentations

- Eye-tracking studies for understanding immersion - Victor Fajnzylber (University of Chile, Chile)
- Results of studies in eye tracking - Samuel Madariaga (University of Chile, Chile)
- Head-eye tracking and smart glasses - Gerrit Spaas (Trivisio, Luxembourg)
- Modeling the eye tracking gaze trajectories as an information channel - Mateu Sbert (Tianjin University, China; Girona University, Spain)
- What can one extract in real-time from images of the eye area to characterize a person's physiological & cognitive states, with application to XR? - Clémentine François (Phasya, Belgium)
- Active visual cognition in XR: lessons from cinema and real-world eye movements - Tim J. Smith (University of London, UK)

Panel discussion and Q&As

With all speakers above

Important: All information above is subject to change at any time. Please stay tuned.

Special thanks to our sponsors and partners