Gaze mapping for mobile video-based eye-tracking

Are you passionate about bridging the gap between AI and real-world applications? Join us at the Lincoln Centre for Autonomous Systems (L-CAS) to work on an exciting project that applies computer vision techniques to advanced visual-attention analysis of mobile eye-tracking footage.

In this project, you will develop a computer vision-based system that maps gaze locations across a sequence of frames extracted from eye-tracking footage recorded with mobile eye-tracking glasses in a dynamic scene. The gaze-mapping problem can be solved in 2D (based only on image matching) or in 3D (using SLAM or structure from motion). In either case, the final goal is to generate a heatmap representing the evolution of gaze positions within a specific time window.
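To give a flavour of the 2D variant, the sketch below shows one plausible pipeline, assuming OpenCV is available: per-frame gaze points are mapped onto a reference frame via ORB feature matching and a RANSAC-estimated homography, then accumulated into a Gaussian-blurred heatmap. The function names, parameter values, and the assumption that gaze coordinates are given in frame pixel coordinates are illustrative placeholders, not a prescribed implementation.

# Minimal 2D gaze-mapping sketch (illustrative only): project per-frame gaze
# points onto a reference frame via ORB matching + homography, then build a
# Gaussian-blurred heatmap. Gaze coordinates are assumed to be frame pixels.
import cv2
import numpy as np

def map_gaze_to_reference(ref_img, frame_img, gaze_xy):
    """Estimate a homography frame -> reference and warp a single gaze point."""
    orb = cv2.ORB_create(2000)
    kp_ref, des_ref = orb.detectAndCompute(ref_img, None)
    kp_frm, des_frm = orb.detectAndCompute(frame_img, None)
    if des_ref is None or des_frm is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_frm, des_ref), key=lambda m: m.distance)[:200]
    if len(matches) < 4:
        return None
    src = np.float32([kp_frm[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None
    pt = np.float32([[gaze_xy]])                  # shape (1, 1, 2) for perspectiveTransform
    return cv2.perspectiveTransform(pt, H)[0, 0]  # gaze point in reference-frame coordinates

def accumulate_heatmap(shape_hw, mapped_points, sigma=25):
    """Accumulate mapped gaze points over a time window into a normalised heatmap."""
    heat = np.zeros(shape_hw, dtype=np.float32)
    for x, y in mapped_points:
        xi, yi = int(round(x)), int(round(y))
        if 0 <= yi < shape_hw[0] and 0 <= xi < shape_hw[1]:
            heat[yi, xi] += 1.0
    heat = cv2.GaussianBlur(heat, (0, 0), sigma)
    return heat / heat.max() if heat.max() > 0 else heat

A 3D solution would replace the homography step with camera poses from SLAM or structure from motion, but the heatmap accumulation stage would remain conceptually the same.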

You are expected to evaluate the performance of your proposed system on footage from eye-tracking glasses worn by strawberry pickers working under both controlled and real-world conditions.

Required Skills

  • Strong programming skills in Python
  • Good understanding of computer vision techniques
  • Good understanding of Artificial Neural Networks
  • Familiarity with Docker and GitHub tools
  • Excellent communication abilities

Desirable Skills

  • Familiarity with SLAM and/or structure from motion algorithms
  • Familiarity with image matching algorithms

What We Offer

  • Hands-on experience with cutting-edge AI technologies
  • Integration into the dynamic L-CAS research team
  • The opportunity to participate in scientific writing and co-author a research paper

This is an internship position suitable for students pursuing a programme of study in computer science, robotics and/or AI at the University of Lincoln. If you are interested, fill out our Expression of Interest Form and select Dr Leonardo Guevara (lguevara@lincoln.ac.uk) as the researcher to supervise the project.