
10.2 Gaze-based and eye-tracking interactions

3 min read • August 7, 2024

Gaze-based interactions and eye-tracking are game-changers in AR/VR. They let you control things just by looking, making everything feel more natural and immersive. It's like your eyes become the mouse cursor, opening up a whole new world of possibilities.

These techniques aren't just cool - they're super practical too. They can help boost performance by only rendering high-quality graphics where you're actually looking, saving precious processing power. It's like your device reads your mind, giving you exactly what you need, when you need it.

Eye Tracking Fundamentals

Eye Tracking Technology

  • Eye tracking measures and records eye movements to determine where a person is looking
  • Involves using specialized hardware (eye trackers) and software algorithms to detect and track the position and movement of the eyes
  • Enables the analysis of fixations, saccades, and other eye behavior in real time or post hoc
  • Applications span various fields including human-computer interaction, psychology, marketing research, and medical diagnostics

Gaze Estimation Techniques

  • Gaze estimation determines the direction and point of gaze based on the position and orientation of the eyes
  • Pupil detection is a common method that identifies the location and size of the pupil in the eye image
    • Utilizes computer vision algorithms to detect the dark circular region of the pupil
    • Challenges include variations in lighting conditions, eye color, and occlusions (eyelashes, glasses)
  • Corneal reflection tracking uses infrared light sources to create reflections on the cornea (glints)
    • Analyzes the relative positions of the pupil center and glints to estimate gaze direction
    • Provides higher accuracy and robustness compared to pupil-only methods (a minimal detection sketch follows this list)
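
As a rough illustration of these two detection steps, here is a minimal NumPy-only sketch that takes the pupil center as the centroid of the darkest pixels and the glint as the centroid of the brightest pixels in an infrared eye image. The thresholds and the pupil_glint_vector helper are illustrative assumptions, not part of any particular tracker's API.

```python
# Minimal sketch of dark-pupil and glint detection in a grayscale IR eye image.
# NumPy only; the thresholds are illustrative and not tuned for real hardware.
import numpy as np

def detect_pupil_and_glint(eye_image, pupil_thresh=40, glint_thresh=220):
    """Return (pupil_center, glint_center) as (row, col) coordinates, or None."""
    # Dark-pupil detection: under IR illumination the pupil is the darkest region,
    # so take the centroid of all pixels below the dark threshold.
    pupil_mask = eye_image < pupil_thresh
    pupil_center = np.argwhere(pupil_mask).mean(axis=0) if pupil_mask.any() else None

    # Glint detection: corneal reflections of the IR LEDs are the brightest pixels.
    glint_mask = eye_image > glint_thresh
    glint_center = np.argwhere(glint_mask).mean(axis=0) if glint_mask.any() else None
    return pupil_center, glint_center

def pupil_glint_vector(pupil_center, glint_center):
    """The pupil-glint vector is the feature typically mapped to gaze direction."""
    return np.asarray(pupil_center) - np.asarray(glint_center)
```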

Eye-Tracking Calibration Process

  • Eye-tracking calibration is a procedure to establish a mapping between the eye tracker's coordinate system and the user's gaze coordinates
  • Involves displaying calibration targets at known positions on the screen and recording the corresponding eye positions
  • User is instructed to fixate on each target while the eye tracker captures eye data
  • Calibration data is used to calculate a transformation matrix that converts eye tracker coordinates to gaze coordinates (a least-squares fitting sketch follows this list)
  • Ensures accurate and reliable gaze tracking by accounting for individual differences in eye geometry and positioning
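
The mapping from eye tracker coordinates to gaze coordinates is often fit from the recorded calibration samples by least squares. Below is a minimal sketch assuming a simple affine (linear plus offset) mapping; real systems typically use higher-order polynomial or model-based fits, and the point values here are made up for illustration.

```python
# Minimal sketch of calibration: fit an affine mapping from eye-tracker
# coordinates to screen gaze coordinates with least squares.
import numpy as np

def calibrate(eye_points, screen_points):
    """eye_points, screen_points: (N, 2) arrays recorded while the user fixates
    on N calibration targets at known screen positions. Returns a (3, 2) matrix."""
    ones = np.ones((len(eye_points), 1))
    A = np.hstack([eye_points, ones])               # add a bias column
    M, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
    return M                                        # maps [ex, ey, 1] -> [sx, sy]

def to_screen(M, eye_point):
    return np.append(eye_point, 1.0) @ M

# Illustrative 5-point calibration on a 1920x1080 display
eye = np.array([[0.1, 0.1], [0.9, 0.1], [0.5, 0.5], [0.1, 0.9], [0.9, 0.9]])
scr = np.array([[0, 0], [1920, 0], [960, 540], [0, 1080], [1920, 1080]])
M = calibrate(eye, scr)
print(to_screen(M, [0.5, 0.5]))                     # ~ [960. 540.]
```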

Gaze-based Interaction

Dwell Time and Gaze-Contingent Displays

  • Dwell time refers to the duration of a fixation on a specific area of interest
    • Used as a trigger for gaze-based interactions, such as selections or activations
    • Dwell time thresholds can be adjusted based on the application and user preferences (a minimal dwell-trigger sketch follows this list)
  • Gaze-contingent displays adapt the content or functionality based on the user's gaze location
    • Examples include gaze-based menu navigation, gaze-controlled scrolling, or gaze-driven level of detail rendering
    • Enhances user experience by reducing the need for manual input and providing intuitive interaction
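
A dwell-based trigger can be implemented as a small state machine that tracks how long the gaze stays inside a target's bounding box. The sketch below is a hypothetical example: the DwellSelector class, the 0.8 second threshold, and the rectangle-based targets are illustrative assumptions rather than a standard API.

```python
# Minimal sketch of a dwell-time selection trigger; the DwellSelector class,
# the 0.8 s threshold, and rectangle targets are illustrative, not a standard API.
import time

class DwellSelector:
    def __init__(self, dwell_threshold=0.8):        # seconds the gaze must stay on a target
        self.dwell_threshold = dwell_threshold
        self.current_target = None
        self.dwell_start = None

    def update(self, gaze_point, targets):
        """targets: dict of name -> (x, y, w, h) rectangles in screen pixels.
        Returns a target name once the gaze has dwelled on it long enough."""
        hit = next((name for name, (x, y, w, h) in targets.items()
                    if x <= gaze_point[0] <= x + w and y <= gaze_point[1] <= y + h),
                   None)
        now = time.monotonic()
        if hit != self.current_target:               # gaze moved to a new (or no) target
            self.current_target, self.dwell_start = hit, now
            return None
        if hit is not None and now - self.dwell_start >= self.dwell_threshold:
            self.dwell_start = now                   # restart so the target is not re-fired every frame
            return hit
        return None
```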

Eye Movement Metrics

  • Fixations are periods of relatively stable gaze on a specific point or region
    • Indicate areas of visual attention and cognitive processing
    • Fixation duration and frequency provide insights into the user's interest, engagement, and cognitive load
  • Saccades are rapid eye movements between fixations
    • Occur when the eyes move from one point of interest to another
    • Characterized by high velocity and short duration (typically 30-80 milliseconds)
    • Saccade amplitude and direction can reveal patterns of visual exploration and search strategies (a simple velocity-threshold classifier is sketched after this list)
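
Fixations and saccades are commonly separated with a velocity-threshold (I-VT) style classifier: any inter-sample angular velocity above a threshold is labeled a saccade, everything else a fixation. The sketch below assumes gaze angles in degrees and a 30 deg/s threshold, both illustrative.

```python
# Minimal sketch of velocity-threshold (I-VT) classification of gaze samples.
# Gaze angles in degrees and the 30 deg/s threshold are illustrative assumptions.
import numpy as np

def classify_samples(gaze_deg, timestamps, saccade_threshold=30.0):
    """gaze_deg: (N, 2) gaze angles in degrees; timestamps: (N,) seconds.
    Returns one 'fixation'/'saccade' label per inter-sample interval."""
    deltas = np.diff(np.asarray(gaze_deg), axis=0)       # angular displacement per step
    dt = np.diff(np.asarray(timestamps))
    velocity = np.linalg.norm(deltas, axis=1) / dt       # degrees per second
    return ["saccade" if v > saccade_threshold else "fixation" for v in velocity]
```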

Performance Optimization

Foveated Rendering Technique

  • Foveated rendering is a technique that adapts the rendering quality based on the user's gaze
  • Exploits the fact that human visual acuity is highest in the central region of the retina (fovea) and decreases towards the periphery
  • Renders the area around the user's gaze point in high detail while progressively reducing the quality in the peripheral regions
  • Reduces computational load and improves rendering performance by allocating more resources to the foveated region
  • Enables higher frame rates, lower latency, and improved visual quality in gaze-contingent applications (virtual reality, gaming); a simple eccentricity-based sketch follows this list
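
One simple way to drive foveated rendering is to pick a shading or level-of-detail tier for each screen tile from its angular distance (eccentricity) to the current gaze point. The zone radii and pixels-per-degree value below are illustrative assumptions; production systems use smoother falloff curves and hardware variable-rate shading.

```python
# Minimal sketch of a foveated quality falloff: choose a shading/LOD tier per
# screen tile from its eccentricity relative to the gaze point. Zone radii and
# the pixels-per-degree value are illustrative assumptions.
import math

def quality_level(tile_center, gaze_point, pixels_per_degree=40.0):
    """Return 0 (full resolution) near the fovea; higher numbers mean coarser shading."""
    dx = tile_center[0] - gaze_point[0]
    dy = tile_center[1] - gaze_point[1]
    eccentricity_deg = math.hypot(dx, dy) / pixels_per_degree
    if eccentricity_deg < 5.0:       # foveal zone: native resolution
        return 0
    elif eccentricity_deg < 15.0:    # parafoveal zone: reduced resolution
        return 1
    else:                            # periphery: lowest resolution
        return 2
```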