Hand Tracking
xtan explores hand tracking systems in which stereo vision, spatial tracking, and geometry-aware sensing support gesture interaction, spatial input, and motion-aware interfaces.
Tracking hand movement in 3D space
Hand tracking systems detect and analyze hand motion within spatial environments. They are used for XR interaction, gesture interfaces, robotics control, and experimental human-computer interaction workflows.
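As a hedged illustration of how stereo vision feeds such systems (this is a generic textbook sketch, not xtan's actual pipeline, and every camera parameter below is an assumption), a tracked landmark such as a fingertip can be back-projected into 3D from the pixel disparity between two rectified cameras:

```python
# Minimal sketch: recovering a fingertip's 3D position from a
# rectified stereo pair via disparity. All numeric values are
# illustrative assumptions, not real calibration data.

def triangulate(u_left, u_right, v, focal_px, baseline_m, cx, cy):
    """Back-project a matched pixel pair into camera-space metres."""
    disparity = u_left - u_right           # pixels; larger = closer
    z = focal_px * baseline_m / disparity  # depth along the optical axis
    x = (u_left - cx) * z / focal_px       # lateral offset from centre
    y = (v - cy) * z / focal_px            # vertical offset from centre
    return (x, y, z)

# Example: 800 px focal length, 6 cm baseline, 12 px disparity.
point = triangulate(u_left=652, u_right=640, v=360,
                    focal_px=800.0, baseline_m=0.06, cx=640, cy=360)
# depth = 800 * 0.06 / 12 = 4.0 m
```

Real systems add calibration, rectification, and landmark matching before this step; the depth formula itself is the standard pinhole-stereo relation.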
Potential for spatial interaction systems
Many interaction systems explore replacing traditional input devices with natural hand motion. Geometry-aware sensing can support experimental workflows in which spatial hand movement becomes an input method for digital tools, interfaces, and interactive environments.
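One common way hand movement becomes an input method is to threshold distances between tracked 3D landmarks. A minimal sketch (the landmark choice and the 3 cm threshold are illustrative assumptions, not part of any specific product) detecting a pinch gesture:

```python
import math

# Sketch: treat a thumb-tip / index-tip distance below a threshold
# as a "pinch" input event. Positions are in metres; the 3 cm
# threshold is an illustrative assumption.
PINCH_THRESHOLD_M = 0.03

def is_pinching(thumb_tip, index_tip):
    """Return True when two 3D landmarks are closer than the threshold."""
    return math.dist(thumb_tip, index_tip) < PINCH_THRESHOLD_M

open_hand = is_pinching((0.00, 0.00, 0.40), (0.08, 0.02, 0.40))  # ~8 cm apart
pinch     = is_pinching((0.00, 0.00, 0.40), (0.01, 0.00, 0.40))  # 1 cm apart
```

Production gesture recognizers typically add temporal smoothing and hysteresis so a gesture does not flicker on and off at the threshold boundary.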
Why xtan may be relevant
xtan focuses on stereo vision, geometry-first interaction, and practical spatial systems. In hand tracking environments, this focus may support gesture-driven interfaces, motion-aware applications, and experimental spatial interaction.