Head and Body Pose Estimation using Ultrasound Sensing on Smart Glasses

Published in Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA), 2026

Recommended citation: Ke Li, Robin Kips, Filippo Arcadu, Marek Schikora, John Ho, Chris Graf, Jim Chen, Xingxing Cai, Gawsalyan Sivapalan, Ariyan Zarei, and Yan Deblangey. 2026. Head and Body Pose Estimation using Ultrasound Sensing on Smart Glasses. In Extended Abstracts of the 2026 CHI Conference on Human Factors in Computing Systems (CHI EA). ACM, New York, NY, USA, 6 pages. https://doi.org/10.1145/3772363.3798385

Presentation video

April 13–17, 2026, Barcelona, Spain
Keywords: Ultrasound Sensing (USS), Body Pose Estimation, Head Pose Estimation, Smart Glasses


Tracking full-body pose enables applications in avatar animation, virtual gaming, fitness, and hands-free digital interaction. In this paper, we present a novel technique that uses ultrasound sensors on smart glasses to continuously and accurately track a user's head and body pose. Ultrasound sensors are well suited to smart glasses because of their small size (a few millimeters), light weight (tens of milligrams), and low power consumption (< 1 mW), outperforming cameras in all of these aspects. Unlike IMU-based approaches, our method enables full-body tracking without additional body-mounted sensors. We validated our method in a user study with 13 participants, achieving a Mean Per Joint Position Error (MPJPE) of 7.29 cm and a Mean Per Joint Rotation Error (MPJRE) of 17.44° across the 24 SMPL body joints using user-independent models. For head tracking relative to the body, our technique achieved a position error of 4.59 cm and an angular error of 13.47°.
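For readers unfamiliar with the evaluation metrics, the sketch below shows how MPJPE and MPJRE are commonly computed over the 24 SMPL joints. This is a minimal, hedged illustration assuming predicted and ground-truth joint positions (in cm) and joint rotation matrices are available as NumPy arrays; the paper's exact evaluation protocol (e.g., any root alignment) may differ.

```python
import numpy as np

def mpjpe(pred_pos, gt_pos):
    """Mean Per Joint Position Error: mean Euclidean distance over joints and frames.

    pred_pos, gt_pos: arrays of shape (frames, 24, 3), assumed to be in cm.
    """
    return np.linalg.norm(pred_pos - gt_pos, axis=-1).mean()

def mpjre(pred_rot, gt_rot):
    """Mean Per Joint Rotation Error: mean geodesic angle (degrees) between
    predicted and ground-truth joint rotation matrices.

    pred_rot, gt_rot: arrays of shape (frames, 24, 3, 3).
    """
    # Relative rotation R_pred @ R_gt^T for each joint in each frame.
    rel = np.einsum('...ij,...kj->...ik', pred_rot, gt_rot)
    trace = np.trace(rel, axis1=-2, axis2=-1)
    # Geodesic angle: arccos((tr(R) - 1) / 2), clipped for numerical safety.
    angle = np.arccos(np.clip((trace - 1.0) / 2.0, -1.0, 1.0))
    return np.degrees(angle).mean()
```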