Making Vibration-based On-body Interaction Robust [Demonstration]

NESL Technical Report #: 2022-5-2251

Authors:

Abstract: Wearable devices like smartwatches and smart wristbands have gained substantial popularity in recent years. However, due to their limited touch-screen size, smartwatches typically offer a poor interaction experience. Recent work has turned the human body into a virtual input interface by sensing finger-activity-induced vibrations. However, these solutions fall short in real-world deployments: system performance degrades significantly under human-based variations such as hand shape, tapping force, and device position. To mitigate these variations, we collected a dataset of 114 users, built a deep-learning model, and designed a novel Siamese domain adversarial training algorithm. The resulting system is robust, achieving 97% accuracy across different hand shapes, finger-activity strengths, and smartwatch positions on the wrist. A demo video is available on YouTube (https://youtu.be/N5-ggvy2qfI).
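
The abstract's core idea pairs a Siamese (weight-shared) encoder with domain-adversarial training so that vibration embeddings remain discriminative for tap gestures while becoming invariant to per-user variation. The sketch below is not the authors' released code; it is a minimal PyTorch illustration assuming a DANN-style gradient-reversal layer, a 1-D CNN encoder, and illustrative names and sizes (VibEncoder, SiameseDANN, 32-dim embeddings, a contrastive pair loss).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; reverses and scales gradients in backward."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None


class VibEncoder(nn.Module):
    """1-D CNN over a vibration window; produces a fixed-size embedding."""
    def __init__(self, emb_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, emb_dim),
        )

    def forward(self, x):
        return self.net(x)


class SiameseDANN(nn.Module):
    def __init__(self, n_gestures=10, n_users=114, emb_dim=32):
        super().__init__()
        self.encoder = VibEncoder(emb_dim)              # shared (Siamese) weights
        self.gesture_head = nn.Linear(emb_dim, n_gestures)
        self.domain_head = nn.Linear(emb_dim, n_users)  # adversarial user classifier

    def forward(self, x_a, x_b, lam=1.0):
        z_a, z_b = self.encoder(x_a), self.encoder(x_b)
        gesture_logits = self.gesture_head(z_a)
        # Gradient reversal: the encoder is trained to *confuse* the user classifier,
        # which encourages user-invariant embeddings.
        domain_logits = self.domain_head(GradReverse.apply(z_a, lam))
        return z_a, z_b, gesture_logits, domain_logits


def training_step(model, x_a, x_b, same_gesture, y_gesture, y_user, lam=0.3):
    """One step: gesture loss + contrastive pair loss + adversarial domain loss.

    same_gesture is a float tensor (1.0 if the pair shares a gesture label, else 0.0).
    """
    z_a, z_b, g_logits, d_logits = model(x_a, x_b, lam)
    loss_gesture = F.cross_entropy(g_logits, y_gesture)
    # Contrastive term: pull same-gesture pairs together, push different pairs apart.
    dist = F.pairwise_distance(z_a, z_b)
    loss_pair = (same_gesture * dist.pow(2) +
                 (1 - same_gesture) * F.relu(1.0 - dist).pow(2)).mean()
    loss_domain = F.cross_entropy(d_logits, y_user)
    return loss_gesture + loss_pair + loss_domain
```

In this sketch, pairs of vibration windows from different users but the same tap gesture would be fed through the shared encoder; the loss weighting between the three terms is an assumption and would need tuning per deployment.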

Local downloads:

Publication Forum: Proceedings of the 13th ACM/IEEE International Conference on Cyber-Physical Systems (ICCPS 2022)

Date: 2022-05-04

Place: Milan, Italy

Publisher: ACM/IEEE

NESL Document?: Yes

Document category: Demonstration

Primary Research Area: Pervasive Computing
