Name: Sensory Substitution

Title: Enabling Edge Devices to Learn from Each Other: Sensory Substitution

Description: Because edge devices sense through heterogeneous modalities (e.g., vision, audio, and inertial) and are deployed in different scenarios, they cannot all use the same machine learning model. However, training a separate model for each device is limited by the availability of labeled data. To address this challenge, inspired by biological sensory substitution (such as touch substituting for sight), we explore the idea of sensory substitution in edge devices. Our approach is to learn a shared representation across modalities using unlabeled data, exploiting the fact that edge devices in the same environment capture the same event. Our evaluation, using human activity recognition as a use case, shows that sensory substitution can reduce the required labeled data by up to 90% and speed up training by up to 50 times compared to training edge devices from scratch.
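The abstract above does not specify the training objective, so the following is only a minimal illustrative sketch of one common way to learn a shared representation from time-synchronized, unlabeled multi-modal data: two per-modality encoders aligned with a contrastive (InfoNCE-style) loss. All names, dimensions, and the choice of loss are assumptions for illustration, not the project's actual method.

```python
# Hypothetical sketch: align two modality encoders on synchronized, unlabeled
# windows so the same physical event maps to nearby points in a shared
# embedding space. Architecture, dimensions, and loss are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    """Maps one sensing modality (a flattened window) into the shared space."""

    def __init__(self, in_dim: int, emb_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, emb_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.normalize(self.net(x), dim=-1)  # unit-norm embeddings


def contrastive_loss(za: torch.Tensor, zb: torch.Tensor, tau: float = 0.1):
    """InfoNCE-style loss: embeddings of the same event (same row index in
    za and zb) are pulled together; other pairs in the batch are pushed apart."""
    logits = za @ zb.t() / tau          # pairwise cosine similarities
    targets = torch.arange(za.size(0))  # matching pairs lie on the diagonal
    return F.cross_entropy(logits, targets)


# Synchronized, unlabeled windows from two co-located devices, e.g. inertial
# samples on one device and audio features on another (placeholder tensors).
imu_enc, audio_enc = Encoder(in_dim=90), Encoder(in_dim=128)
opt = torch.optim.Adam(
    list(imu_enc.parameters()) + list(audio_enc.parameters()), lr=1e-3
)

imu_batch = torch.randn(32, 90)      # stand-in for real sensor windows
audio_batch = torch.randn(32, 128)

loss = contrastive_loss(imu_enc(imu_batch), audio_enc(audio_batch))
opt.zero_grad()
loss.backward()
opt.step()
```

Under this kind of alignment, a classifier trained on labels from one modality operates on the shared embedding and can be reused by the other modality, which is the sense in which labeled-data requirements per device could shrink.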

Status: Active Project

Main Research Area: Sensor and Actuator Networks
