The case for smartwatch-based diet monitoring

We explore the use of gesture recognition on a wrist-worn smartwatch as an enabler of an automated eating-activity and diet-monitoring system. Using small-scale user studies, we show how accelerometer and gyroscope data from a smartwatch can accurately separate eating episodes from similar non-eating activities, and additionally identify the mode of eating (i.e., using a spoon, bare hands, or chopsticks). We also investigate the feasibility of automatically triggering the smartwatch’s camera to capture clear images of the food being consumed, for possible offline analysis to identify what (and how much) the user is eating.
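To make the gesture-recognition step concrete, the sketch below shows one plausible way to classify fixed-length windows of 6-axis IMU data using simple per-axis statistics and a random-forest classifier. This is an illustrative assumption, not necessarily the pipeline used in the studies above; the window length, sampling rate, feature set, and synthetic data are all invented for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

WINDOW = 128  # samples per window; ~2.5 s at an assumed 50 Hz sampling rate

def window_features(imu_window):
    """Summarize one window of 6-axis IMU data (accel x/y/z + gyro x/y/z,
    shape WINDOW x 6) with simple per-axis statistics."""
    return np.concatenate([imu_window.mean(axis=0), imu_window.std(axis=0),
                           imu_window.min(axis=0), imu_window.max(axis=0)])

# Synthetic stand-in data: 200 labeled windows (1 = eating gesture, 0 = other).
rng = np.random.default_rng(0)
X = np.stack([window_features(rng.normal(size=(WINDOW, 6))) for _ in range(200)])
y = rng.integers(0, 2, size=200)

# A multi-class variant of the same setup could instead label the eating mode
# (spoon / bare hands / chopsticks) rather than a binary eating flag.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict(X[:5]))
```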

Our results show both the promise and the challenges of this vision: while opportune moments for capturing such useful images almost always exist within an eating episode, significant further work is needed to (a) correctly identify the instant when the camera should be triggered and (b) reliably identify the type of food via automated analysis of such images.
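As a hypothetical illustration of challenge (a), one simple heuristic is to fire the camera at the first moment, after a detected bite gesture, when wrist rotation stays low long enough for a sharp photo. The thresholds and the trigger_index helper below are invented for this sketch, not derived from the experiments described here.

```python
import numpy as np

GYRO_QUIET = 0.3  # rad/s below which the wrist is treated as still (assumed)
DWELL = 10        # consecutive quiet samples required before firing (assumed)

def trigger_index(gyro_magnitude):
    """Return the sample index at which to fire the camera: the first point
    where gyroscope magnitude stays below GYRO_QUIET for DWELL samples.
    Returns None if no sufficiently still moment occurs."""
    quiet = 0
    for i, g in enumerate(gyro_magnitude):
        quiet = quiet + 1 if g < GYRO_QUIET else 0
        if quiet >= DWELL:
            return i
    return None

# Toy stream: a burst of wrist rotation (the bite) followed by a still pause.
rng = np.random.default_rng(1)
stream = np.concatenate([np.abs(rng.normal(1.5, 0.5, 50)), np.full(30, 0.1)])
print(trigger_index(stream))  # fires during the still pause, near index 59
```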
