This paper considers the problem of recognizing eating gestures by tracking wrist motion. Eating gestures are activities commonly undertaken during the consumption of a meal, such as sipping a drink of liquid or using utensils to cut food. Each of these gestures causes a pattern of wrist motion that can be tracked to automatically identify the activity. Previous work has studied this problem at the level of a single gesture. In this paper, we demonstrate that individual gestures have sequential dependence. To study this, we built three types of classifiers: 1) a K-nearest neighbor (KNN) classifier, which uses no sequential context; 2) a hidden Markov model (HMM), which captures the sequential context of subgesture motions; and 3) HMMs that model intergesture sequential dependencies.
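As a minimal illustration of the two baseline approaches, the sketch below trains a KNN classifier and per-class subgesture HMMs on hypothetical wrist-motion feature vectors using scikit-learn and hmmlearn; the feature dimensions, class labels, and model parameters are assumptions for illustration, not the paper's exact configuration.

```python
# Sketch of the two baseline classifiers on hypothetical wrist-motion features.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from hmmlearn import hmm

# Hypothetical data: X is (n_frames, n_features), y holds gesture class labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = rng.integers(0, 4, size=200)

# 1) KNN baseline: classifies each gesture independently, with no
#    sequential context.
knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)

# 2) Subgesture HMM: one HMM per gesture class, trained on the motion frames
#    of that class; classification picks the highest-likelihood model.
models = {}
for label in np.unique(y):
    m = hmm.GaussianHMM(n_components=3, n_iter=20)
    m.fit(X[y == label])          # each row treated as one observation frame
    models[label] = m

def classify(frame_sequence):
    # Score a gesture's frame sequence under each class model.
    return max(models, key=lambda c: models[c].score(frame_sequence))
```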
We built first-order through sixth-order HMMs to evaluate how increasing amounts of sequential dependence aid recognition. On a dataset of 25 meals, we found that the baseline accuracies for the KNN and subgesture HMM classifiers were 75.8% and 84.3%, respectively. Using HMMs that model intergesture sequential dependencies, we increased accuracy to as much as 96.5%. These results demonstrate that sequential dependencies exist between eating gestures and that they can be exploited to improve recognition accuracy.
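To make the intergesture modeling concrete, the following sketch estimates order-k transition probabilities between gesture labels, i.e., P(next gesture | last k gestures), which is the quantity a k-th order model conditions on; the gesture alphabet and meal sequences are hypothetical, and this is one standard tuple-history construction rather than the paper's exact implementation.

```python
# Sketch: maximum-likelihood estimate of order-k intergesture transitions,
# obtained by expanding each state to the tuple of the last k gesture labels.
from collections import defaultdict

def kth_order_transitions(sequences, k):
    """Estimate P(next_gesture | last k gestures) from labeled meal sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for i in range(k, len(seq)):
            history = tuple(seq[i - k:i])   # expanded k-tuple state
            counts[history][seq[i]] += 1
    probs = {}
    for history, nxt in counts.items():
        total = sum(nxt.values())
        probs[history] = {g: c / total for g, c in nxt.items()}
    return probs

# Hypothetical gesture sequences: b=bite, d=drink, u=utensil use, r=rest.
meals = [list("ububudrub"), list("uburubudb")]
print(kth_order_transitions(meals, k=2))
```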