About the Presentation: Automating the creation of a personal food-consumption diary can positively influence food-intake monitoring research. One possible direction for automating food-diary creation is the use of wrist-worn devices, such as smartwatches. Modern smartwatches are equipped with various sensors that can assist in monitoring hand movement and, in turn, determining whether that movement corresponds to eating-related behavior. Furthermore, some smartwatches are equipped with cameras that could be triggered to capture images of the food.
In this talk, I describe the design and implementation of a smartwatch-based, unobtrusive food-diary system, in which the smartwatch assists in intelligently capturing useful images of the food that an individual consumes throughout the day. The overall system is based on three key components: (a) a smartwatch-based gesture recognizer that identifies eating gestures, (b) a smartwatch-based image-capture tool that obtains a small set of relevant and useful images (containing clear views of the food being consumed) with low energy overhead, and (c) a server-based image-filtering engine that removes irrelevant uploaded images and catalogs the remainder through a portal. In addition to describing each of these components, I describe studies of subjects using the system and the lessons learned from those studies.