Reddington and Tintarev from the University of Aberdeen have published a
paper on an interesting application of lifelogging: helping people with
disabilities that make speech difficult communicate more easily.
Despite advances in [Augmentative and Alternative Communication] devices, creating sentences ‘on the fly’ for spontaneous conversation is still slow and
difficult … In the case of severe difficulties, typing may not be possible at all. This results in a situation where new utterances must be prepared in advance either by the user or a carer, with a large time and energy cost. Recent or single use events, such as talking about one’s day or talking about yesterday’s television, are expensive to prepare in advance relative to the limited potential for future (re-)use. As a result, AAC users tend to be passive, responding to questions with single words or short sentences.
The idea in this project is to use lifelogging sensors to partially
automate story-telling. Location data, voice recordings, RFID, and the log of
previous utterances are to be mined to help tell one’s story.
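As a toy illustration of this kind of data-to-text step (a minimal sketch, not the authors’ system; the event fields and phrasing template below are entirely hypothetical), one could imagine turning a day’s timestamped sensor events into simple first-person sentences for the user to select and speak:

```python
from dataclasses import dataclass

@dataclass
class Event:
    time: str      # e.g. "09:15" (hypothetical format)
    place: str     # e.g. from location data
    activity: str  # e.g. inferred from RFID or voice logs

def narrate(events):
    """Render a list of sensor events as simple first-person sentences."""
    sentences = [
        f"At {e.time} I was at {e.place}, {e.activity}."
        for e in events
    ]
    return " ".join(sentences)

day = [
    Event("09:15", "the library", "reading"),
    Event("12:30", "the cafe", "having lunch"),
]
print(narrate(day))
# At 09:15 I was at the library, reading. At 12:30 I was at the cafe, having lunch.
```

A real system would of course need far richer inference over the raw sensor streams, but even template-based sentences like these could cut the preparation cost of talking about one’s day.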
The paper, Automatically Generating Stories From Sensor Data, was published at IUI ’11.