Automatic story-telling – a new twist

Reddington and Tintarev from the University of Aberdeen have published a
paper on an interesting application of lifelogging: helping people with
disabilities that make speech difficult.

Despite advances in [Augmentative and Alternative Communication] devices, creating sentences ‘on the fly’ for spontaneous conversation is still slow and
difficult … In the case of severe difficulties, typing may not be possible at all. This results in a situation where new utterances must be prepared in advance either by the user or a carer, with a large time and energy cost. Recent or single use events, such as talking about one’s day or talking about yesterday’s television, are expensive to prepare in advance relative to the limited potential for future (re-)use. As a result, AAC users tend to be passive, responding to questions with single words or short sentences.

The idea in this project is to use lifelogging sensors to partially
automate story-telling. Location data, voice recordings, RFID, and the log of
previous utterances are to be mined to help tell one’s story.
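The paper itself does not publish code, but the idea of turning a day’s sensor events into editable draft sentences can be sketched roughly like this. Everything here is hypothetical: the `SensorEvent` structure, the source labels, and the sentence template are illustrative assumptions, not the authors’ actual system.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SensorEvent:
    timestamp: datetime
    source: str        # hypothetical label, e.g. "gps" or "rfid"
    description: str   # human-readable activity label mapped from raw sensor data

def draft_story(events):
    """Order a day's sensor events and render each as a draft sentence.

    The output is a set of candidate utterances the AAC user could
    select or edit, rather than typing each sentence from scratch.
    """
    ordered = sorted(events, key=lambda e: e.timestamp)
    return [f"At {e.timestamp:%H:%M} I {e.description}." for e in ordered]

# Example day (invented data for illustration):
events = [
    SensorEvent(datetime(2011, 2, 14, 12, 30), "rfid", "had lunch in the cafeteria"),
    SensorEvent(datetime(2011, 2, 14, 9, 0), "gps", "arrived at the day centre"),
]
for sentence in draft_story(events):
    print(sentence)
```

The point of the sketch is the division of labour: sensors supply the raw events, simple templating produces candidate sentences, and the user keeps control by choosing or editing what is actually said.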

The paper, Automatically Generating Stories From Sensor Data, was published at IUI ’11.
