TY - GEN
T1 - How to address smart homes with a social robot? A multi-modal corpus of user interactions with an intelligent environment
AU - Holthaus, Patrick
AU - Leichsenring, Christian
AU - Bernotat, Jasmin
AU - Richter, Viktor
AU - Pohling, Marian
AU - Carlmeyer, Birte
AU - Köster, Norman
AU - Meyer zu Borgsen, Sebastian
AU - Zorn, René
AU - Schiffhauer, Birte
AU - Engelmann, Kai Frederic
AU - Lier, Florian
AU - Schulz, Simon
AU - Cimiano, Philipp
AU - Eyssel, Friederike
AU - Hermann, Thomas
AU - Kummert, Franz
AU - Schlangen, David
AU - Wachsmuth, Sven
AU - Wagner, Petra
AU - Wrede, Britta
AU - Wrede, Sebastian
PY - 2016/1/1
Y1 - 2016/1/1
N2 - In order to explore intuitive verbal and non-verbal interfaces in smart environments, we recorded user interactions with an intelligent apartment. Besides offering various interactive capabilities itself, the apartment is also inhabited by a social robot that is available as a humanoid interface. This paper presents a multi-modal corpus that contains goal-directed actions of naive users attempting to solve a number of predefined tasks. Alongside audio and video recordings, our dataset consists of a large amount of temporally aligned sensory data and system behavior provided by the environment and its interactive components. Non-verbal system responses, such as changes in light or display contents, as well as robot and apartment utterances and gestures, serve as a rich basis for later in-depth analysis. Manual annotations provide further information about metadata, such as the current course of the study, and about user behavior, including the incorporated modality, all literal utterances, language features, emotional expressions, foci of attention, and addressees.
KW - Interaction corpus
KW - Smart home
KW - Social robot
UR - http://www.scopus.com/inward/record.url?scp=85013172227&partnerID=8YFLogxK
UR - http://www.lrec-conf.org/proceedings/lrec2016/index.html
M3 - Conference contribution
AN - SCOPUS:85013172227
T3 - Proceedings of the 10th International Conference on Language Resources and Evaluation, LREC 2016
SP - 3440
EP - 3446
BT - Proceedings of the 10th International Conference on Language Resources and Evaluation, LREC 2016
A2 - Calzolari, Nicoletta
A2 - Choukri, Khalid
A2 - Mazo, Hélène
A2 - Moreno, Asunción
A2 - Declerck, Thierry
A2 - Goggi, Sara
A2 - Grobelnik, Marko
A2 - Odijk, Jan
A2 - Piperidis, Stelios
A2 - Maegaard, Bente
A2 - Mariani, Joseph
PB - European Language Resources Association (ELRA)
T2 - 10th International Conference on Language Resources and Evaluation, LREC 2016
Y2 - 23 May 2016 through 28 May 2016
ER -