The POETICON enacted scenario corpus – A tool for human and computational experiments on action understanding

Title: The POETICON enacted scenario corpus – A tool for human and computational experiments on action understanding
Publication Type: Conference Paper
Year of Publication: 2011
Authors: Wallraven, C., Schultze, M., Mohler, B., Vatakis, A., Pastra, K.
Conference Name: IEEE International Conference on Automatic Face and Gesture Recognition (FG 2011)
Date Published: March
Keywords: action understanding, Animation, cameras, Cleaning, cognition, cognitive science, computer animation, computer vision, data corpus, Databases, HD camera, human-human interaction, Humans, image motion analysis, kinematic analysis, Kinematics, kitchen setting, living room setting, motion capture data access, object interaction, POETICON, spoken dialogue, Tracking

A good data corpus lies at the heart of progress in both perceptual/cognitive science and computer vision. While a few datasets deal with simple actions, creating a realistic corpus of complex, long action sequences that also contains human-human interactions has, to our knowledge, not been attempted so far. Here, we introduce such a corpus for (inter)action understanding, containing six everyday scenarios that take place in a kitchen/living-room setting. Each scenario was acted out several times by different pairs of actors and contains simple object interactions as well as spoken dialogue. In addition, each scenario was recorded both with several HD cameras and with motion capture of the actors and several key objects. Access to the motion capture data allows not only for kinematic analyses but also for the production of realistic animations in which all aspects of the scenario can be fully controlled. We also present results from a first series of perceptual experiments showing that humans are able to infer scenario classes, as well as individual actions and objects, from computer animations of everyday situations. These results can serve as a benchmark for future computational approaches that begin to take on complex action understanding.

Citation Key: 5771446