Interactive System – Movement Annotation Interface
Collaboration with: Omid Alemi, Ankit Gupta, Thecla Schiphorst, and Philippe Pasquier (SIAT – SFU)
We designed an annotation tool called MOTATE to gather expert Laban Movement Analysis annotations of motion data. We developed a first version of MOTATE for the reliability study, which allowed participants to select the appropriate annotation for movement excerpts observed in pairs of videos, prioritizing dominant features. Building on the results of that study, we are now designing the second version of MOTATE. It is motivated by a strong need expressed in the Laban Movement Analysis community: movement experts lack digital tools to view movement recordings and save their annotations. The second version is an online annotation tool that lets users view movement data recorded with various sensors from different perspectives and annotate the data according to the Laban analytical framework.
When accessing MOTATE, the user selects the data to annotate as well as the type of display; MOTATE offers two displays. Each can accommodate the variety of motion data that the movement database carries, including motion capture, physiological data, and video recordings. For example, the user can choose to view the data in two video displays if the movement was recorded on video from different perspectives. MOTATE provides standard video playback functionality (e.g., scrubbing through a timeline). Once the user plays the data, she can annotate it within the Laban analytical framework and save the annotation directly to a database. To annotate the data, the user selects segments of movement on three different tracks. The annotation tracks are ranked by the perceived dominance of the observed movement element. Selected segments cannot overlap within the same annotation track but can overlap across the three tracks.
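The overlap rule for the three annotation tracks can be sketched as follows. This is a minimal illustration, not MOTATE's actual implementation; the class and method names are hypothetical.

```python
class AnnotationTrack:
    """One of the three ranked annotation tracks (hypothetical model)."""

    def __init__(self, rank):
        self.rank = rank      # 1 = most dominant movement element
        self.segments = []    # list of (start, end, label) tuples

    def add_segment(self, start, end, label):
        # Reject a segment that overlaps an existing one on this track;
        # two intervals overlap when each starts before the other ends.
        for s, e, _ in self.segments:
            if start < e and s < end:
                return False
        self.segments.append((start, end, label))
        return True

# Three tracks, ranked by perceived dominance of the movement element.
tracks = [AnnotationTrack(rank) for rank in (1, 2, 3)]

assert tracks[0].add_segment(0.0, 2.5, "Effort: Sudden")      # accepted
assert not tracks[0].add_segment(1.0, 3.0, "Effort: Strong")  # rejected: overlaps on track 1
assert tracks[1].add_segment(1.0, 3.0, "Effort: Strong")      # accepted: different track
```

The same time span may thus carry annotations on several tracks, while each track individually remains a sequence of non-overlapping segments.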