Several motion-capture (MoCap) sessions were carried out for the project. We will progressively open access to the resulting databases.
Three contexts of interaction were studied:
- Theater: The “Marine” database contains gestures related to marine activities, performed by two actresses with different expressivity. Another database, named “Imitation”, provides gestures produced when two actors play an improvisation exercise called the “imitation game”. These two databases were recorded by the Lab-STICC at the European Center for Virtual Reality (CERV – www.cerv.fr).
- Fitness: There are four databases of fitness gestures. Three of them were recorded by the LIMSI lab to study the link between expressivity and emotion, and the last one was recorded by the Lab-STICC to define a virtual fitness companion.
- Magician: This database contains gestures from a magician’s performance. It is dedicated to the gesture synthesis work carried out by the IRISA lab.
Details about the “Marine” database:
MoCap recordings for the INGREDIBLE database were carried out with the aid of two professional actresses from the theatre company Derezo. The actresses were located in different rooms and could only see each other’s avatars. To make the database more widely usable, two different MoCap suits and systems were used to collect the data: Art-track and Moven. We had two main reasons for requiring a MoCap database: first, the database will be used to develop feature-analysis tools able to recognize users’ gestures; second, MoCap recordings are necessary for animating a virtual agent. We also recorded synchronized videos of the two actresses while they interacted, in order to annotate their movements and find cues of dynamic coupling.
There are several types of recordings in the database: recordings were either non-interactive or interactive. Non-interactive means that the actresses did not interact, so no avatar was displayed in front of them. Instead, they were asked to perform a series of predefined gestures, repeating each one with variations along three dimensions. The modification criteria were amplitude (narrow, medium, wide), speed (slow, medium, fast), and fluidity (staccato, medium, fluid). In the interactive approach, the actresses communicated with each other through their human-sized avatars, displayed on a screen in front of them. They were introduced to this environment by being encouraged to interact freely for as long as they wished. These first recordings often provided very interesting and spontaneous data, but the actresses rapidly grew bored without a prescribed task to perform. To add artistic, gestural, and expressive detail to the interactions (according to the requirements of the project), we defined two interaction situations: 1) imitation and 2) bodily emotional dialogue.
The resulting dataset consists of 114 different recordings (see the table), 57 captured by each suit, with more than 150 hours of recording and 27 GB of data. The database stores recorded Art-track and Moven data converted to the .bvh format. It also holds Art-track raw data in .txt files and Moven .mvn data.
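Since the recordings are distributed in the standard .bvh format (a skeleton HIERARCHY section followed by a MOTION section of per-frame channel values), a minimal sketch of loading the motion data might look like the following. The function name is hypothetical and not part of the project tooling; it assumes a well-formed BVH file and ignores the skeleton hierarchy.

```python
def read_bvh_motion(text):
    """Return (frame_count, frame_time, frames) from the contents of a .bvh file.

    A sketch under the assumption of the standard BVH layout: a MOTION
    keyword, then "Frames:" and "Frame Time:" lines, then one line of
    whitespace-separated floats per frame.
    """
    lines = text.splitlines()
    # Locate the MOTION section that follows the skeleton HIERARCHY.
    start = next(i for i, ln in enumerate(lines) if ln.strip() == "MOTION")
    frame_count = int(lines[start + 1].split(":")[1])
    frame_time = float(lines[start + 2].split(":")[1])
    # Each remaining non-empty line holds one frame of channel values.
    frames = [[float(v) for v in ln.split()]
              for ln in lines[start + 3:] if ln.strip()]
    return frame_count, frame_time, frames
```

A full application would also parse the HIERARCHY section to map each column to a joint channel; dedicated BVH readers exist in most animation toolkits.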
Details about the “Fitness” database:
Details about the “Imitation” database:
Details about the “Magician” database: