The virtual scenario was designed in 3ds Max 2017 and implemented in Unity v5.3. The virtual avatars were created with iClone 7 and implemented in Unity. The scenario was presented through an Oculus Rift head-mounted display (https://www.oculus.com/). To obtain naturalistic movements, we used Xsens motion-capture suits (https://www.xsens.com/) to record the kinematics of an actor using the right hand to gently caress different body parts of another actor seated on a beach chair. The actor was instructed to perform the caress at a speed of about 3 cm/s. The actors' kinematics were transferred onto the virtual avatars with MotionBuilder 2015 and rendered in Unity, so that participants observed the same kinematics on every virtual toucher. Moreover, using recorded kinematics allowed us to keep the toucher's movements constant and to control for the emotional interference that confederates can convey through other nonverbal cues in traditional experimental settings62.
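As an illustration of how the ~3 cm/s caress speed could be verified from recorded hand trajectories, the following minimal sketch computes the mean tangential speed of a motion-capture track; the function name, array layout, and sampling rate are assumptions for this example and do not reflect the authors' actual Xsens/MotionBuilder pipeline.

```python
import numpy as np

def mean_caress_speed(positions_cm: np.ndarray, sample_rate_hz: float) -> float:
    """Mean tangential speed (cm/s) of a recorded hand trajectory.

    positions_cm: array of shape (n_frames, 3), hand position per frame in cm
                  (hypothetical layout; real mocap exports vary).
    sample_rate_hz: capture rate of the motion-capture system (assumed here).
    """
    # Euclidean distance travelled between consecutive frames (cm per frame)
    step_lengths = np.linalg.norm(np.diff(positions_cm, axis=0), axis=1)
    duration_s = (len(positions_cm) - 1) / sample_rate_hz
    return step_lengths.sum() / duration_s

# Synthetic check: a 30 cm straight stroke over 10 s sampled at 60 Hz -> ~3 cm/s
t = np.linspace(0.0, 10.0, 601)
trajectory = np.column_stack([3.0 * t, np.zeros_like(t), np.zeros_like(t)])
print(round(mean_caress_speed(trajectory, sample_rate_hz=60.0), 2))  # prints 3.0
```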