Automated analysis of interactional synchrony using robust facial tracking and expression recognition
Burgoon, Judee K.
Metaxas, Dimitris N.
Citation: X. Yu et al., "Automated analysis of interactional synchrony using robust facial tracking and expression recognition," 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), Shanghai, 2013, pp. 1-6, doi: 10.1109/FG.2013.6553802.
Journal: 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG)
Rights: Copyright © 2013, IEEE.
Collection Information: This item from the UA Faculty Publications collection is made available by the University of Arizona with support from the University of Arizona Libraries. If you have questions, please contact us at firstname.lastname@example.org.
Abstract: In this paper, we propose an automated, data-driven, and unobtrusive framework to analyze interactional synchrony. We use this information to determine whether interpersonal synchrony can be an indicator of deceit. Our framework includes a robust facial tracking module, an effective expression recognition method, and synchrony feature extraction and feature selection methods. These synchrony features are used to learn classification models for deception recognition. To evaluate the proposed framework, we have conducted extensive experiments on a database of 242 video samples. We validate the performance of each technical module in our framework, and also show that these synchrony features are very effective at detecting deception.
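The abstract does not specify how synchrony features are computed; a common and minimal way to quantify interactional synchrony between two expression-intensity time series (one per interactant) is windowed correlation, summarized into a fixed-length descriptor. The sketch below is an illustrative assumption, not the paper's actual method; the function name, window parameters, and summary statistics are all hypothetical.

```python
import numpy as np

def synchrony_features(a, b, window=30, step=15):
    """Hypothetical synchrony descriptor: Pearson correlation between
    two expression-intensity series over sliding windows, summarized
    by simple statistics. Assumes frame-aligned 1-D arrays a and b."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    corrs = []
    for start in range(0, len(a) - window + 1, step):
        wa = a[start:start + window]
        wb = b[start:start + window]
        # Guard against zero-variance windows (flat expression signals).
        if wa.std() == 0 or wb.std() == 0:
            corrs.append(0.0)
        else:
            corrs.append(float(np.corrcoef(wa, wb)[0, 1]))
    corrs = np.asarray(corrs)
    # Fixed-length feature vector suitable for a downstream classifier.
    return np.array([corrs.mean(), corrs.std(), corrs.max(), corrs.min()])

# Usage: two identical sinusoidal "smile intensity" signals are
# maximally synchronous, so the mean windowed correlation is 1.
t = np.linspace(0, 4 * np.pi, 120)
feats = synchrony_features(np.sin(t), np.sin(t))
```

Descriptors like this could then be fed to the classification models mentioned in the abstract; the paper's own feature extraction and selection may differ substantially.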
Version: Final accepted manuscript