Automated analysis of interactional synchrony using robust facial tracking and expression recognition
Name:
YuZhangYuDunbarJensenBurgoonMe ...
Size:
902.4Kb
Format:
PDF
Description:
Final Accepted Manuscript
Author
Yu, Xiang
Zhang, Shaoting
Yu, Yang
Dunbar, Norah
Jensen, Matthew
Burgoon, Judee K.
Metaxas, Dimitris N.
Issue Date
2013-04
Publisher
IEEE
Citation
X. Yu et al., "Automated analysis of interactional synchrony using robust facial tracking and expression recognition," 2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), Shanghai, 2013, pp. 1-6, doi: 10.1109/FG.2013.6553802.
Journal
2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG)
Rights
Copyright © 2013, IEEE.
Collection Information
This item from the UA Faculty Publications collection is made available by the University of Arizona with support from the University of Arizona Libraries. If you have questions, please contact us at repository@u.library.arizona.edu.
Abstract
In this paper, we propose an automated, data-driven and unobtrusive framework to analyze interactional synchrony. We use this information to determine whether interpersonal synchrony can be an indicator of deceit. Our framework includes a robust facial tracking module, an effective expression recognition method, and synchrony feature extraction and feature selection methods. These synchrony features are used to learn classification models for deception recognition. To evaluate the proposed framework, we conducted extensive experiments on a database of 242 video samples. We validate the performance of each technical module in our framework and show that these synchrony features are highly effective at detecting deception.
eISBN
9781467355469
Version
Final accepted manuscript
DOI
10.1109/fg.2013.6553802