Decoding social signals from nonverbal cues plays a vital role in social interaction. In this project we develop approaches for automatically extracting emotional signals from face videos. Specifically, we introduce a transfer learning framework for building person-specific models, which improve facial expression recognition accuracy over generic models. As emotions are naturally correlated with physiological signals (e.g. heart and respiration rates), we also propose an approach for estimating heart rate variations from subtle skin color changes.
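The idea of recovering heart rate from skin color can be illustrated with a minimal frequency-domain sketch. This is not the method of the CVPR 2016 paper (which uses self-adaptive matrix completion to handle realistic conditions); it only shows the underlying principle: the mean skin color of a face region oscillates at the pulse frequency, so the dominant spectral peak in a physiological band gives an estimate in beats per minute. The function name and the synthetic trace below are illustrative assumptions.

```python
import numpy as np

def estimate_heart_rate(color_trace, fps, lo=0.7, hi=4.0):
    """Estimate heart rate (BPM) from a mean skin-color trace.

    color_trace: 1-D array of mean color values per frame (e.g. green channel).
    fps: video frame rate in frames per second.
    lo/hi: plausible pulse band in Hz (here 42-240 BPM).
    """
    x = color_trace - np.mean(color_trace)        # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))             # magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)  # frequency of each bin
    band = (freqs >= lo) & (freqs <= hi)          # keep physiological band only
    peak = freqs[band][np.argmax(spectrum[band])] # dominant pulse frequency
    return peak * 60.0                            # Hz -> beats per minute

# Synthetic example: a 72 BPM pulse buried in noise, 30 fps, 10-second clip
np.random.seed(0)
fps, bpm = 30.0, 72.0
t = np.arange(0, 10, 1.0 / fps)
trace = 0.5 * np.sin(2 * np.pi * (bpm / 60.0) * t) + 0.1 * np.random.randn(t.size)
print(estimate_heart_rate(trace, fps))
```

A 10-second window gives 0.1 Hz frequency resolution, i.e. roughly 6 BPM granularity; the papers below address the much harder problem of doing this robustly under motion, illumination changes, and facial expressions.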
S. Tulyakov, X. Alameda-Pineda, E. Ricci, L. Yin, J. F. Cohn, and N. Sebe, “Self-Adaptive Matrix Completion for Heart Rate Estimation from Face Videos under Realistic Conditions,” in IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016. DEMO
G. Zen, L. Porzi, E. Sangineto, E. Ricci, and N. Sebe, “Learning Personalized Models for Facial Expression Analysis and Gesture Recognition,” IEEE Transactions on Multimedia, 18(4): 775–788, 2016.