Psychological Process Research Team

Research Summary

We conduct research to elucidate the computational mechanisms of the human mind (emotions, cognition, and behavior) and to develop robots with minds.

Main Research Fields
  • Experimental psychology
  • Basic/Social brain science
  • Intelligent robotics
Keywords
  • Emotion
  • Facial expression
  • Social interaction
  • Human-robot interaction
  • Neuroimaging
Research Themes
  • Computational elucidation of the human mind and its implementation in robots, in collaboration with engineers.
  • Psychological evaluation of robots' functions.
  • Interdisciplinary research across psychology, informatics, and robotics, especially on emotional communication.

Wataru Sato

History

2005
Primate Research Institute, Kyoto University
2010
Hakubi Center, Kyoto University
2014
Graduate School of Medicine, Kyoto University
2017
Kokoro Research Center, Kyoto University
2020
RIKEN

Award

2011
Award for Distinguished Early and Middle Career Contributions, The Japanese Psychological Association

Members

Chun-Ting Hsu
Research Scientist
Akie Saito
Research Scientist
Shushi Namba
Research Scientist
Koh Shimokawa
Technical Staff I
Masaru Usami
Research Part-time Worker II
Saori Namba
Research Part-time Worker I
Dongsheng Yang
Student Trainee
Naoya Kawamura
Student Trainee

Former member

Rena Kato
Research Part-time Worker II (2020/07-2022/01)

Research results

Emotional valence sensing using a wearable facial EMG device

(Sato, Murata, Uraoka, Shibata, Yoshikawa, & Furuta: Sci Rep)

Emotion sensing using physiological signals in real-life situations can be practically valuable. Previous studies developed wearable devices that record autonomic nervous system activity, which reflects emotional arousal. However, no study has determined whether emotional valence can be assessed using wearable devices.

To this end, we developed a wearable device to record facial electromyography (EMG) from the corrugator supercilii (CS) and zygomaticus major (ZM) muscles.

To validate the device, in Experiment 1 we used both a traditional wired device and our wearable device to record participants' facial EMG while they viewed emotional films.

Participants viewed the films again and continuously rated their recalled subjective valence during the first viewing. The facial EMG signals recorded using both wired and wearable devices showed that CS and ZM activities were, respectively, negatively and positively correlated with continuous valence ratings.

In Experiment 2, we used the wearable device to record participants' facial EMG while they were playing Wii Bowling games and assessed their cued-recall continuous valence ratings. CS and ZM activities were correlated negatively and positively, respectively, with continuous valence ratings.

These data suggest the possibility that facial EMG signals recorded by a wearable device can be used to assess subjective emotional valence in future naturalistic studies.
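As a rough illustration of this kind of analysis, the sketch below correlates a smoothed EMG envelope with a continuous valence trace. It is not the authors' published pipeline: the signals are simulated, and the sampling rate, smoothing window, and function names are illustrative assumptions.

```python
# Hypothetical sketch: relate facial EMG envelopes to continuous valence.
# All data here are simulated; window size and sampling rate are assumptions.
import numpy as np
from scipy.stats import pearsonr

def emg_envelope(raw_emg: np.ndarray, window: int = 100) -> np.ndarray:
    """Rectify the raw EMG and smooth it with a moving-average window."""
    rectified = np.abs(raw_emg - raw_emg.mean())  # remove DC offset, rectify
    return np.convolve(rectified, np.ones(window) / window, mode="same")

rng = np.random.default_rng(0)
n = 10_000                                   # e.g., 10 s at 1 kHz sampling
valence = np.cumsum(rng.standard_normal(n))  # stand-in for continuous ratings

# Simulate muscles whose amplitude tracks valence: ZM positively, CS negatively.
zm_raw = rng.standard_normal(n) * (1 + np.clip(valence, 0, None) / 50)
cs_raw = rng.standard_normal(n) * (1 + np.clip(-valence, 0, None) / 50)

for name, raw in [("ZM", zm_raw), ("CS", cs_raw)]:
    r, p = pearsonr(emg_envelope(raw), valence)
    print(f"{name} envelope vs. valence: r = {r:+.2f} (p = {p:.3g})")
```

In the study itself, such correlations were computed within participants against cued-recall ratings; the sketch only conveys the overall shape of the analysis.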

Enhanced emotional and motor responses to live vs. videotaped dynamic facial expressions

(Hsu, Sato, & Yoshikawa: Sci Rep)

Facial expression is an integral aspect of non-verbal communication of affective information. Earlier psychological studies have reported that the presentation of prerecorded photographs or videos of emotional facial expressions automatically elicits divergent responses, such as emotions and facial mimicry.

However, such highly controlled experimental procedures may lack the vividness of real-life social interactions.

This study incorporated a live image relay system that delivered models’ real-time performance of positive (smiling) and negative (frowning) dynamic facial expressions or their prerecorded videos to participants. We measured subjective ratings of valence and arousal and facial electromyography (EMG) activity in the zygomaticus major and corrugator supercilii muscles.

Subjective ratings showed that live facial expressions elicited higher valence and arousal ratings than the corresponding videos in the positive emotion condition. Facial EMG data showed that, compared with the videos, live facial expressions more effectively elicited facial muscular activity congruent with the models' positive facial expressions.

The findings indicate that emotional facial expressions in live social interactions are more evocative of emotional reactions and facial mimicry than earlier experimental data have suggested.
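To make the live-versus-video comparison concrete, here is a minimal, hypothetical sketch of a within-subject test on congruent muscle responses (e.g., ZM activity while viewing smiles). The participant-level values are fabricated and the sample size is assumed; the study's actual analyses were more detailed.

```python
# Hypothetical sketch: paired comparison of congruent ZM responses to
# live vs. videotaped smiles. Values are fabricated for illustration.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)
n_participants = 24  # assumed sample size, not taken from the study

# Mean baseline-corrected ZM amplitude per participant in each condition.
zm_live = rng.normal(loc=0.8, scale=0.3, size=n_participants)
zm_video = rng.normal(loc=0.5, scale=0.3, size=n_participants)

t, p = ttest_rel(zm_live, zm_video)  # within-subject (paired) t-test
print(f"Live vs. video ZM response: t({n_participants - 1}) = {t:.2f}, p = {p:.3g}")
```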

Selected Publications

  1. Saito, A., Sato, W., and Yoshikawa, S.:
    "Rapid detection of neutral faces associated with emotional value"
    Cognition and Emotion, 36, 546-559 (2022).
  2. Sato, W., Namba, S., Yang, D., Nishida, S., Ishi, C., and Minato, T.:
    "An android for emotional interaction: Spatiotemporal validation of its facial expressions"
    Frontiers in Psychology, 12, 800657 (2022).
  3. Sawabe, T., Honda, S., Sato, W., Ishikura, T., Kanbara, M., Yoshikawa, S., Fujimoto, Y., and Kato, H.:
    "Robot touch with speech boosts positive emotions"
    Scientific Reports, 12, 6884 (2022).
  4. Uono, S., Sato, W., Kochiyama, T., Yoshimura, S., Sawada, R., Kubota, Y., Sakihama, M., and Toichi, M.:
    "The structural neural correlates of atypical facial expression recognition in autism spectrum disorder"
    Brain Imaging and Behavior, 16, 1428-1440 (2022).
  5. Sato, W., Ikegami, A., Ishihara, S., Nakauma, M., Funami, T., Yoshikawa, S., and Fushiki, T.:
    "Brow and masticatory muscle activity senses subjective hedonic experiences during food consumption"
    Nutrients, 13, 4216 (2021).
  6. Namba, S., Sato, W., Osumi, M., and Shimokawa, K.:
    "Assessing automated facial action unit detection systems for analyzing cross-domain facial expression databases"
    Sensors, 21, 4222 (2021).
  7. Sato, W., Usui, N., Sawada, R., Kondo, A., Toichi, M., and Inoue, Y.:
    "Impairment of emotional expression detection after unilateral medial temporal structure resection"
    Scientific Reports, 11, 20617 (2021).
  8. Nishimura, S., Nakamura, T., Sato, W., Kanbara, M., Fujimoto, Y., Kato, H., and Hagita, N.:
    "Vocal synchrony of robots boosts positive affective empathy"
    Applied Sciences, 11, 2502 (2021).
  9. Sato, W., Murata, K., Uraoka, Y., Shibata, K., Yoshikawa, S., and Furuta, M.:
    "Emotional valence sensing using a wearable facial EMG device"
    Scientific Reports, 11, 5757 (2021).
  10. Hsu, C.-T., Sato, W., and Yoshikawa, S.:
    "Enhanced emotional and motor responses to live vs. videotaped dynamic facial expressions"
    Scientific Reports, 10, 16825 (2020).

Contact Information

wataru.sato.ya [at] riken.jp