Exploring the Behavioral and Neural Underpinnings of Nonverbal Affective Communication: Insights from Facial Expressions, and Human Vocalizations in Distinct Vigilance States

Grollero, Demetrio (2024) Exploring the Behavioral and Neural Underpinnings of Nonverbal Affective Communication: Insights from Facial Expressions, and Human Vocalizations in Distinct Vigilance States. Advisor: Bernardi, Prof. Giulio. Coadvisor: Cecchetti, Prof. Luca. pp. 229. [IMT PhD Thesis]

Text (Doctoral thesis)
Grollero_final_Thesis_Full.pdf - Published Version
Restricted to IMT staff and National Library only until 30 September 2026.
Available under License Creative Commons Attribution Non-commercial Share Alike.


Abstract

This Thesis offers a thorough exploration of the behavioral and brain functional bases of nonverbal affective communication, with a particular focus on facial expressions and human vocalizations. The processing of nonverbal human vocalizations, in particular, was explored during both wakefulness and sleep, offering novel insight into the physiological mechanisms responsible for sensory disconnection during sleep. The Thesis encompasses three distinct studies. In the first study, we developed a facial motion tracking procedure for the examination of spontaneous affective expressions and validated it in a group of healthy adult individuals. This tool allowed us to capture the diversity of subjective emotional states beyond predefined categories. Using naturalistic video stimuli and data-driven analytic techniques, this study identified low-dimensional descriptors of facial configurations that map subjective experience across individuals, suggesting a potentially universal aspect of emotional expression. The second study investigated how taking different perspectives (core affect, CA, and perceived affective qualities, PAQ, ratings) influenced the evaluation of emotional content conveyed through nonverbal human vocalizations. Results showed a V-shaped relationship between assessed arousal and valence, consistent with previous findings for visual, olfactory, and auditory stimuli. While no significant average differences in arousal or valence emerged between conditions, the magnitude and variability of ratings may differ between CA and PAQ evaluations. These findings highlight the importance of carefully designing experimental instructions and imply that nonverbal emotional communication may share a common semantic representation across sensory modalities. The third study examined brain functional responses to affective human vocalizations during both wakefulness and non-rapid eye movement (NREM) sleep. Brain activity patterns were analyzed using high-density EEG. We showed that vocal bursts conveying positive or negative valence were processed differently from neutral stimuli across vigilance states. This finding supports the notion of a 'sentinel' mechanism that remains active during sleep, whereby the brain stays sensitive to salient cues in the sleeper's surroundings. Collectively, these studies provide new insights into the complex mechanisms underlying nonverbal human communication, encompassing emotion expression and recognition. All studies relied on naturalistic stimuli, which better reflect the complexity of real-world emotional experiences than the artificial, simplified stimuli often used in cognitive studies exploring brain physiology. Our approaches and findings underscore the importance of studying affective communication in ecologically valid contexts, laying a foundation for further exploration in this captivating field.

Item Type: IMT PhD Thesis
Subjects: R Medicine > RC Internal medicine
PhD Course: Cognitive, Computational and Social Neurosciences
Identification Number: https://doi.org/10.13118/imtlucca/e-theses/425/
NBN Number: urn:nbn:it:imtlucca-30573
Date Deposited: 13 Sep 2024 07:18
URI: http://e-theses.imtlucca.it/id/eprint/425
