Human Emotion Representations: A Lens of AI and Neurocinematics

Emotions are crucial for human adaptation to natural and social environments as well as for overall well-being. Humans must not only perceive, evaluate, and regulate their own emotions but also construct emotional conceptual schemas that can be shared with others, supporting the recognition and understanding of others' emotions and thereby enabling effective social functioning. How, then, does the human brain form stable emotional representations that can be shared among individuals? And can AI agents such as large language models (LLMs) have human-like emotions?

This talk will integrate evidence from cognitive psychology and neurocinematics-based studies to reveal how humans represent emotions, and will examine the emotion recognition and reasoning performance of multimodal LLMs such as GPT-4 and Claude-3. Finally, a neurocinematics-inspired research framework for understanding human emotion and intelligence will be discussed.