Researching an unobtrusive mood recognition system based on postural data – and filing a patent on headbanging

Philips Research, Eindhoven 2010

the problem

Enabling novel audio experiences by analysing the impact of music on mood via posture 

What if your work environment could detect your mood and support or change it? What if your computer could warn you not to make big decisions when it senses you are not in the right mood state? The goal of this project was to investigate the feasibility of such a system: an unobtrusive mood recognition system, i.e. one where users are not hooked up to heart rate or skin conductance monitors. While studying body postures in relation to mood is not a new concept, previous studies had not looked at natural postures in a natural environment over an extended period of time; they had focused on exaggerated actor portrayals during short (2-3 second) intervals. We used mood-inducing music and studied its effect on the naturalistic display of specific body posture features over 8 minutes. The resulting correlation data would then inform design decisions for a device that detects a user's mood from the display of body posture features.


Breakdown of the total time (in seconds) that the head tapping / no tapping features were displayed during Negative Low (NL) and Positive High (PH) mood induction.

My role

I developed and validated a coding system that measured the temporal changes of six different postural dimensions and movement features

Since this was a novel research arena, I first needed to understand 1) which body postures were relevant and 2) how I would measure them. I also had the added constraint of the video data: the camera angle and capture limited the features that could be included. After scouring previous mood/posture research papers, I developed and validated a coding system.

The final group of postural data included ratings for head, shoulders, trunk, arms, head tapping, and hand tapping features. The total time each posture was displayed per minute of each video was exported and analyzed using my new best friend, SPSS.
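The per-minute aggregation step can be sketched in code. This is a hypothetical illustration, not the actual export pipeline: it assumes each coded posture feature is stored as a list of (start, end) display intervals in seconds, and splits those intervals into total seconds displayed per minute of an 8-minute video.

```python
# Hypothetical sketch: aggregate coded posture-display intervals into total
# seconds displayed per minute, the per-video format used for analysis.
from collections import defaultdict

def seconds_per_minute(intervals, video_length_s=480):
    """Split (start, end) display intervals into per-minute totals."""
    totals = defaultdict(float)  # minute index -> seconds displayed
    for start, end in intervals:
        t = start
        while t < end:
            minute = int(t // 60)
            # Clip the interval at the next minute boundary.
            boundary = min(end, (minute + 1) * 60)
            totals[minute] += boundary - t
            t = boundary
    return [round(totals[m], 2) for m in range(video_length_s // 60)]

# e.g. a feature displayed from 10-70 s and 130-135 s of an 8-minute clip:
print(seconds_per_minute([(10, 70), (130, 135)]))
# [50.0, 10.0, 5.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```

An interval spanning a minute boundary is split across both minutes, so the per-minute totals always sum to the total display time.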

Understanding the data

The head proved to be one of the first and most reliable regions to differentiate a user’s mood

Now the fun part: the results showed that the majority of the postural features investigated differed at a statistically significant level between the two mood states (positive high-energy vs. negative low-energy). Furthermore, the head tapping feature was displayed exclusively during the positive high-energy mood state, which means one could deduce the user's mood solely from the display of this feature! A patent was filed in relation to this finding, informing the design of products that could detect the user's mood through headbanging.
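The exclusivity finding suggests a very simple decision rule; the sketch below is a hypothetical illustration, with an assumed detection threshold, not the patented implementation. Because head tapping was only observed during the positive high-energy (PH) induction, any sustained head-tapping display is treated as evidence of a PH state, while its absence tells us nothing.

```python
# Hypothetical decision rule based on the exclusivity finding: head tapping
# was only observed during the positive high-energy (PH) mood induction.

def infer_mood(head_tapping_seconds, threshold=1.0):
    """Return 'PH' if head tapping exceeds the (assumed) threshold."""
    if head_tapping_seconds >= threshold:
        return "PH"
    # Absence of tapping does not imply a negative mood state.
    return "unknown"

print(infer_mood(12.5))  # PH
print(infer_mood(0.0))   # unknown
```

Note the asymmetry: the rule can only confirm the positive high-energy state, never rule it out, which is why combining multiple postural features matters for a full recognition system.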

There is still a lot to learn about posture and unobtrusive mood detection. This study was a first step towards understanding whether we can detect natural postures in a natural environment, and how these measures could be combined to achieve strong recognition accuracy.

Mood Recognition Based on Upper Body Posture and Movement Features



A patent on headbanging! 


These exciting, groundbreaking results were published at the Affective Computing and Intelligent Interaction (ACII) conference in 2011. Beyond that, I filed a patent and was granted it as main inventor.

