Whitepaper

Download the emteq labs white paper

This emteq labs white paper is the first in a series, providing insights for academic, clinical and market researchers, content and training specialists, and anyone with an interest in emotion analytics. Written by the team at emteq labs, with contributions from our founders, Graeme Cox and Dr Charles Nduka, this white paper offers valuable insights into the capabilities and potential of biometric feedback gathered within virtual reality, for both healthcare therapies and training.


Download Now


"Developing new treatments and training solutions requires an understanding of the range of ‘normal’ responses to interventions. This is particularly important for healthcare issues such as treating anxiety and depression. 

In the past, members of the public contributed to the Human Genome Project, which in turn has enabled many new treatments to be developed. We hope that we will begin the process of understanding the range of behavioural responses that will act as a baseline for future research and the development of individualised treatments of mental health conditions."


Dr Charles Nduka, Chief Scientific Officer and co-founder of emteq labs

Benefits of VR for emotion analytics

Evaluating behaviour in VR provides many advantages over typical research set-ups, engendering more confidence in results. It is well known that a range of contextual factors influence an individual’s emotional responses; in a virtual environment these factors can be controlled or standardised across participants. They include:
  • Prior experiences (priming effects)
  • Being looked at by others
  • The emotional responses of others (social contagion)
  • The responses of others to our behaviour
  • The appearance and behaviour of others
  • The presence of others
  • Distractions

Biometric Feedback

Not all individuals experience emotions in the same way, yet traditional methods of measuring emotion either lack objectivity, ignore context, or rely on subjective self-report, which undermines the validity of the results.


emteq labs’ latest Virtual Reality (VR) technology builds on over 40 years of research into the relationship between facial expression and emotion, measuring arousal (level of activation or excitement), valence (whether a reaction is positive, negative or neutral) and action.


The company’s proprietary technology uses multimodal biometric sensors within a virtual reality headset. By pairing these sensors with immersive VR environments, emteq labs measures emotional responses under conditions that simulate real-world situations. The sensors within the headset monitor heart rate and heart rate variability, facial electromyography (EMG), skin conductance, eye movements, and bodily motion, providing an ecologically valid platform for quantifying emotion more accurately and objectively.

White Paper Analysis Areas

Successful storytellers, creators, teachers and trainers understand the role emotion plays in driving engagement and influencing behaviour. To date, the majority of commercial applications of emotion evaluation have focused on testing audience responses.

EEG Brain Activity

Use of electroencephalography (EEG) to evaluate concentration, based on asymmetry in the signals emanating from the frontal regions of the brain.
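
As a rough illustration of how such an asymmetry measure is often computed in EEG research (not necessarily the method used in the white paper), the sketch below derives a frontal asymmetry index from the alpha-band power of a left and a right frontal channel; the electrode pair, frequency band and sampling rate are assumptions made for the example.

    import numpy as np
    from scipy.signal import welch

    def frontal_asymmetry(left_channel, right_channel, fs=256.0):
        """Frontal asymmetry index: ln(right alpha power) - ln(left alpha power).

        left_channel / right_channel are 1-D arrays of EEG samples from a
        left / right frontal electrode pair (e.g. F3 / F4); fs is the
        sampling rate in Hz. The electrode pair, the 8-13 Hz alpha band and
        the sampling rate are illustrative assumptions.
        """
        def alpha_power(signal):
            freqs, psd = welch(signal, fs=fs, nperseg=int(fs * 2))
            band = (freqs >= 8.0) & (freqs <= 13.0)
            return np.trapz(psd[band], freqs[band])  # integrate PSD over the alpha band

        # Higher values are conventionally read as relatively greater
        # left-frontal activity (alpha power is inversely related to activity).
        return np.log(alpha_power(right_channel)) - np.log(alpha_power(left_channel))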

Eye Tracking

Quantifying interest through measures of dwell time around points of interest, length of gaze, repeat visits, and the sequence in which objects are observed.
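
For illustration only, the sketch below shows one simple way such gaze metrics could be accumulated from timestamped gaze samples labelled by area of interest (AOI); the sample format and AOI labelling are assumptions, not a description of emteq labs’ pipeline.

    from collections import defaultdict

    def aoi_metrics(gaze_samples):
        """Aggregate simple interest metrics from timestamped gaze samples.

        gaze_samples is an ordered list of (timestamp_seconds, aoi_label)
        pairs, where aoi_label names the area of interest the gaze falls on
        (or None when no object is being looked at).
        """
        dwell = defaultdict(float)     # total time spent on each AOI
        visits = defaultdict(int)      # number of separate visits (revisits included)
        sequence = []                  # order in which AOIs were entered or re-entered
        previous_aoi = None

        for i in range(1, len(gaze_samples)):
            t_prev, aoi = gaze_samples[i - 1]
            t_curr, _ = gaze_samples[i]
            if aoi is not None:
                dwell[aoi] += t_curr - t_prev
                if aoi != previous_aoi:
                    visits[aoi] += 1
                    sequence.append(aoi)
            previous_aoi = aoi

        return dict(dwell), dict(visits), sequence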

Bodily Movement

During peak emotional situations, observers rely on bodily expressions to infer the positive or negative valence of the emotion being expressed.

Facial Expression

The use of facial electromyography (fEMG) to understand physiological activation in response to stimuli, expressed in terms of valence and arousal and analysed using the dimensional model of emotion.
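
As a hedged example of the dimensional approach, the sketch below maps two baseline-normalised fEMG channels (zygomaticus and corrugator) onto crude valence and arousal scores, following conventions common in the facial EMG literature rather than emteq labs’ own model.

    import numpy as np

    def crude_valence_arousal(zygomaticus, corrugator):
        """Very rough valence/arousal estimate from two rectified fEMG envelopes.

        zygomaticus and corrugator are 1-D arrays of baseline-normalised
        fEMG amplitude from the smiling and frowning muscles respectively.
        The mapping (zygomaticus activity as positive valence, corrugator
        as negative, combined magnitude as an arousal proxy) follows common
        conventions in the facial EMG literature and is illustrative only.
        """
        zyg = float(np.mean(zygomaticus))
        cor = float(np.mean(corrugator))
        valence = zyg - cor    # > 0 leans positive, < 0 leans negative
        arousal = zyg + cor    # overall activation as a crude arousal proxy
        return valence, arousal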

Heart Rate and Heart Rate Variability

Use of photoplethysmography (PPG) to detect changes in heart rate and heart rate variability, which are used to scale the intensity of emotional responses.
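
As a minimal sketch of the kind of metric involved, the example below computes mean heart rate and RMSSD, a standard time-domain HRV measure, from inter-beat intervals such as those derived from PPG pulse peaks; it is illustrative and assumes millisecond-unit intervals.

    import numpy as np

    def heart_metrics(ibi_ms):
        """Mean heart rate (bpm) and RMSSD (ms) from inter-beat intervals.

        ibi_ms is a 1-D array of intervals between successive heartbeats in
        milliseconds, e.g. derived from the peaks of a PPG pulse waveform.
        RMSSD is a standard time-domain HRV measure; using it here is an
        illustrative choice, not a statement about emteq labs' analysis.
        """
        ibi = np.asarray(ibi_ms, dtype=float)
        heart_rate_bpm = 60000.0 / np.mean(ibi)          # average beats per minute
        rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))      # successive-difference HRV
        return heart_rate_bpm, rmssd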

Skin Response

Detecting stress reactions via changes in the skin, which manifest as alterations in blood flow and sweating.