Studies aimed at understanding the meaning of the signals our bodies manifest are very old. Since time immemorial, people have tried to grasp the secret behind the gaze and the emotions our facial expressions convey, so important to our social relationships and yet not easy to decode. For the first scientific contribution in this field, however, we have to wait until 1872, when Charles Darwin published The Expression of the Emotions in Man and Animals. It is the first work to define the behavioural manifestation of certain emotions as ‘innate’, and therefore not conditioned by culture but the result of evolution. It marks the beginning of a line of research that would culminate in Paul Ekman’s studies of the 1960s and 1970s, which showed that some of the main emotions – anger, fear, happiness, sadness, contempt, disgust and surprise – are expressed by the same facial muscles in all peoples of the world.
The human face can assume more than 10,000 different expressions, many of which carry no particular meaning: they are mere muscular actions with no expressive correlate. Some, however, are decisive and meaningful from an expressive point of view. This observation is the basis of the Facial Action Coding System (FACS), the system for coding facial muscle movements created in 1978 by Paul Ekman and Wallace V. Friesen. It is the first atlas of the human face to include a systematic description (text, photographs and video footage) for measuring facial movements in anatomical terms, breaking them down into individual units of movement called action units.
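To make the idea of action units concrete, here is a minimal sketch in Python of how such codes might be represented in software. The action-unit numbers and the happiness combination (AU6 + AU12, the so-called Duchenne smile) follow commonly cited FACS literature, but the data structures, prototype list and function names are illustrative assumptions of ours, not part of FACS itself.

# Illustrative sketch: action units as numbered codes (assumed simplification).
ACTION_UNITS = {
    1: "Inner brow raiser",
    2: "Outer brow raiser",
    4: "Brow lowerer",
    6: "Cheek raiser",
    12: "Lip corner puller",
    15: "Lip corner depressor",
}

# Prototypical combinations often associated with basic emotions in the
# FACS-based literature (simplified here for illustration).
PROTOTYPES = {
    "happiness": {6, 12},
    "sadness": {1, 4, 15},
}

def match_expression(observed_aus: set[int]) -> list[str]:
    """Return the prototypical emotions whose action units are all present."""
    return [name for name, aus in PROTOTYPES.items() if aus <= observed_aus]

print(match_expression({6, 12}))  # ['happiness']
print(match_expression({1, 4}))   # [] – no full prototype matched

The point of the sketch is simply that FACS describes what the face does (which units moved), while the mapping to an emotion label is a separate, interpretative step.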
“Emotion Algorithm” is a fitting definition for a method that allows any movement of the human face to be identified in a purely descriptive manner, free from any interpretative inference. FACS, as a system for coding facial expressions, analysing them and measuring their intensity, has established itself as the most effective and comprehensive tool of its kind: it is widespread in the scientific community, used in FBI and other security-related training programmes, and the subject of much applied research and many publications. A FACS-certified coder is able to study every configuration the face assumes, as a starting point for analysing facial expressions and decoding them. Emotions activate involuntary circuits, which is why it is very difficult to hide an emotion completely: some muscles are activated anyway, even if only for a brief instant. These are facial micro-expressions, manifestations with a very short duration, between a twenty-fifth and a fifth of a second, of which we are hardly ever aware but which leak valuable information to those able to read these automatisms.
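The duration window mentioned above can be made explicit with a small Python sketch. It assumes we already have onset and offset timestamps (in seconds) for a detected facial movement; the function name and the check itself are ours, with the thresholds taken from the one twenty-fifth to one fifth of a second range cited in the text.

# Assumed thresholds from the duration range given above.
MICRO_MIN = 1 / 25   # 0.04 s
MICRO_MAX = 1 / 5    # 0.20 s

def is_micro_expression(onset: float, offset: float) -> bool:
    """True if the movement's duration falls within the micro-expression window."""
    duration = offset - onset
    return MICRO_MIN <= duration <= MICRO_MAX

print(is_micro_expression(10.00, 10.12))  # True: 120 ms, within the window
print(is_micro_expression(10.00, 10.80))  # False: 800 ms, an ordinary expression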
The professional trained in this skill is able to analyse people’s facial expressions. The applications of this tool are manifold and go well beyond security: it represents a valuable enhancement of competence and awareness for those involved in personnel selection and coaching and, more generally, for those whose professions centre on helping relationships. In short, it is useful in all contexts where understanding relationships, communication and emotions becomes not just necessary but a priority.
These examples prompt an interesting reflection on the future of work and training. Many companies are investing in software capable of reading and decoding facial expressions, technologies tied to the development of artificial intelligence. We can compare this moment in history to the invention of calculators, which took over from the human mind the execution of complex mathematical operations. But interpreting the human face is not, or at least not yet, a purely computational operation: the algorithm of emotions still needs human guidance and expertise. We cannot go into the subject in depth here, but we can note that Paul Ekman himself is convinced that, to this day, the human being remains the most reliable ‘tool’ for analysing facial expressions, since a person can contextualise and interpret them better than software can.
Understanding emotions therefore still seems to be a very ‘human’ affair, and this casts a positive light on the possibility of exploring and exploiting new skills in the future.