New A.I. system is better at reading human emotions than most people


Think you have a good poker face? Well, think again. This A.I. system might just give you a run for your money. Researchers have developed a technology that can read facial expressions better than most people, thanks to an intense, in-depth stare.

Machines capable of reading facial expressions aren’t exactly new. Microsoft debuted a viral tool that guesses your age from a single photo. Guess is the key word: twenty-year-olds were enraged when the software pegged them at forty, while middle-aged men were flattered when it pegged them at thirty.[1,2]

The new machine is unprecedented in that it uses an algorithm to detect micro-expressions that usually escape the untrained eye. Researchers in Finland tested the algorithm on an artificial intelligence (A.I.) system for the very first time. Their paper was released earlier this month and is pending publication.[1]

The machine detects what are called facial micro-expressions: subtle facial movements that crack an individual’s facade and reveal what a person is really thinking or feeling. “Micro-expressions tend to occur when individuals hide their feelings under conditions of relatively high stakes,” MIT Technology Review reports.[1,2]

Putting the A.I. system to the test

To test the A.I. system, the researchers asked twenty people to watch an emotionally charged video without making any facial expressions; a long, tedious questionnaire would be administered to anyone who failed to conceal his or her emotions. The team used a high-speed camera that captured 100 frames per second as the participants watched the video, then scrutinized the footage to mark whenever a micro-expression occurred.[1,2]
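
The article does not include the researchers’ code, but the general idea of combing high-frame-rate footage for candidate moments can be sketched roughly as follows, assuming OpenCV, a hypothetical 100-fps recording named participant.mp4, and an arbitrary illustrative threshold:

import cv2

# Open a pre-recorded high-frame-rate clip (the filename is illustrative).
cap = cv2.VideoCapture("participant.mp4")
prev_gray = None
candidate_frames = []
frame_idx = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if prev_gray is not None:
        # A large frame-to-frame intensity change flags a moment worth reviewing by hand.
        diff = cv2.absdiff(gray, prev_gray).mean()
        if diff > 2.0:  # illustrative threshold, not from the study
            candidate_frames.append(frame_idx)
    prev_gray = gray
    frame_idx += 1

cap.release()
print(len(candidate_frames), "frames flagged for manual review")

In the study itself the marking was done by the researchers, not automatically; a script like this would only narrow down where to look.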

The researchers extracted 164 frames containing micro-expressions from this footage and labeled each one with the correct emotion by comparing it against the images in the actual video. Using this archive, they taught the algorithm how to spot micro-expressions; the machine then flagged micro-expressions in images that differed from the neutral control images.
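
Neither the article nor its sources spell out which features or classifier the team used, so the following is only a rough sketch of this kind of supervised training step, assuming scikit-learn, a placeholder feature extractor, and random stand-in arrays in place of the real labeled and control frames:

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-in data: in the study these would be the 164 labeled micro-expression
# frames plus neutral control frames; random arrays keep the sketch runnable.
labeled_frames = [rng.random((240, 320)) for _ in range(164)]
control_frames = [rng.random((240, 320)) for _ in range(164)]

def extract_features(frame_gray):
    # Placeholder feature: a coarsely downsampled grayscale face crop.
    return frame_gray[::8, ::8].ravel().astype(np.float32)

X = np.array([extract_features(f) for f in labeled_frames + control_frames])
y = np.array([1] * len(labeled_frames) + [0] * len(control_frames))  # 1 = micro-expression

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="linear").fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))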

To pair each micro-expression with an emotion, the researchers utilized a method called motion magnification, which involves exaggerating tiny facial movements in order to make micro-expressions easily detectable.
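
The sources describe motion magnification only at a high level; the simplified sketch below, loosely in the spirit of Eulerian video magnification, band-pass filters each pixel’s intensity over time and amplifies the result so that subtle movements become easier to see (the frame size, frequency band, and amplification factor are all assumed for illustration):

import numpy as np
from scipy.signal import butter, filtfilt

def magnify(frames, fps=100.0, low=0.5, high=10.0, alpha=20.0):
    # frames: array of shape (time, height, width), grayscale values in [0, 1].
    b, a = butter(2, [low / (fps / 2), high / (fps / 2)], btype="band")
    # Isolate subtle temporal variations at each pixel, then add them back amplified.
    subtle = filtfilt(b, a, frames, axis=0)
    return np.clip(frames + alpha * subtle, 0.0, 1.0)

# Illustrative call on a synthetic two-second clip at 100 frames per second.
clip = np.random.default_rng(1).random((200, 64, 64))
magnified = magnify(clip)

Real motion-magnification pipelines typically decompose each frame spatially before filtering; this sketch skips that step to keep the core idea visible.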

Human judges were then shown the raw video of the subjects’ faces and asked to identify the micro-expressions themselves. In these tests, the algorithm proved about 10 percent better at identifying the emotions than its human counterparts.[2]

Nevertheless, human participants were still better at spotting the fleeting, emotion-revealing expressions in the first place, outperforming the machine by approximately seven percent.[2]

Furthermore, the machine often mistook other movements, such as blinking, for micro-expressions. Still, the researchers call the first test “a very promising start” and hope to improve the system with more advanced learning techniques.[2]

The algorithm has more potential uses than merely winning poker tournaments. If the technology is perfected, it could be used by employers to evaluate applicants after a job interview, or by law enforcement agencies after questioning a suspect.[2]

Calling out the machine’s bluff

It’s hard not to be skeptical of this technology, since it runs up against the private, subjective nature of first-person experience. At best, the technology can infer correlations between micro-expressions and private mental states.

If a person makes a minor facial expression that is read as a sign of disgust, yet the person reports feeling no disgust at all, that first-person testimony trumps the third-person data. In other words, the gap between third-person behavior and first-person mentality cannot be bridged, no matter how tightly those correlations are drawn.

Conclusion: The algorithm could be useful for some institutions, including law enforcement agencies and psychotherapy clinics. Nevertheless, micro-expressions are not always sufficient to determine a person’s internal mental states, and researchers who claim otherwise should have their bluff called.

Sources:

[1] Gizmodo.com

[2] DailyMail.co.uk

value="Enter your email address here..." style=" border-radius: 2px; font: 14px/100% Arial, Helvetica, sans-serif; padding: .2em 2em .2em;" onfocus="if(this.value == 'Enter your email address here...') { this.value = ''; }" onblur="if(this.value == '') { this.value = 'Enter your email address here...'; }" />

style="display: inline-block;

outline: none;

cursor: pointer;

text-align: center;

text-decoration: none;

font: 14px/100% Arial, Helvetica, sans-serif;

padding: .2em 1em .3em;

text-shadow: 0 1px 1px rgba(0,0,0,.3);

-webkit-border-radius: .2em;

-moz-border-radius: .2em;

border-radius: .2em;

-webkit-box-shadow: 0 1px 2px rgba(0,0,0,.2);

-moz-box-shadow: 0 1px 2px rgba(0,0,0,.2);

box-shadow: 0 1px 2px rgba(0,0,0,.2);"

>



Comments
comments powered by Disqus

RECENT NEWS & ARTICLES