This is a speculative project that investigates and critiques emotion-recognition technology driven by Artificial Intelligence. I questioned A.I.'s understanding of human emotions, challenged the idea of 'affective computing', and revealed the implications of this technology. In this project, I designed a program that trains humans to standardize their 'emotions' or 'expressions' so that they can be better understood by A.I.

Time / 

03/2018

Skills & Tools / 

Unity, OpenCV, Affectiva

Type / 

Individual Project

How does A.I. understand

human beings' emotions?

It seems magical that A.I. can read your mind, but in fact it only reads numbers. Intending to better understand and serve users, 'affective computing' aims to read users' emotions and respond to them effectively.

What's the problem here?

Users need to understand A.I.'s capabilities in order to use them. Successful interaction with Siri requires users to have basic knowledge of what Siri can and cannot do. While using it, people are constantly adapting to the A.I. and its affordances.

This co-adaptation process exists in emotion recognition as well. When your smile can be a trigger/command for A.I., you want to make sure your smile can be reliably recognized by machines.

Humans think machines learned from them,

but humans also learned from machines.

Your A.I. will serve you well 

only if you know it really well.

Aiming to critique this nature of A.I., especially emotion recognition, I made a training program that helps users' emotions be better read by machines. The best way to do this is to quantify your emotions.

Training Starts Now ...

Level 1: Facial Action Training

The basic training starts with standardizing your facial actions, such as eyebrow raise and frown.
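A minimal sketch of how Level 1 scoring could work: measure one facial action (Affectiva-style metrics report intensities from 0 to 100) and rate how close it is to a standardized target. The target and tolerance values here are illustrative assumptions, not Affectiva's actual thresholds.

```python
# Hypothetical Level 1 scoring: how close a measured facial-action
# intensity (0-100) is to a standardized target. Target/tolerance
# values are illustrative, not from any real SDK.

def action_score(measured, target=80.0, tolerance=20.0):
    """Return a 0-1 score for one facial action (e.g. 'brow_raise')."""
    gap = abs(measured - target)
    return max(0.0, 1.0 - gap / tolerance)

# A user whose eyebrow raise measures 70 against a target of 80:
print(action_score(70.0))  # 0.5
```

A trainee would repeat the action until the score approaches 1.0, i.e. until the expression is standardized.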

Level 2: Emotion Training

A.I. categorizes emotions by computing numbers derived from facial actions. Thus, different emotions can be quantified as formulas of facial actions.
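The "formula" idea above can be sketched as a weighted sum over facial-action intensities. The action names and weights below are purely illustrative; real classifiers (such as Affectiva's) are learned from data, not hand-written like this.

```python
# Hypothetical emotion "formulas": weighted sums over facial-action
# intensities (each 0-100). Weights are illustrative assumptions.

JOY = {"smile": 0.7, "cheek_raise": 0.3}
ANGER = {"brow_furrow": 0.6, "lid_tighten": 0.2, "lip_press": 0.2}

def emotion_score(action_units, formula):
    """Score one emotion as a weighted sum of facial actions."""
    return sum(weight * action_units.get(name, 0.0)
               for name, weight in formula.items())

face = {"smile": 90.0, "cheek_raise": 60.0, "brow_furrow": 5.0}
print(round(emotion_score(face, JOY), 1))    # 81.0
print(round(emotion_score(face, ANGER), 1))  # 3.0
```

Once an emotion is just such a formula, "training" means pushing your face toward the combination of numbers that maximizes the target score.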

Level 3: Customizable Training

The A.I. can learn personal emotions from users,

such as the emotion you have when you dislike a certain song.

It will learn from users and gather data, 

and use the same dataset to re-train users.
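This learn-then-retrain loop could be sketched as follows: the system averages the user's recorded samples of a personal expression (say, the face you make when you dislike a song) into a template, then uses that same template to score, and thereby re-train, the user. Action-unit names and values are hypothetical.

```python
# Hypothetical Level 3 loop: learn a personal-expression template from
# user samples, then use the same data to re-train the user.

def learn_template(samples):
    """Average recorded samples into one target expression."""
    keys = samples[0].keys()
    n = len(samples)
    return {k: sum(s[k] for s in samples) / n for k in keys}

def retrain_score(template, attempt, scale=100.0):
    """1.0 means the attempt matches the learned template exactly."""
    err = sum(abs(template[k] - attempt.get(k, 0.0)) for k in template)
    return max(0.0, 1.0 - err / (scale * len(template)))

# Two recorded samples of a "dislike this song" face:
samples = [{"nose_wrinkle": 60.0, "lip_corner_down": 40.0},
           {"nose_wrinkle": 80.0, "lip_corner_down": 60.0}]
template = learn_template(samples)
print(retrain_score(template, {"nose_wrinkle": 70.0,
                               "lip_corner_down": 50.0}))  # 1.0
```

The same dataset thus plays both roles: it teaches the machine what the user's expression looks like, and it teaches the user what the machine expects.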

Can/Should A.I. read our real emotions?

When emotions are never emotional, but more like programmed triggers of different functions, the face itself becomes the interface. The better you are trained, the better you can communicate and interact with your A.I.

This project was featured in the "Useless A.I. Symposium" in 2018.

A web-based live demo site is under construction. Please check back soon.