

Media (R)evolutions: Now, computers can tell how you're feeling

Roxanne Bauer

New developments and curiosities from a changing global media landscape: People, Spaces, Deliberation brings to your attention trends and events illustrating that tomorrow's media environment will look very different from today's and will bear little resemblance to yesterday's.

Imagine watching a commercial, and the TV or mobile phone you are watching it on immediately knows whether you’d like to buy the product being advertised. Imagine feeling stressed out while driving, and your car automatically starts talking to you and adjusting the air and radio controls. Or imagine a video or film that changes its storyline based on your reactions to the characters. This is the future, in which devices react not just to our behavioral and physiological cues, but also to our emotions.
 
Affective computing is the study and development of systems and devices that can recognize, interpret, process, and simulate the emotional states of humans. It is an interdisciplinary field spanning computer science, psychology, and cognitive science.
 
Affective Computing


Most of the software in the field of affective computing tracks emotions, such as happiness, confusion, surprise, and disgust, by scanning an environment for a face and identifying the face’s main regions: mouth, nose, eyes, and eyebrows. The software then assigns points to each region and tracks how those points move in relation to one another. Shifts in the texture of the skin, such as wrinkles, are also tracked and combined with the information on facial points. Finally, the software identifies an expression by comparing it with those it has previously analyzed.
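
To make the point-tracking idea concrete, here is a minimal, illustrative sketch in Python. It is not any vendor's actual software: it assumes a face has already been detected and its key points (mouth corners, nose tip, eyes, eyebrows) located by an upstream landmark model, it skips the skin-texture cues, and the landmark coordinates and emotion examples below are made up purely for illustration.

```python
import numpy as np

def expression_features(landmarks):
    """Turn the facial points into a vector describing how they sit
    relative to one another: each point's distance from the face centroid,
    normalized so that face size does not matter."""
    pts = np.asarray(landmarks, dtype=float)
    centre = pts.mean(axis=0)
    dists = np.linalg.norm(pts - centre, axis=1)
    return dists / dists.max()

def classify_expression(landmarks, reference_expressions):
    """Compare the current expression with previously analyzed examples
    (a simple nearest-neighbour lookup) and return the closest emotion label."""
    features = expression_features(landmarks)
    best_label, best_dist = None, float("inf")
    for label, examples in reference_expressions.items():
        for example in examples:
            dist = np.linalg.norm(features - expression_features(example))
            if dist < best_dist:
                best_label, best_dist = label, dist
    return best_label

# Illustrative only: hand-made landmark sets standing in for real training data.
references = {
    "happiness": [[(30, 60), (70, 60), (50, 75), (35, 40), (65, 40)]],
    "surprise":  [[(30, 65), (70, 65), (50, 85), (35, 35), (65, 35)]],
}
current = [(30, 61), (70, 61), (50, 76), (35, 40), (65, 40)]
print(classify_expression(current, references))  # -> "happiness"
```

Real systems replace the hand-made reference sets with thousands of previously analyzed expressions and use far richer features, but the basic loop is the same: detect the face, locate its points, describe how the points relate to one another, and compare against known expressions.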