Monday, April 21, 2014
We can finally rest peacefully. The long-promised advent of robots that can feel has finally arrived. Get ready to pet your Roomba. At least if you want to keep it purring contentedly on your carpet.
Researchers at Ohio State University, home of the fighting Buckeyes, have managed to teach a computer to distinguish 21 different emotions. They've done so by having it learn to recognize patterns in facial expressions.
In case you didn't think you yourself knew 21 different emotions, consider that a couple of those they taught the computer were composites, like "happily disgusted" or "sadly angry."
Why does it sound like they got those from Japanese anime cartoons?
In any event, it's a pretty significant achievement, not least because although earlier facial recognition attempts had focused on the six main emotional expressions, the reality is we all express complex compound emotions every day. It's like the difference between seeing a painting in primary colors and watching a film with all the subtle shades in between.
Which may also explain why some people are emotionally stunted. They are just bad emotion readers. Stuck in the emotional knowledge equivalent of comic books when other folks are delving into emotional Tolstoy.
Personally, I think this could help. Like an emotional prosthesis for the clueless. They could have an app for your smartphone. Which could become your emo-phone or your sensi-tone. Better yet, incorporate the software into Google Glass.
Which then could come full circle to the tech world, as computer-nerd tech types are legendary for being emotionally tone-deaf. And what dumbass guy wouldn't appreciate a little help about when not to try to solve his loved one's problems but to "just be there for her."
“Is that a happily disgusted face on you, or are you sadly angry, Honey?”
America, ya gotta love it.