On Wednesday, Microsoft Research revealed new Project Oxford tools, including one that can recognize emotion.
"Humans have traditionally been very good at recognizing emotions on people's faces, but computers? Not so much," said Microsoft senior writer for research Allison Linn in a blog post on the new Project Oxford capabilities. "That is, until now. Recent advances in the fields of machine learning and artificial intelligence are allowing computer scientists to create smarter apps that can identify things like sounds, words, images - and even facial expressions." From Wednesday, developers can get their hands on the public beta versions of these tools, including the emotion tool. Chris Bishop, head of Microsoft Research Cambridge in the UK, showed off the emotion tool at the company's Future Decoded conference on Wednesday.
"In the case of something like facial recognition, the system can learn to recognize certain traits from a training set of pictures it receives, and then it can apply that information to identify facial features in new pictures it sees," explained Linn. "The emotion tool released today can be used to create systems that recognize eight core emotional states - anger, contempt, fear, disgust, happiness, neutral, sadness or surprise - based on universal facial expressions that reflect those feelings." The technology can also be used to group collections of photos based on the people in them, or even to recognize and rate facial hair, such as in Microsoft's MyMoustache rater for the Movember fundraising effort. Microsoft plans to release public beta versions of several more new tools by the end of the year, including spell check, video, speaker recognition, custom recognition intelligent services, and updates to face APIs. You can check out the emotion demo here.
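To give a sense of how a developer might consume such a tool, here is a minimal sketch of handling a response from an emotion-scoring API like the one described above. The JSON shape and field names below are assumptions modeled on the article's description of per-face scores for the eight emotional states, not an exact transcript of Microsoft's API output.

```python
import json

# Hypothetical sample response for one detected face. The "scores" keys
# follow the eight emotional states named in the article; the exact JSON
# structure is an assumption for illustration.
sample_response = json.dumps([
    {
        "faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 64},
        "scores": {
            "anger": 0.010, "contempt": 0.002, "disgust": 0.003,
            "fear": 0.001, "happiness": 0.950, "neutral": 0.030,
            "sadness": 0.002, "surprise": 0.002,
        },
    }
])

def dominant_emotions(response_body):
    """Return the highest-scoring emotion for each detected face."""
    faces = json.loads(response_body)
    return [max(face["scores"], key=face["scores"].get) for face in faces]

print(dominant_emotions(sample_response))  # → ['happiness']
```

In practice a client would POST an image to the service and receive one such entry per face; picking the top-scoring emotion, as above, is the typical way apps turn the raw scores into a single label.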
Photo: © Microsoft.