Project Oxford Gets Emotional for 'Movember'
Project Oxford team releases a public beta of an emotion tool that recognizes a core set of emotional states and that developers can embed into .NET apps.
Microsoft's Project Oxford team late last week released a public beta version of an emotion tool that allows .NET applications to recognize a set of emotional states. The announcement came during a keynote by Chris Bishop, head of Microsoft Research Cambridge in the United Kingdom, at a conference on the future of business and technology, where he demonstrated the tool.
Project Oxford (the project, not the team) was just a footnote to the major releases of the Visual Studio 2015 tools and platform that debuted at the Microsoft Build conference in April. It consists of a set of machine learning-based REST APIs that .NET developers will be able to readily use to add vision, speech, facial recognition, and language translation to apps. Late last month, the team released a public beta of an expanded set of language components.
The release last week is key to apps that will allow machines to replicate the human capability to recognize "eight core emotional states – anger, contempt, fear, disgust, happiness, neutral, sadness or surprise – based on universal facial expressions that reflect those feelings," writes Allison Linn, a senior writer with Microsoft Research, in a blog post about the tool's deployment.
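As a rough illustration of how a .NET app might call the beta service, the sketch below POSTs a photo to a REST endpoint and reads back the per-face results. The endpoint URL, the subscription-key header name, and the response shape shown here are assumptions for illustration, not documented specifics of the beta.

```csharp
// Minimal sketch of calling the Project Oxford emotion beta from a .NET app.
// ASSUMPTIONS: the endpoint URL, the "Ocp-Apim-Subscription-Key" header name,
// and the JSON response shape are illustrative placeholders.
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class EmotionDemo
{
    static async Task Main()
    {
        // Hypothetical endpoint and key for the public beta.
        const string endpoint = "https://api.projectoxford.ai/emotion/v1.0/recognize";
        const string subscriptionKey = "<your-subscription-key>";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", subscriptionKey);

        // Send the photo as a raw octet stream.
        byte[] imageBytes = File.ReadAllBytes("face.jpg");
        using var content = new ByteArrayContent(imageBytes);
        content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");

        HttpResponseMessage response = await client.PostAsync(endpoint, content);
        response.EnsureSuccessStatusCode();

        // Assumed response: a JSON array with one entry per detected face, each
        // carrying a face rectangle and scores for the eight emotional states
        // (anger, contempt, disgust, fear, happiness, neutral, sadness, surprise).
        string json = await response.Content.ReadAsStringAsync();
        Console.WriteLine(json);
    }
}
```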
In a timely demonstration, she said the tool is already in use in a simple Microsoft app, MyMoustache, which uses the emotion tool to recognize faces in photos submitted to the site and add moustaches to them. Microsoft worked with The Movember Foundation to create MyMoustache, which is being used to raise awareness of men's health issues.
The emotion tool is trained with machine learning on a set of pictures labeled with emotional states. When it receives a new picture, it uses that training to identify specific emotions from facial features. In the MyMoustache demonstration, the app determines where to place a cartoon moustache by recognizing the size and location of the nose and mouth in a picture of a face.
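A moustache overlay of that kind boils down to simple geometry once the app has nose and mouth coordinates. The sketch below shows one way to compute the overlay rectangle; the landmark inputs (nose bottom, mouth top, mouth width) are hypothetical names standing in for whatever face data the app actually obtains.

```csharp
// Sketch of positioning a cartoon moustache between the nose and the mouth.
// ASSUMPTION: the noseBottom, mouthTop, and mouthWidth values come from face
// landmark data supplied by the caller; the names are hypothetical.
using System;
using System.Drawing;

static class MoustachePlacement
{
    // Returns the rectangle where the moustache image should be drawn,
    // centered horizontally on the nose and sized relative to the mouth.
    public static Rectangle Place(Point noseBottom, Point mouthTop, int mouthWidth)
    {
        int width = (int)(mouthWidth * 1.4);                  // slightly wider than the mouth
        int height = Math.Max(1, mouthTop.Y - noseBottom.Y);  // fill the gap between nose and mouth
        int left = noseBottom.X - width / 2;                  // center on the nose
        return new Rectangle(left, noseBottom.Y, width, height);
    }
}
```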
For updates on Project Oxford and to gain access to the tools, go here.