Microsoft retires controversial facial recognition tool that 'reads emotions'



Microsoft has announced that it will discontinue some features of its facial recognition services that infer age, gender, facial expressions, and more. These 'emotion reading' capabilities have drawn controversy over their scientific legitimacy and privacy implications, both inside and outside Microsoft.

Microsoft's framework for building AI systems responsibly --Microsoft On the Issues

https://blogs.microsoft.com/on-the-issues/2022/06/21/microsofts-framework-for-building-ai-systems-responsibly/

Responsible AI investments and safeguards for facial recognition | Azure Blog and Updates | Microsoft Azure
https://azure.microsoft.com/en-us/blog/responsible-ai-investments-and-safeguards-for-facial-recognition/

Microsoft to retire controversial facial recognition tool that claims to identify emotion --The Verge
https://www.theverge.com/2022/6/21/23177016/microsoft-retires-emotion-recognition-azure-ai-tool-api

On June 21, 2022, Microsoft announced the 'Microsoft Responsible AI Standard, v2' (PDF file), which sets out guidelines for its AI ethics. In line with this, Natasha Crampton, Microsoft's Chief Responsible AI Officer, announced that some features of the facial recognition service Azure Face will be restricted.

Specifically, customers who want to use Azure Face's facial recognition must now apply to Microsoft and describe where and how the technology will be used. Non-hazardous uses, however, such as automatically blurring faces in images and videos, will remain openly accessible.



Microsoft also announced that, as part of the work to align Azure Face with the new standard, it will stop offering emotion recognition as well as the ability to identify attributes such as gender, age, smile, facial hair, hair, and makeup.
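For context, these attributes were requested through the Face API's `detect` endpoint via the `returnFaceAttributes` query parameter. The sketch below, which assumes the publicly documented Face API v1.0 REST shape (the endpoint hostname and key are placeholders), shows what a request for the now-retired attributes looked like:

```python
import json
import urllib.parse
import urllib.request

# Attributes Microsoft said it would retire: emotion plus identity-style
# attributes such as gender, age, smile, facial hair, hair, and makeup.
RETIRED_ATTRIBUTES = ["emotion", "gender", "age", "smile",
                      "facialHair", "hair", "makeup"]


def build_detect_url(endpoint: str) -> str:
    """Build a Face API v1.0 detect URL requesting the retired attributes."""
    query = urllib.parse.urlencode(
        {"returnFaceAttributes": ",".join(RETIRED_ATTRIBUTES)})
    return f"{endpoint}/face/v1.0/detect?{query}"


def detect_faces(endpoint: str, key: str, image_url: str):
    """POST an image URL to the detect endpoint (requires a valid key)."""
    req = urllib.request.Request(
        build_detect_url(endpoint),
        data=json.dumps({"url": image_url}).encode(),
        headers={
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Under the new policy, such a request fails for attribute analysis: new customers lost access immediately, and the attributes were removed for existing customers after the transition period.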

Crampton cited 'the lack of scientific consensus on the definition of emotions' and questions about how emotion predictions generalize across use cases, regions, and demographics. Pointing to the growing validity and privacy concerns around this type of feature, she said the company had decided it needs to carefully analyze any AI system that purports to infer people's emotional states, whether it uses facial analysis or other AI technologies, in order to address the privacy issues raised by facial recognition and emotion inference.

As Crampton acknowledged, Microsoft's AI technology for reading facial expressions and emotions has been sharply criticized by experts, who argue that facial expressions differ across countries and ethnicities and that it is inappropriate to equate outward expressions with inner emotions.

Lisa Feldman Barrett, a professor of psychology at Northeastern University, told IT news site The Verge, 'You can detect a scowl, but that's not the same thing as detecting anger.'



With this decision, Azure Face stopped offering attribute recognition such as gender and age to new users on the same day, and existing users will lose access after June 30, 2023. Similarly, the speech synthesis service 'Custom Neural Voice' will be subject to restrictions requiring customers to report their intended use to Microsoft.

Of this ability to synthesize AI voices from recorded speech, Crampton said that it 'has exciting potential in education, accessibility, and entertainment,' but that it is also 'easy to imagine how it could be used to improperly impersonate speakers and deceive listeners.'

in Web Service, Web Application, Posted by log1l_ks