Microsoft to reduce some of its facial recognition capabilities in the name of “responsible AI”


Credit: Microsoft

Microsoft is removing some features from its Azure AI services to ensure its facial recognition technology meets its ethical AI guidelines. The company plans to phase out these facial recognition capabilities for new users starting this week and for existing users within a year.

Starting today, June 21, new customers will need to apply for access to use facial recognition operations in the Azure Face API, Computer Vision, and Video Indexer, according to a new blog post. Existing customers have one year to apply and be approved for continued access to these services based on their use cases. Microsoft officials said they believe limiting access adds an extra layer of scrutiny and aligns its facial recognition services with the company’s Responsible AI Standard. Effective June 30, 2023, existing customers will no longer be able to access facial recognition capabilities if their application has not been approved.

Some facial detection capabilities, such as blur, exposure, glasses, head pose, landmarks, noise, occlusion, and face contour detection, do not require an application.

Microsoft officials also said they are retiring facial analysis capabilities designed to infer emotional states and attributes such as gender, age, smile, facial hair, hair, and makeup. They said these capabilities raise questions about privacy, the lack of agreed-upon definitions of emotions, and the inability to generalize the link between facial expression and internal emotional state. Such features could more easily be misused, exposing people to stereotyping, discrimination, or unfair denial of services, officials said. These features are no longer available to new customers as of June 21 of this year and will no longer be available to existing users after June 30, 2023. Microsoft will continue to make some of these technologies available through integrations in products built for people with disabilities, such as its Seeing AI app.

Microsoft advises customers to use tools and resources such as the open-source Fairlearn package and Microsoft’s Fairness Dashboard to understand how Azure Cognitive Services can be used fairly. Microsoft has also introduced a recognition quality API to flag potential lighting, blur, and other image quality issues that may affect certain demographic groups more than others, officials said.
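To give a sense of what such fairness tooling measures: one common metric, the demographic parity difference (which the Fairlearn package can compute), is simply the gap between the highest and lowest positive-prediction rates across demographic groups. The minimal sketch below implements that definition from scratch; the predictions and group labels are invented for illustration, not drawn from any Microsoft service.

```python
# Illustrative sketch of the demographic parity difference, one of the
# fairness metrics the open-source Fairlearn package computes.
# All data below is hypothetical, for demonstration only.

def demographic_parity_difference(y_pred, groups):
    """Gap between the highest and lowest positive-prediction (selection)
    rates across demographic groups. 0.0 means perfectly equal rates."""
    counts = {}  # group -> (total examples, positive predictions)
    for pred, group in zip(y_pred, groups):
        total, positives = counts.get(group, (0, 0))
        counts[group] = (total + 1, positives + pred)
    selection_rates = [pos / total for total, pos in counts.values()]
    return max(selection_rates) - min(selection_rates)

# Hypothetical binary model outputs (1 = face matched) and group labels.
y_pred = [1, 0, 1, 1, 0, 0, 1, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]

# Group "a" is selected at rate 0.75, group "b" at 0.25, so the gap is 0.5.
print(demographic_parity_difference(y_pred, groups))  # 0.5
```

A value near zero indicates the model selects all groups at similar rates; a large gap is the kind of disparity the recognition quality API and fairness dashboards are meant to surface.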


