
Gender Labels Removed By Google Cloud From Cloud Vision API To Avoid Bias


Google Cloud AI is removing the ability of its Cloud Vision API to label people in images as “man” or “woman.” Labeling is used for classifying images and training machine learning models, but Google is ending gendered labels because they conflict with its AI principle of avoiding biased systems.

A Google spokesperson said the company decided to remove gendered labels because a person’s gender cannot be inferred from appearance, and that the change aligns with Google’s artificial intelligence principles, specifically Principle #2: avoid creating or reinforcing unfair bias. Going forward, the Cloud Vision API will return a non-gendered label such as “person” instead.
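To illustrate the change in behavior described above, here is a minimal sketch of a post-processing step that maps gendered person labels to the neutral label “person.” This is a toy illustration, not the actual Cloud Vision client or Google’s implementation; the function name and label strings are hypothetical.

```python
# Hypothetical sketch: replace gendered person labels with "person",
# mirroring the policy described in the article. Not the Cloud Vision API.

GENDERED_LABELS = {"man", "woman"}

def degender_labels(labels):
    """Return a copy of `labels` with gendered person labels neutralized."""
    return [
        "person" if label.lower() in GENDERED_LABELS else label
        for label in labels
    ]

print(degender_labels(["Woman", "bicycle", "street"]))
# → ['person', 'bicycle', 'street']
```

In practice, the Cloud Vision API now simply returns the neutral label directly, so callers do not need to perform this mapping themselves; the sketch only shows the before-and-after effect on a label list.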

The Google Cloud Vision API provides computer vision that lets customers detect faces and objects. Google previously blocked gender-based pronouns in an AI tool in 2018.

A study last fall by researchers at the University of Colorado Boulder found that artificial intelligence systems from Clarifai, Microsoft, Amazon, and others maintained accuracy rates above 95 percent for cisgender men and women but misidentified trans men as women 38 percent of the time. People with no gender identity were misidentified 100 percent of the time.

In a statement, the report’s lead author, Morgan Klaus Scheuerman, said that even the most up-to-date technology views gender in only two set categories. While there are many different types of people out there, Scheuerman added, these systems have an extremely limited view of what gender looks like.

A spokesperson for the LGBT+ group Stonewall said it is concerning that facial recognition software is misgendering trans people, noting that being deliberately misgendered is deeply hurtful for them. The spokesperson encouraged technology developers to consult with trans communities to make sure their identity is respected.

The study also suggested that such software relies on outdated gender stereotypes in its facial analysis. Scheuerman, who is male and has long hair, was categorized by the software as female half of the time.
