Google’s Cloud AI Is Now Gender-Bias Free!

That’s the problem with Artificial Intelligence: intelligence can be recreated impressively, but giving the tool anything like emotional understanding remains a far cry.

Google has just announced that its Google Cloud services that use AI will no longer label people by gender. Until now, the tool identified an individual’s gender by labelling images of the person as ‘man’ or ‘woman’. While such labels are commonly used to train machine learning models, Google is doing away with them to avoid gender bias.

In an email sent to developers, the company said it would be changing the Cloud Vision API. The tool uses AI to scan images and identify faces, landmarks and anything else that is recognisable. For humans, however, the tool will no longer identify gender; instead, it will label the image as ‘person’.
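For context, a call to the Cloud Vision API’s label detection might look like the sketch below, assuming the google-cloud-vision Python client (2.x) and an authenticated environment; the file name photo.jpg is purely illustrative. After the change, an image of a person comes back with a neutral label such as ‘Person’ rather than ‘Man’ or ‘Woman’.

```python
# Minimal sketch: label detection with the google-cloud-vision client.
# Assumes `pip install google-cloud-vision` and service-account credentials
# configured via GOOGLE_APPLICATION_CREDENTIALS.
from google.cloud import vision


def label_image(path: str) -> None:
    """Run label detection on a local image and print each label with its score."""
    client = vision.ImageAnnotatorClient()

    # Read the image bytes and wrap them in the API's Image message.
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    # Ask the API for labels; images of people are now labelled
    # neutrally (e.g. 'Person') rather than by gender.
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(f"{label.description}: {label.score:.2f}")


if __name__ == "__main__":
    label_image("photo.jpg")  # hypothetical input image
```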

AI tools used for facial recognition have often been criticised for failing to correctly classify people who are transgender or who do not conform to conventional gender norms. Some tools have also struggled to recognise people of colour.

There have been multiple cases of AI tools misidentifying people’s gender and perpetuating bias as a result. Google faced a major backlash in 2015 when the AI behind Google Photos labelled photos of a user’s Black friends as ‘gorillas’. Google’s fix was simply to block the tool from identifying gorillas at all, and little more was done.

Nevertheless, in 2018 Google released its AI principles, one of which focuses primarily on avoiding unfair bias, ensuring oversight and addressing other ethical issues that could arise with AI.
