
Building AI without women will lead to biased results: Microsoft


As artificial intelligence (AI) becomes the talk of the town, building AI-based solutions without the inclusion of women would give way to a technology that is inherently biased, a top Microsoft executive has said. According to the World Economic Forum’s “Global Gender Gap Report 2018”, only 22 per cent of AI professionals globally are female, while almost a third (32 per cent) of respondents believe that gender bias remains a major hurdle in the industry’s recruitment process.

“If AI systems are built only by one representative group such as all male, all Asian or all Caucasian, then they are more likely to create biased results,” said Mythreyee Ganapathy, Director – Programme Management, Cloud and Enterprise, Microsoft.

Data sets that will be used to train AI models need to be assembled by a diverse group of data engineers.

“A simple example is data sets used to train speech AI models: when they focus primarily on adult speech samples, they unintentionally exclude children, and hence the models are unable to recognise children’s voices,” Ganapathy added.
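The failure mode Ganapathy describes can be sketched in a few lines. The toy “model” below is just a pitch cutoff fitted on a dataset that is 95 per cent adult voices; the pitch ranges, sample counts, and the classifier itself are illustrative assumptions, not any real speech system.

```python
import random

random.seed(0)

# Toy stand-in for a speech dataset: each sample is (pitch_hz, label).
# Adult voices here cluster around 120-220 Hz, children's around 250-400 Hz.
# These numbers are illustrative assumptions, not measured data.
adults = [(random.uniform(120, 220), "speech") for _ in range(950)]
children = [(random.uniform(250, 400), "speech") for _ in range(50)]
noise = [(random.uniform(500, 800), "noise") for _ in range(1000)]

train = adults + children + noise

# A crude "model": treat anything above a learned pitch cutoff as noise.
# Fitting on adult-dominated data pushes the cutoff down toward adult range.
speech_pitches = sorted(p for p, label in train if label == "speech")
cutoff = speech_pitches[int(0.95 * len(speech_pitches))]  # 95th percentile

def is_speech(pitch):
    return pitch <= cutoff

# Evaluate separately on the represented and under-represented groups.
adult_acc = sum(is_speech(p) for p, _ in adults) / len(adults)
child_acc = sum(is_speech(p) for p, _ in children) / len(children)
```

Because children contribute only 5 per cent of the speech samples, the learned cutoff lands near the bottom of the children's pitch range, so the model recognises adults almost perfectly while rejecting nearly all children's voices.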

India is at the 108th spot in the gender gap index, according to the World Economic Forum’s 2018 report. It also has one of the lowest rates of women’s participation in the labour market, at 27 per cent. Including a broader set of people would increase the diversity of AI teams; more than half of women globally (52 per cent) perceive the tech sector to be a “male” industry, the report adds.

To balance the gender gap in the country, the tech giant promotes the study of computer science at traditionally female colleges and other universities.

“We believe that attracting, developing and helping women in STEM fields is vital to ensuring a well-rounded, inclusive society without which we risk having hundreds of thousands of jobs left unfilled and decades of innovation absent of female perspectives,” the Microsoft executive noted.

Corporate and academic AI teams have inadvertently made systems biased against women. For example, tech giant Amazon’s ML experts scrapped a “sexist” AI recruiting tool in October 2018 after they discovered the recruiting engine “did not like women”.

Members of the team working on the system said it effectively taught itself that male candidates were preferable.
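The dynamic the Amazon team described, a model inferring that tokens correlated with past male hires predict “hire”, can be illustrated with a toy scoring scheme. The resumes and tokens below are entirely invented, and the smoothed ratio weighting is a deliberately crude stand-in for the real recruiting engine.

```python
from collections import Counter

# Hypothetical historical data mimicking the reported failure mode:
# past hires skew male, so gendered tokens correlate with the outcome
# even though they say nothing about ability. All text is invented.
hired = [
    "java leadership captain chess club",
    "python systems captain robotics",
    "java captain debate team",
]
rejected = [
    "python leadership women's chess club",
    "java systems women's robotics team",
]

def token_weights(pos, neg):
    """Smoothed per-token ratio: values above 1 favour 'hire'."""
    pc = Counter(" ".join(pos).split())
    nc = Counter(" ".join(neg).split())
    vocab = set(pc) | set(nc)
    # Add-one smoothing so tokens absent from one class don't divide by zero.
    return {t: (pc[t] + 1) / (nc[t] + 1) for t in vocab}

weights = token_weights(hired, rejected)
# "women's" never appears in hired resumes, so its weight falls below 1:
# the model has effectively "learned" to penalise it.
```

Nothing in the data says the penalised token is irrelevant to the job; the model simply mirrors whatever correlations the historical decisions contain, which is exactly why biased training data produces biased rankings.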


