By Lucie Fonseca, Global Head, R&D, Giesecke+Devrient
In its Global Gender Gap Report, the World Economic Forum recently stated that at the current rate of progress it would take another 131 years to reach full parity across its four subindexes (Economic Participation and Opportunity, Educational Attainment, Health and Survival, and Political Empowerment). The managing director at WEF, Saadia Zahidi, said factors that are setting women back include insufficient care infrastructure, workforce disruption from new technologies, and stagnation across sectors. The report also points out that increasing women’s economic participation and achieving gender parity in leadership, in both business and government, are two key levers for addressing broader gender gaps in households, societies and economies.
What is true of the real world is also true of the digital world. Cyberspaces are here to stay. Unless you live in remote wilderness and hunt your own food, it is becoming increasingly impossible to live off the grid, without social media, e-commerce, digital payments, chatbots and access to the internet in general. Women’s basic needs in digital spaces and physical spaces are essentially the same: fairness, safety, accessibility.
In the physical world there are many examples of objects, services and spaces that are falling short of meeting those needs because they were not designed with women in mind.
• Work suits that do not fit women’s bodies, whether for oil & gas workers, construction crews or even astronauts
• Descriptions of heart-attack symptoms that miss the specifics of heart attacks in women, resulting in misdiagnosis and fatal outcomes
• Public spaces designed for boys and men and overwhelmingly occupied by boys and men, including school yards
• Bathrooms in movie theatres, festivals or airports with much longer lines on the women’s side
• Microphones and other audio equipment designed for male physiology that render female voices poorly and make women sound shrill
The Digital World is NO Different
• Voice recognition in cars that does a poor job of recognizing female voices, with automakers suggesting that women learn to speak louder and direct their voices towards the microphone, something men don’t have to do
• Insults, harassment and even virtual assault of female players in video games, pushing many women players to pretend to be male to avoid violence online
• Effective silencing of dissenting female voices on social media (doxing, raids, taking down content through aggressive reporting campaigns etc.)
• Lack of access to technology – the digital divide – particularly affecting women in developing countries and preventing access to precious resources such as online training and financial services. The high price point of certain equipment is an aggravating factor; see for example the recently launched Apple Vision Pro headset
• Biased artificial intelligence algorithms that effectively make sexist decisions, for example in hiring or credit rating
Unless we intervene, the algorithms and AI models that underlie our technology have multiple ways of incorporating bias: 1) from biased data sets, 2) from the conscious or unconscious bias of the human beings labelling the data, 3) from biased model design.
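As an illustration of the first pathway, even a trivially simple model will faithfully reproduce whatever skew exists in its training sample. The sketch below uses hypothetical, deliberately skewed data and a toy majority-vote "model" (not any real system) to show how biased inputs become biased predictions:

```python
from collections import Counter

# Hypothetical, deliberately skewed training sample: occupation records
# paired with the gender shown in each record (pathway 1: biased data).
training_data = (
    [("doctor", "male")] * 90 + [("doctor", "female")] * 10
    + [("nurse", "female")] * 95 + [("nurse", "male")] * 5
)

def train_majority_model(data):
    """Return a naive model that predicts the most frequent gender
    seen for each occupation -- a stand-in for a real classifier."""
    counts = {}
    for occupation, gender in data:
        counts.setdefault(occupation, Counter())[gender] += 1
    return {occ: c.most_common(1)[0][0] for occ, c in counts.items()}

model = train_majority_model(training_data)
print(model["doctor"])  # the skewed sample makes this "male"
print(model["nurse"])   # and this "female"
```

Real classifiers are far more sophisticated, but the failure mode is the same: the model has no notion of what "should" be true, only of what the data shows.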
This is particularly concerning if we think of a world in the relatively near future where digital means, algorithms and artificial intelligence may be involved in a wide range of critical decisions with life-altering potential: law enforcement, access to credit or funding, healthcare, education, employment, social services etc. We have a moral obligation to ensure fairness, accessibility, and safety in these technologies. Not only can models have built-in biases, but they can also amplify existing stereotypes. While this article focuses on gender biases, the same could be said of other biases such as age, race, sexual orientation, disability etc. There are abundant examples of technology biased against all kinds of minorities, in both the physical and digital worlds.
Why and How We Must Intervene
• In training data sets or samples. If all images of doctors are men and all images of nurses are women, it is no surprise if the AI assumes that only men can be doctors and only women can be nurses
• In teams working on tools, software and algorithms. Women are still under-represented in technology, particularly at senior decision-making levels and in the newest fields such as artificial intelligence
• Systematic testing of output across different demographics and their intersection (e.g. gender, race, age etc.)
• Creating virtual safe spaces for minorities. These exist in the physical world as well, for example feminist spaces reserved for women, ensuring participants can exchange without interference or gender dynamics
• Educating users about online safety and how to react or report when they are a victim or witness of online abuse and violence
• Holding perpetrators accountable. Currently, online violence is too often brushed under the carpet as “just virtual, not real”, even though the consequences for mental health and physical safety are often very real
• Using technology to prevent harassment, track suspicious activity and crack down on violent or discriminatory behaviours
Keep it Accessible:
• Factor in various accessibility angles when designing digital solutions: age, different disabilities, wealth, literacy etc.
• Regulation mandating accessibility of all digital services
If we neglect diversity from the start, we face the risk of creating a dystopian cyber-verse where women and other minorities will keep on struggling. If we do it right, and build in diversity, equity and inclusion from the get-go, we can collectively reap the immense benefits of the ‘phygital world’, where the right combination of physical and digital tools and spaces gives us access to a better, easier, and safer life.