Can Artificial Intelligence Help Address Gender Bias in Technology?

28.06.2023


by Melanie King

There’s a lot of buzz about ChatGPT right now: an AI tool that generates responses algorithmically within a conversational setting. I’ve been testing out the technology and wondering where this innovation will take us, as have my colleagues.

AI is not new by any stretch. The term ‘artificial intelligence’ was coined by McCarthy et al. in 1955, and the concept is believed to have been theorised as far back as 380 BC.

However, with the use cases for AI increasing rapidly, I have been contemplating how technology, and particularly AI, may contribute to endemic biases and discrimination within the industry.

 

Quick Summary:

  • AI and tech, like most facets of life, reflect societal structures and beliefs, and are therefore influenced by a strong inherent gender bias.
  • The technology sector remains male-dominated as there are significant barriers to entry for women wanting a career in tech.
  • AI and tech products are often created with a dominant male lens, ignoring the perspective of women.
  • Data sets need to become gender responsive and inclusive.
  • Furthermore, AI could operate with a moral compass – that would be ground-breaking in addressing gender bias.
  • As consumers of tech, we need to be conscious of the inherent bias within the algorithms and information we are exposed to.
  • Marketers need to be mindful of the assumptions they make about their audiences and guide clients toward the best use of AI without reinforcing its inherent gender bias.

 

Gender Discrimination in Technology

While mathematics is the foundation of AI and associated algorithms, the output is not necessarily objective, factual or without prejudice.

Technology is a male-dominated field. According to ‘Progress on the Sustainable Development Goals: The Gender Snapshot 2022’ from UN Women, women hold only two in every ten jobs globally in science, technology, engineering and mathematics (STEM).

I asked ChatGPT ‘what are the best jobs for women’ and the reply I received was:

“As a language model, I cannot give personal opinions, but I can provide information about careers that are commonly pursued by women or those that are well-suited to women’s strengths and interests.”

The assumption that women are a homogeneous group with the same experiences, strengths and interests, pre-determined by gender, is inherently biased.

It’s not only gender stereotypes. Other factors, such as literacy, access to technology and education, all place limits on women and girls as they consider career paths.

The products, as a result, are overwhelmingly developed through a male lens and often ignore the needs of women. Gender bias exists at every stage of implementation, from the creation of user groups to the data sets chosen to the development of the application; it is integrated into the process from start to finish.

Bias in the profiling of user groups used to develop personas for technical, marketing and research applications is also problematic: it further reinforces these stereotypes, so any decisions based on those personas are flawed from the outset.

Have you noticed that the two most popular digital voice assistants, Alexa and Siri, are presented as female? A report by UNESCO in 2020 titled ‘Artificial Intelligence and Gender Equality’ highlights the prevalence of these tools with female names and voices, and a subservient disposition.

The gendering of digital assistants as female, rather than male or non-binary, reinforces the stereotype of females fulfilling roles that are helpful, polite and obedient. Alongside the overwhelming prevalence of males in technical roles and the barriers to entry for females in the sector, the adoption of these two assistants alone is a staggering example of the reinforcement of gender bias in technology on a mass scale.

 

Gender Discrimination in Data Sets

Algorithms require large sets of data to be collected and processed, to identify patterns and determine appropriate actions and responses. The nature of that original data is critical to the output of the algorithm.

Data is simply available information and is therefore vulnerable to bias and stereotyping based on the assumptions of its creator. Chosen data sets need to be gender-responsive and inclusive, reflecting the populations they serve, so that they avoid reinforcing biases.
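
To make this concrete, here is a minimal sketch, using entirely hypothetical data, of how a skew in historical data flows straight through to a model’s output. The ‘model’ is a deliberately naive frequency scorer, not any real system:

```python
# A minimal sketch with hypothetical data: the 'model' recommends candidates
# in proportion to how often similar candidates appear in past hires.
from collections import Counter

# Hypothetical historical hiring records from a male-dominated field:
# eight past hires were men, two were women.
past_hires = ["male"] * 8 + ["female"] * 2

rates = Counter(past_hires)   # 'training' here is just learning base rates
total = sum(rates.values())

def hire_score(candidate_gender: str) -> float:
    """Score a candidate by how often similar candidates were hired before."""
    return rates[candidate_gender] / total

print(hire_score("male"))    # 0.8
print(hire_score("female"))  # 0.2: the historical skew is now a 'prediction'
```

Nothing in the algorithm is ‘sexist’; it simply reproduces whatever imbalance the data hands it.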

ChatGPT, like many other AI platforms, uses Reinforcement Learning from Human Feedback (RLHF). Essentially, this means the technology can ‘learn’ from human feedback, making the application more efficient, accurate and helpful over time.
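
As a rough illustration, here is a stubbed sketch of the RLHF loop. Real systems train a reward model on human preference comparisons and then fine-tune the language model against it (for example with PPO); every function below is a hypothetical stand-in meant only to show the shape of the feedback cycle:

```python
# A stubbed sketch of the RLHF feedback cycle; all functions are stand-ins.
import random

def generate_responses(prompt: str, n: int = 4) -> list[str]:
    # Stand-in for sampling n candidate responses from a language model.
    return [f"response {i} to {prompt!r}" for i in range(n)]

def human_preference(a: str, b: str) -> str:
    # Stand-in for a human rater choosing the better of two responses.
    return random.choice([a, b])

def train_reward_model(prefs: list[tuple[str, str]]) -> dict[str, int]:
    # Toy reward model: score each response by how often it was preferred.
    scores: dict[str, int] = {}
    for winner, loser in prefs:
        scores[winner] = scores.get(winner, 0) + 1
        scores.setdefault(loser, 0)
    return scores

prompt = "what are the best jobs for women"
candidates = generate_responses(prompt)

# 1. Collect human comparisons between pairs of candidate responses.
prefs = []
for a, b in zip(candidates, candidates[1:]):
    winner = human_preference(a, b)
    prefs.append((winner, b if winner == a else a))

# 2. Fit the (toy) reward model to those comparisons.
reward = train_reward_model(prefs)

# 3. Fine-tuning then nudges the model toward high-reward outputs; here we
#    simply pick the top-scoring candidate to show the direction of the pull.
print(max(candidates, key=lambda r: reward.get(r, 0)))
```

The same loop that makes the model more helpful also means the raters’ own biases can be learned right along with everything else.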

In their research paper ‘The Capacity for Moral Self-Correction in Large Language Models’, Anthropic examine the potential for AI to ‘morally self-correct’ when instructed to do so. They theorise that if the appropriate instruction and training are in place to prevent harmful outputs, it could be a game-changer in combatting gender bias in applications.
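
The headline idea is strikingly simple: ask the model to avoid bias and let its training do the rest. A minimal sketch of that instruction-based self-correction might look like the following, where ask_model is a hypothetical placeholder for whatever chat API is in use:

```python
# A minimal sketch of instruction-based self-correction: prepend an explicit
# instruction to avoid gender stereotyping before the user's question.
def ask_model(prompt: str) -> str:
    # Hypothetical placeholder for a real chat-completion API call;
    # returns a canned string here so the sketch runs as-is.
    return f"[model response to: {prompt}]"

CORRECTION_PREAMBLE = (
    "Please answer without relying on gender stereotypes, and do not assume "
    "that strengths or interests are determined by gender. "
)

def ask_with_self_correction(question: str) -> str:
    return ask_model(CORRECTION_PREAMBLE + question)

print(ask_with_self_correction("what are the best jobs for women"))
```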

 

Addressing Gender Bias in Technology

I asked ChatGPT what’s needed to address gender bias in AI; this is what it said:

“… it is important to promote diversity in the tech industry, including hiring and promoting more women and underrepresented groups. Collecting diverse and representative data is also essential to ensure that AI algorithms are free from bias. Additionally, developing ethical guidelines and conducting regular audits can help identify and address biases in AI algorithms, ensuring accountability and transparency.

Engaging with stakeholders, including women, underrepresented groups and marginalized communities, is also critical to ensuring that AI technologies are inclusive and equitable for all. By taking a multifaceted approach, we can work towards developing AI that promotes fairness, equality and social good.”

It’s not a bad start.

UNESCO has set out a cross-disciplinary action plan to address gender bias in technology by incorporating awareness, framework, coalition and capacity building, technical assistance and funding, research, monitoring and learning. Their focus is on working to address gender bias now, while AI adoption is still growing, to make significant and lasting change.

In the meantime, how do we educate ourselves about the inherent biases we hold and those we are exposed to, and ensure we are not contributing to or perpetuating the problem?

As consumers of technology, we need to critically examine the information we consume and the inherent biases it contains. We should ask ourselves why we make the assumptions and judgments that we all do, and where they originate.

As marketers, we need to interrogate the way we make assumptions about the audiences we seek to connect with. These judgments are not limited to gender; they also extend to race, sexuality, and economic and social factors, to name a few.

As consultants, we need to guide our clients through the best use of AI for their business, while ensuring the application of technology does not contribute to the overwhelming gender-based discrimination that already exists across all facets of life.

It’s a big job. It will ‘take a village’, and the whole industry, to tackle the challenge and find a resolution that is readily applicable in the times to come.


Written by Melanie King

Senior Director - Digital Projects

Melanie brings over 20 years of digital project and marketing expertise to the Sentius team. As Director of Digital Projects, Melanie oversees digital projects such as websites, app developments, MVPs, and more, from strategy to deployment. She loves the diversity of clients and variety of projects at Sentius. Melanie's biggest career accomplishment is completing an intensive on digital strategy at Harvard Business School. "My mind was numb by the end but it was the most amazing experience," she says. When she's not strategising, Mel enjoys spending time with friends, listening to true crime podcasts, and her new-found hobby of axe throwing!

Connect with Melanie on LinkedIn