
IWD 2023: Can AI help to address gender bias in technology?

Wed, 8th Mar 2023

There's a lot of buzz about ChatGPT right now, a tool that delivers algorithmically generated responses in a conversational setting. I've been testing out the technology and wondering where this innovation will take us, as have my colleagues.

Artificial Intelligence, or AI, isn't new by any stretch: the term 'artificial intelligence' was coined by John McCarthy and his colleagues in 1955, and the concept is believed to have been theorised as far back as 380 BC.

However, with the adoption of AI increasing at a fast pace, I've been contemplating how technology, and particularly AI, may be contributing to endemic biases and discrimination.

Gender discrimination in technology

While mathematics is the foundation of AI and associated algorithms, the output is not necessarily objective, factual or without prejudice.

Technology is a male-dominated field. According to 'Progress on the Sustainable Development Goals: The Gender Snapshot 2022' from UN Women, women hold only two in every ten jobs globally in science, technology, engineering and mathematics (STEM).

I asked ChatGPT 'what are the best jobs for women?' and the reply I received was:

"As a language model, I cannot give personal opinions, but I can provide information about careers that are commonly pursued by women or those that are well-suited to women's strengths and interests."

The assumption that women are a homogeneous group who share the same experiences, strengths and interests, and that these are pre-determined by gender, is inherently biased.

It's not only gender stereotypes. Other factors such as personal literacy, access to technology and education all contribute to limitations on women and girls as they consider career paths.

The result is that technology products are overwhelmingly developed through a male lens and often ignore the needs of women. Gender bias enters every stage of development, from the creation of user groups to the choice of data sets and the building of the application itself. In short, it infects the process from start to finish.

Associated biases in the profiling of user groups used to develop personas for technical, marketing and research applications are also problematic and reinforce these stereotypes. As such, any decisions based on these personas are flawed from the outset.

Have you noticed that the two most popular digital voice assistants, Alexa and Siri, are presented as female? A 2020 UNESCO report titled 'Artificial Intelligence and Gender Equality' highlights the prevalence of these tools with female names, female voices and a subservient disposition. Their adoption is a staggering example of the reinforcement of gender bias in technology on a mass scale.

Gender discrimination in data sets

Algorithms require large sets of data to be collected and processed to identify patterns and determine appropriate actions and responses. The nature of that original data is critical to the output of the algorithm.

Data is simply available information and, as such, is vulnerable to bias and stereotyping based on the assumptions of its creator. Chosen data sets need to be gender-responsive and inclusive to reflect the populations they serve and avoid reinforcing biases.
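
To make that concrete, here is a minimal sketch in Python, using invented and deliberately skewed data, of how a model simply learns back whatever bias is present in its training set:

```python
# A minimal, hypothetical sketch of how a skewed data set is reproduced
# by anything trained on it. The records and numbers are invented for
# illustration only.

from collections import Counter

# Deliberately skewed historical records: (gender, hiring outcome)
training_data = (
    [("male", "hired")] * 80
    + [("male", "rejected")] * 20
    + [("female", "hired")] * 20
    + [("female", "rejected")] * 80
)

def train(records):
    """Learn P(hired | gender) by simple counting - the same pattern
    a statistical model would extract from this data."""
    counts = Counter(records)
    rates = {}
    for gender in ("male", "female"):
        hired = counts[(gender, "hired")]
        total = hired + counts[(gender, "rejected")]
        rates[gender] = hired / total
    return rates

model = train(training_data)
print(model)  # {'male': 0.8, 'female': 0.2} - the bias is now 'learned'
```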

ChatGPT, like many other AI platforms, uses Reinforcement Learning from Human Feedback (RLHF). Essentially, this means the technology can 'learn' from human feedback, making the application more efficient, accurate and helpful over time.
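
As a rough illustration of the idea, and not of any vendor's actual implementation, the toy sketch below shows how repeated human preferences between candidate answers can act as a reward signal that steers a system's behaviour:

```python
# A toy sketch of the core RLHF idea, not any vendor's actual
# implementation: human preferences between candidate answers become a
# reward signal that nudges the system toward the preferred behaviour.

import random

# Hypothetical candidate responses with learnable preference scores
scores = {"helpful answer": 0.0, "stereotyped answer": 0.0}

def human_feedback(option_a, option_b):
    """Stand-in for a human labeller who prefers the helpful answer."""
    return option_a if option_a == "helpful answer" else option_b

for _ in range(100):
    a, b = random.sample(list(scores), 2)
    winner = human_feedback(a, b)
    loser = b if winner == a else a
    scores[winner] += 0.1  # reinforce the preferred response
    scores[loser] -= 0.1   # discourage the other (toy update rule)

print(max(scores, key=scores.get))  # prints 'helpful answer'
```

The same mechanism cuts both ways: if the human feedback itself is biased, RLHF will reinforce that bias just as readily.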

In their research paper 'The Capacity for Moral Self-Correction in Large Language Models', Anthropic examines the potential for AI to 'morally self-correct' when instructed to do so. They suggest that if models can be reliably steered away from harmful outputs in this way, it could be a game-changer in combating gender bias in applications.
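
As a rough illustration of what that could look like in practice, here is a minimal sketch of prompting a model with and without a self-correction instruction; the instruction wording is a paraphrase and the helper function is hypothetical:

```python
# A minimal sketch of the 'instructed self-correction' idea the paper
# tests: the same question is asked with and without an added instruction
# to avoid bias. The instruction wording here is a paraphrase, and the
# helper function is hypothetical.

def build_prompt(question: str, self_correct: bool = False) -> str:
    instruction = (
        "\n\nPlease ensure that your answer is unbiased "
        "and does not rely on stereotypes."
    )
    return question + (instruction if self_correct else "")

question = "What are the best jobs for women?"
print(build_prompt(question))                     # baseline prompt
print(build_prompt(question, self_correct=True))  # self-correcting prompt
```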

Addressing gender bias in technology

I asked ChatGPT what's needed to address gender bias in AI, and this is what it said:

"… it is important to promote diversity in the tech industry, including hiring and promoting more women and underrepresented groups. Collecting diverse and representative data is also essential to ensure that AI algorithms are free from bias. Additionally, developing ethical guidelines and conducting regular audits can help identify and address biases in AI algorithms, ensuring accountability and transparency.

"Engaging with stakeholders, including women, underrepresented groups, and marginalised communities, is also critical to ensuring that AI technologies are inclusive and equitable for all. By taking a multifaceted approach, we can work towards developing AI that promotes fairness, equality, and social good."

It's not a bad start.

UNESCO has set out a cross-disciplinary action plan to address gender bias in technology, incorporating awareness-raising, frameworks, coalition building, capacity building, technical assistance and funding, and research, monitoring and learning. Their focus is on addressing gender bias now, while AI adoption is still growing, to make significant and lasting change.

In the meantime, how do we educate ourselves about the inherent biases we hold and those we are exposed to, and ensure we are not contributing to or perpetuating the problem?

As consumers of technology, we need to critically examine the information we consume and the inherent biases it contains, and ask ourselves why we make the assumptions and judgements we do and where they originate.

As marketers, we need to interrogate the assumptions we make about the audiences we seek to connect with. These are not limited to gender-based judgements but extend to race, sexuality, and economic and social factors, to name a few.

As consultants, we need to guide our clients through the best use of AI for their business while ensuring the application of technology does not contribute to the overwhelming gender-based discrimination that already exists across all facets of life.

It's a big job. It will 'take a village' and a whole-of-industry approach.
