Tech Is Biased, Because Humans Are.


Bad news: Bias slows down social innovation. Good news: We can fix it.

Photo by Michael Dziedzic on Unsplash

PART 1 — A while ago, during Women’s History Month and on the occasion of the cycle “Empowering women in the workplace” hosted by Aticco Coworking, I listened with great interest to several talks on the state of women in technology.

Laura Fernández, CEO & Co-founder of Allwomen, gave an engaging talk entitled “The women-in-tech movement as an accelerator of social change”, and I had the chance to gather some thoughts and chat with her about the topic. In this two-part article you will find some impressions on the talk and on the current state of technology (part 1), as well as a very special interview with Laura (part 2), which is coming out soon. Stay tuned!

Reflecting on the state of diversity in tech

Steve Jobs and the Macintosh in 1984 by Diana Walker

Let’s do a little exercise. Close your eyes and think of a human being working in technology. What comes to your mind? It is most likely that your brain has made you think of a white, middle-aged man. Am I right?

According to Allwomen’s CEO Laura Fernández, there is a simple explanation for this behaviour: the world we live in is completely biased.

Bias is a disproportionate weight in favor of or against an idea or thing, usually in a way that is closed-minded, prejudicial, or unfair. — Wikipedia

The reality in which we live is so infused with stereotypes that we don’t even realise that we are completely — and in some cases, unintentionally — surrounded by them.

Gender bias starts at playtime

Photo by Raj Rana on Unsplash

Stereotypes are something that people often and mistakenly consider “the norm”. One of the implications of this passive behaviour is that, by doing so, we unintentionally reinforce them and fuel the status quo. An example? Some jobs are “male” and others are “female”.

This fuels the mentality that a career choice should be conditioned by your gender. These limiting and standardising tendencies have long been socially accepted, no matter how superficial they are — and if you identify as a woman, it is probably no news to you.

“Male” and “female” occupations, the so-called gendered jobs, are dangerous not only for the present generations, but also, and especially, for the future ones.

Children may erroneously learn, from a very young age, that finance professionals are men while healthcare professionals are women.

Engineers are men while teachers are women. Police officers are men while HR professionals are women, and so on. Moreover, according to a US study, by the age of 6 girls already consider boys smarter than children of their own gender.

The study suggests that these kinds of stereotypes are absorbed at a very young age and ultimately discourage adult women from entering professions seen to require “special” mental abilities. As a consequence, women are underrepresented in fields such as physics, science or philosophy.

If we consider the STEM sector, there are very few well-known female role models. As a result, girls grow up without examples to follow, and it is undoubtedly difficult to become something you have never seen.

Let’s talk numbers:

  • Less than 25% of global STEM graduates are female.
  • Women hold only a quarter of all tech jobs.
  • Only 5% of tech start-ups are owned by women.
  • Only 11% of executive positions in Silicon Valley companies are held by women.
  • The turnover rate in tech industry jobs is more than twice as high for women as for men: 41% versus 17%. 56% of women in tech leave their employers mid-career.
  • The median male is paid 61% more than the median female.

The stats are staggering and clearly show that we are not there yet.

Artificial intelligence reflects the unfair world we live in

AI in Robotics on Unsplash

In technology, the focus is often on creating products that solve problems and meet the needs of end consumers. However, without proper context, without diversity, and without inclusion, it is difficult to create effective solutions that work at scale. As a result, products fail to meet users’ needs and we reinforce an inadequate environment with no inclusive alternatives.

Even a tech giant like Google has faced criticism. There are several well-known examples of biased and discriminatory Google Images search results: when searching for pictures of a doctor, almost all the results are white men. The term “CEO” returns the same kind of results, mostly white men, while “secretary” mainly returns images of white women. This reflects what was stated earlier: men are leaders and entrepreneurs; women are assistants and caregivers. The results are simply a mirror of the social issues and problems of our times.
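One way to make this kind of skew concrete is to audit search results by counting the labels attached to them. The sketch below is a minimal, hypothetical audit in Python; the labels and numbers are made up for illustration and do not come from any real Google query.

```python
from collections import Counter

def label_share(results):
    """Return the fraction of results carrying each label.

    `results` is a list of label strings, e.g. the perceived gender
    of the person shown in each returned image.
    """
    counts = Counter(results)
    total = sum(counts.values())
    return {label: count / total for label, count in counts.items()}

# Hypothetical audit of the top 20 results for a query:
top_results = ["man"] * 17 + ["woman"] * 3
print(label_share(top_results))  # {'man': 0.85, 'woman': 0.15}
```

Repeating such a count across many queries is, in spirit, how researchers quantify representation gaps in image search.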

Let’s try a different query: Hollywood actors. Things don’t get better; on the contrary, Google’s algorithm gets even more biased. If we search for a male actor, the results mostly focus on his career. If we search for a female actor, they focus on her body, reinforcing sexist clichés.

Google claims to be impartial, but without human oversight of how its algorithm works, that impartiality is hard to verify.

Racism is coded into tech products

Photo by Rolande PG on Unsplash

When we talk about diversity, we do not only refer to gender issues. The racial discrimination and bias practiced against non-white people and the BIPOC community is very present in technology.

An easy and sadly well-known example, also mentioned by Laura during her talk, is the automatic soap dispenser that failed to detect a black customer’s hand. It may seem amusing at first, but it is clear proof of a major and endemic issue present in many technology-based companies: the lack of diversity.

If the majority of people in a company are white, the products developed in that company are likely to present some kind of inherent discrimination. The less diverse a workplace is, the more likely its design flaws will be biased against minorities.

Another common example is facial-recognition systems, which in the US are more likely to misidentify, or fail to identify, African Americans than white people.

This suggests that the conditions in which an algorithm is created can strongly influence the accuracy of its results.
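That influence shows up as a gap in per-group error rates. Here is a minimal sketch, with entirely made-up numbers, of how such a gap could be measured from an evaluation log:

```python
def error_rate_by_group(records):
    """Compute the misidentification rate for each demographic group.

    `records` is a list of (group, correct) tuples, where `correct`
    is True when the system identified the face correctly.
    """
    totals, errors = {}, {}
    for group, correct in records:
        totals[group] = totals.get(group, 0) + 1
        if not correct:
            errors[group] = errors.get(group, 0) + 1
    return {g: errors.get(g, 0) / totals[g] for g in totals}

# Hypothetical log: group "B" is misidentified three times as often.
log = ([("A", True)] * 90 + [("A", False)] * 10
       + [("B", True)] * 70 + [("B", False)] * 30)
print(error_rate_by_group(log))  # {'A': 0.1, 'B': 0.3}
```

A system can look accurate on average while performing far worse for a minority group, which is exactly why audits break results down by group rather than reporting a single number.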

If it’s neutral, it’s not technology

Photo by Luca Bravo on Unsplash

To sum up, it is worth remembering that technology itself is neither sexist nor racist. Technology is just a tool and, as such, it does not have a will of its own.

However, it is impossible to think of technology as something neutral. Any technology has been designed by individuals: human beings who live in societies, with all their cultural frameworks, realities and opinions. And biases.

Tech development cannot be separated from the context — made of individuals and their milieu — in which it is produced. As a consequence, technology becomes discriminatory when there is no diversity in voices and opinions within its creator team.

The solution? We need to take responsibility for our biases and act accordingly.

Education helps create better futures

Photo by Christina Morillo on Unsplash

At the end of her talk, Laura tells us about the mission behind the Allwomen academy: to improve the current state of technology and ultimately offer a better, more balanced and equitable future to all the individuals who identify as women and work, or aspire to work, in technology-related fields.

Allwomen was created with the mission of changing this reality, because things can be different and better, and the future of tech can be female, too. Quoting Laura, the academy is a place where students can realise and develop their potential, always with a global and inclusive outlook.

Social change involves altering the order of a society. It is encouraging to know that Allwomen’s vision is to build a community able to design inclusive technology, solving society’s problems instead of reinforcing them.

