Sorry Ladies… This is Still a MAN’S world!


Bowdlerizing Bias — Equity in Gender Dynamics

Photo by That’s Her Business on Unsplash

One does not need to be exceedingly cerebral to perceive “the elephant in the server room” (D’Ignazio & Klein, 2020): biased AI models result from biased historical training data, which in turn reflects biased human behaviour. To that end, if gender equity is to be achieved within algorithmic systems, then the historical data used for model training must be representative and standardised.
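A first step towards the representativeness the paragraph calls for is simply measuring it. The sketch below is an illustrative helper (not drawn from any of the cited works) that reports each group’s share of a dataset and its gap from parity before training begins; the attribute values and the parity baseline are assumptions for the example.

```python
from collections import Counter

def representation_gap(labels):
    """Return each group's share of the dataset and its gap from parity.

    `labels` is a list of group values (e.g. "female", "male") drawn from
    the training records. A large gap flags unrepresentative data before
    it ever reaches model training. Illustrative sketch only: real audits
    would consider intersectional groups, not a single attribute.
    """
    counts = Counter(labels)
    total = sum(counts.values())
    shares = {group: n / total for group, n in counts.items()}
    parity = 1 / len(counts)  # each group's share under perfect balance
    gap = {group: share - parity for group, share in shares.items()}
    return shares, gap

# A skewed sample of 8 hypothetical records: 6 male, 2 female
shares, gap = representation_gap(["male"] * 6 + ["female"] * 2)
```

Here `shares` comes out as 0.75 male versus 0.25 female, and `gap` shows each group 25 percentage points away from the 50/50 parity baseline, the kind of imbalance that, left uncorrected, a model will faithfully reproduce.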

This necessitates a re-evaluation of the ways in which algorithms are developed, to include thoughtful scrutiny of the data-gathering process, careful curation of the data, and a thorough examination of human interactions, interventions and interference with training data before, during and after the deep learning process. Gebru et al. (2018) further propose that datasets be accompanied by specifications detailing their application, design, dissemination, rationale, framework, and continuation, among other things, with a view to augmenting accountability and transparency.
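The datasheet proposal can be pictured as structured metadata that travels with the dataset. The following is a minimal sketch, not the authors’ own schema: the field names loosely mirror the categories the article lists (the full Gebru et al. proposal poses many more questions per section), and the example values are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Datasheet:
    """Minimal sketch of a 'datasheet for datasets' (after Gebru et al., 2018).

    Field names are an assumption loosely mapped to the categories named
    in the text; the actual proposal is far more detailed.
    """
    name: str
    rationale: str       # why the dataset was created
    design: str          # how instances were collected and curated
    application: str     # intended (and unsuitable) uses
    dissemination: str   # how the dataset is distributed and licensed
    continuation: str    # who maintains it and how it will be updated
    known_biases: list[str] = field(default_factory=list)

# Hypothetical datasheet for a hiring dataset like the one behind
# Amazon's scrapped recruiting tool (Dastin, 2018)
sheet = Datasheet(
    name="hiring-records-2015",
    rationale="Train a resume-screening model.",
    design="Ten years of historical applications; majority male.",
    application="Research only; not for automated rejection decisions.",
    dissemination="Internal use, access-controlled.",
    continuation="Reviewed annually by a data governance team.",
    known_biases=["Gender imbalance reflecting past hiring practices"],
)
```

Forcing dataset creators to fill in fields like `known_biases` and `application` is precisely what makes the accountability the paragraph describes auditable rather than aspirational.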

Bias greatly influences normative behaviour and societal principles; it must therefore be eliminated in order to stymie the continued reinforcement of ideologies that serve to subjugate, for social justice will never be enjoyed otherwise. The conversation on gender bias in machine learning should accordingly be continually broadened, so that best practices can be developed and features of artificial intelligence that support hegemonic power can be eliminated, in an effort to transition to a more democratic, inclusive and equitable digital economy.


BBC (2019). Apple’s ‘sexist’ credit card investigated by US regulator. Available online: [Accessed 28/12/2021].

Buolamwini, J. & Gebru, T. (2018). Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. Proceedings of Machine Learning Research 81:1–15, Conference on Fairness, Accountability, and Transparency, New York, 23–24 February.

Dastin, J. (2018). Amazon scraps secret AI recruiting tool that showed bias against women. Available online: [Accessed 28/12/2021].

D’Ignazio, C. & Klein, L. (2020). Data Feminism. Cambridge: The MIT Press.

Gebru, T., Morgenstern, J., Vecchione, B., Wortman Vaughan, J., Wallach, H., Daumé III, H. & Crawford, K. (2018). Datasheets for Datasets. Available online: [Accessed 2/1/2022].

Lambrecht, A. & Tucker, C. E. (2019). Algorithmic Bias? An Empirical Study of Apparent Gender-Based Discrimination in the Display of STEM Career Ads. Management Science, 65(7), pp. 2966–2981.

Lavanchy, M. (2018). Amazon’s sexist hiring algorithm could still be better than a human. Available online: [Accessed 28/12/2021].

Leavy, S. (2018). Gender Bias in Artificial Intelligence: The Need for Diversity and Gender Theory in Machine Learning. Available online: [Accessed 1/1/2022].

Nasiripour, S. & Farrel, G. (2021). Goldman Cleared of Bias in New York Review of Apple Card. Available online: [Accessed 1/1/2022].

Smith, G., & Rustagi, I. (2021). When Good Algorithms Go Sexist: Why and How to Advance AI Gender Equity. Stanford Social Innovation Review.

Young, E., Wajcman, J. and Sprejer, L. (2021). Where are the Women? Mapping the Gender Job Gap in AI. Policy Briefing: Full Report. The Alan Turing Institute.


