Gender Bias
Abstract:
Adjectives are gendered. Certain words in our vernacular are commonly associated with men, while others are commonly associated with women, such as "catty" and "hysterical," the latter because of the word's historical association with female emotion.
I wanted to explore this concept in the context of machine learning. Algorithms are sets of instructions fabricated by humans for machines. Because of that human intervention, implicit and unconscious gender bias can carry over into the results an algorithm produces.
Experiment:
Google Images is a library curated by machine learning algorithms and human input. For this experiment, I downloaded image libraries from Google Images using searches for the following adjectives:
- Aggressive vs. Assertive
- Bitch vs. Stern
- Boss vs. Bossy
After downloading the images, I ran my image data through a machine learning algorithm built with TensorFlow to teach my machine to recognize these images. I then downloaded libraries of images for "men" and "women," merged them, and asked the machine to filter the merged set, scoring each image against the questions "What is the most 'Boss'?" and "What is the most 'Bitch'?"
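The filtering step above can be sketched in a few lines: the trained classifier assigns each image a confidence score for a label, and the images are ranked by that score. The scores below are hypothetical stand-ins for TensorFlow model outputs, and the filenames are invented for illustration.

```python
# Sketch of the filtering step: given classifier confidence scores for each
# image against a label (e.g. "Boss"), keep the top-k highest-scoring images.
# The scores here are hypothetical stand-ins for TensorFlow model outputs.

def top_k_by_score(scored_images, k):
    """Return the k (filename, score) pairs with the highest scores."""
    return sorted(scored_images, key=lambda pair: pair[1], reverse=True)[:k]

# Hypothetical output of running the merged men/women library through
# the trained "Boss" classifier: (filename, confidence) pairs.
scores = [("img_001.jpg", 0.42), ("img_002.jpg", 0.91),
          ("img_003.jpg", 0.77), ("img_004.jpg", 0.15)]

print(top_k_by_score(scores, 2))
# -> [('img_002.jpg', 0.91), ('img_003.jpg', 0.77)]
```

In the experiment, `k` would be 30, matching the top-30 portraits below.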
My results are the following:
I used the top 30 results for each question to create a "generative" portrait of Boss and of Bitch.
Bitch – Average of Top 30
Boss – Average of Top 30
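The "average of top 30" portraits can be produced by a per-pixel mean over the top-ranked images. A minimal sketch, assuming same-sized grayscale images represented as lists of pixel rows; real images would be decoded into arrays first (e.g. with Pillow or TensorFlow), and the tiny 2x2 "images" below are placeholders.

```python
# Sketch of the "generative portrait" step: a per-pixel average of the
# top-ranked images. The 2x2 grayscale "images" are illustrative placeholders
# for the 30 decoded photographs.

def average_images(images):
    """Pixel-wise mean of same-sized grayscale images (lists of rows)."""
    n = len(images)
    height, width = len(images[0]), len(images[0][0])
    return [[sum(img[r][c] for img in images) / n for c in range(width)]
            for r in range(height)]

top_images = [
    [[0, 100], [200, 50]],
    [[100, 100], [0, 150]],
]

portrait = average_images(top_images)
print(portrait)  # -> [[50.0, 100.0], [100.0, 100.0]]
```

With color photographs the same mean is taken per channel; the blurred, ghostly quality of the portraits is exactly this averaging made visible.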
Caveats:
- This is not statistically significant because of my small sample size of 100 images. I also ran only 100 training iterations.
- This does not take into account covariance.
- This is not a scientific experiment because there is no controlled variable.
- Correlation does not indicate causation. This experiment exists merely to question and to let viewers draw their own conclusions.
- The intent is to question and create dialogue rather than to draw a conclusion.
- Confounds such as "Hugo Boss" affect the image results because the ML also reads the meta tags in the image files.
- The word "Bitch" historically means a female dog, which is why its slang meaning is attached to women. This affected the results as the ML processed and learned from the images. The two portraits therefore cannot be viewed as one project, but rather as two separate entities, because the two words carry different histories: "Boss" is traditionally more gender-neutral, yet our patriarchy has traditionally placed men in leadership roles.