UW doctoral student Matt Kay, left, and assistant professor of human centered design and engineering Sean Munson, right, recently released a study finding that Google image search results significantly underrepresent women in certain professional positions.

Take a moment to do a Google search for an image of a CEO. Four rows of men, mostly white, will appear. On the fifth row, an image of CEO Barbie, then more men. 

This set of images is not an accurate portrayal of American CEO demographics. Women hold 27 percent of CEO positions, compared to a mere 11 percent depicted in a Google image search.

In a recent study, a small team of researchers from the UW and the University of Maryland highlighted and analyzed this gender bias in online image search results. 

The team, made up of UW computer science and engineering Ph.D. student Matthew Kay, University of Maryland assistant professor of computer science and electrical engineering Cynthia Matuszek, and UW assistant professor in human centered design and engineering Sean Munson, began work on this study in 2012.

Interestingly, Munson said he developed the idea for the study while watching a presentation on robot babies.

“The person who was presenting was talking about caregivers and put up a very stereotypical image of a woman in her 40s,” Munson said. “Then, whenever they talked about the actual results of the study they would show images of male grad students because that was who was actually running the study.”

When participants in the study were asked to identify which images portrayed a more professional and appropriate-looking person for a given occupation, they didn't tend to exhibit blatant sexism in their choices. To the surprise of the researchers, participants instead clung to stereotypes: when an image matched the stereotype of the profession, it was more likely to be selected.

Still, the study demonstrated a systematic underrepresentation of women in image search results. According to the study, an occupation split evenly between women and men would show only 45 percent women in the image search results.

A portion of the women who were represented were often highly sexualized. Munson, Kay, and Matuszek call this the "sexy construction worker problem."

Looking for an image of a female construction worker? Search for one. You will see a cartoon male construction worker before coming across the first woman, who is topless with a piece of construction tape across her breasts labeled “construction area.” 

“The women in a male-dominated field might appear to be less professional in scantily clad Halloween costumes,” Munson said. 

In reality, there are fewer female construction workers than male, so an image of a male construction worker will more often be selected. The opposite can be said for nurses, a field heavily populated with women. Since the three began their research, Getty Images, in partnership with Sheryl Sandberg, chief operating officer of Facebook, has released a set of images that more accurately portray women.

Both Sandberg and the UW team hope their work will encourage search engines to recognize that their algorithms promote negative stereotypes and that search results can mold a person's perception of a given occupation.

“A lot of groups have the ability to be part of the solution and so can we when we choose which images we select,” Munson said. “The algorithms take into account what we choose when they choose what images to show the next person.”


Reach News Editor Kate Clark at science@dailyuw.com. Twitter: @KateClarkUW
