AI has a problem with women’s bodies
A reflection of our society’s impossible beauty standards
AI does not like women very much.
I’m not writing anything new, but I do bring visuals. Skeptics always need proof.
Technology should make our lives easier. If there is something about it that is harming over half the population, well, we should discuss it.
Bias in AI algorithms
The Guardian published an investigative piece about AI algorithms and how they “rate photos of women as more sexually suggestive than those of men, especially if nipples, pregnant bellies or exercise is involved.”
How and who trains AI tools matters. How and who defines algorithms matters.
If only a narrow group of people, such as straight white men, codes and trains artificial intelligence, it makes sense that their views shape the tools we use.
No, not ALL white men are erasing women’s experiences on purpose. Yet, even the most self-aware person can’t escape their biases.
Women’s perspective is not included
In a Forbes article, Carmen Niethammer explains how many medical algorithms are built on data from U.S. military personnel, a population in which women represent as little as six percent in some areas. As a result, mobile applications develop their algorithms from majority-male data.
“There are huge data gaps regarding the lives and bodies of women,” finds Prof. Dr. Sylvia Thun, director of eHealth at Charité of the Berlin Institute of Health.
It wouldn’t be the first time research focused on men only. Throughout history, fields of study, governments, and other institutions have excluded the female perspective.
For instance, urban planning rarely considers the needs of women or children; cities were not designed with women in mind.
Scientists and doctors did not know how the female body worked. Most clinical trials are still conducted on men. Even being believed when we say we are in pain is a challenge; many healthcare professionals continue to dismiss our experience.
But, AI is a different monster altogether.