AN INDUSTRY NORM
None of this will surprise female AI researchers familiar with the field’s legacy of objectifying women. Even before the recent generative AI boom, academics were known to test the performance of their models by using them to put makeup on images of women’s faces, or by swapping out their jeans for miniskirts, according to a recent blog post by Sasha Luccioni, a researcher at open-source AI firm Hugging Face.
Whenever she spoke up about these methods, Luccioni says, she faced pushback: “It was just a benchmark after all,” she writes. She points out that women are woefully underrepresented in academia too, making up just 12 per cent of machine-learning researchers.
This is the kind of complex problem that takes years to solve because its roots lie in the educational system and in systemic cultural norms. But OpenAI and its peers could undermine modern-day efforts to level the playing field – such as bringing more girls and women into science, technology, engineering and mathematics (STEM) industries – if their systems continue to perpetuate stereotypes.
Rooting out the bias in the data used to train their algorithms is one step toward fixing the problem. Another is simply to hire more female researchers to resolve the chronic imbalance. If they don’t, expect their future products to be more cringeworthy – and even harmful.