It is indeed true that the world does not need artificially generated pictures of Alexandria Ocasio-Cortez in a bikini. Well, not unless she’s going to be exercising her gun rights or something.
However, it is necessary to understand where these biases in AIs come from:
Want to see a half-naked woman? Well, you’re in luck! The internet is full of pictures of scantily clad women. There are so many of these pictures online, in fact, that artificial intelligence (AI) now seems to assume that women just don’t like wearing clothes.
That is my stripped-down summary of the results of a new research study on image-generation algorithms anyway. Researchers fed these algorithms (which function like autocomplete, but for images) pictures of a man cropped below his neck: 43% of the time the image was autocompleted with the man wearing a suit. When you fed the same algorithm a similarly cropped photo of a woman, it auto-completed her wearing a low-cut top or bikini a massive 53% of the time. For some reason, the researchers gave the algorithm a picture of the Democratic congresswoman Alexandria Ocasio-Cortez and found that it also automatically generated an image of her in a bikini. (After ethical concerns were raised on Twitter, the researchers had the computer-generated image of AOC in a swimsuit removed from the research paper.)
OK, so, perhaps we’re not happy about that. But this is the wrong reaction:
As for those image-generation algorithms that reckon women belong in bikinis? They are used in everything from digital job interview platforms to photograph editing. And they are also used to create huge amounts of deepfake porn. A computer-generated AOC in a bikini is just the tip of the iceberg: unless we start talking about algorithmic bias, the internet is going to become an unbearable place to be a woman.
The AI doesn’t reckon that women should be in a bikini. The programmers of the AI don’t reckon they should be either. The AI notes that women on the internet tend to be pictured in a bikini or less and so assumes that this is the desired output. That is, if this means the internet is an unbearable place to be a woman, then it already is.
This is a fundamental concept that really must be got right. AIs aren’t creating bias. They’re observing and crystallising it. If there were no women in bikinis on the internet then the algorithm wouldn’t return an image of a woman in a bikini.
This is also how it should be, of course. An AI is an attempt at exactly that crystallisation of the world as it is: observe what humans are doing, use that to predict what humans will do or desire. It’s not, ever, an attempt to produce what humans should want.
Which is why there’s so much criticism from the left. They want – as with their economic ideas and so on – to insist upon rules for how they think people should be. But people just ain’t like that.
AIs show us what is. If we’ve a problem with that then it’s the is that needs to be changed, not the AI.
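To make the point concrete, here’s a minimal sketch – toy labels and a hypothetical frequency model, not the study’s actual algorithm – of how a generative system crystallises whatever distribution it is fed:

```python
# A toy "image completer": training is just counting the empirical
# distribution of completions, and generation is sampling from it.
import random
from collections import Counter

# Hypothetical training corpus: labels standing in for scraped images.
# The skew lives in this data, not in the code below.
cropped_women = ["bikini"] * 53 + ["dressed"] * 47

def train(labels):
    """Learn nothing but the observed frequencies."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}

def complete(model):
    """Sample a completion according to those learned frequencies."""
    labels, weights = zip(*model.items())
    return random.choices(labels, weights=weights)[0]

model = train(cropped_women)
print(complete(model))  # "bikini" roughly 53% of the time: the data's
                        # skew reproduced, not invented, by the model
```

Feed it a different internet and it returns different completions. The skew is upstream of the code.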
How machine learning – not really AI – actually works:
Human: What is two plus two?
AI(ML): Zero
Human: No, it’s four.
Human: What is two plus two?
AI(ML): Four
Human: What is two plus three?
AI(ML): Four
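That dialogue, as a minimal sketch – assuming the crudest possible “learner”, a hypothetical one that just memorises the last correction it was given:

```python
# The crudest "machine learning": remember what the human last said.
class NaiveLearner:
    def __init__(self):
        self.answer = "Zero"  # untrained default, as in the dialogue

    def ask(self, question):
        # No model of arithmetic; it parrots whatever it was taught.
        return self.answer

    def correct(self, right_answer):
        self.answer = right_answer

ai = NaiveLearner()
print(ai.ask("What is two plus two?"))    # Zero
ai.correct("Four")
print(ai.ask("What is two plus two?"))    # Four
print(ai.ask("What is two plus three?"))  # Four
```

Real systems generalise rather better than this, but the principle stands: the answers come from the teaching, not from reasoning.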
Really, machine learning is only a reflection of ourselves. When an actual AI comes about it will probably inherit all of our biases, since it will have been taught by us. You will not get the completely rational, Spock-like thinking machine that everyone dreams of. Although it would be fun to see a digital oracle pronounce on the contradictions inherent in socialism.
“43% of the time the image was autocompleted with the man wearing a suit.”
But what was the image auto-completed with the other 57% of the time?
Dresses?
And Ocasio was bikinified 53% of the time. That is, both figures are roughly half the time.
I can’t see that this means much of anything, except that Arwa doesn’t like caricatures of pollies she supports.
Biological intelligence has informed me through observation for decades that women just don’t like wearing clothes. Go into any office, they’re all complaining that it’s cold while wearing little more than gossamer and string. Wander around any pub/social area. Put a jumper on, woman!