This study is drawing entirely the wrong conclusions from the research that was done. Or perhaps, to put it mischievously, there's another way of interpreting these results. Which is that trans people aren't really trans. Or at least, to the human or robot eye, they're not.
This would actually accord with what we know about the impact of hormones upon looks. Things like jawlines, brow ridges, the perhaps subtle underlying bits and bobs that make up “male” and “female” looks. Not because they ought to, not because it is righteous that they do, but just because that’s the way this universe works. Going through puberty with one set of raging hormones produces a subtly different facial structure than with the other.
Which is why, of course, there are such things as facial feminisation surgery and, without looking it up but being logical, the reverse.
The human face differs, in identifiable ways, dependent upon birth sex. Which is why this:
Facial recognition software is unable to recognise trans people, a university study has suggested.
The University of Colorado Boulder in the US set out to investigate the accuracy of facial analysis technology with transgender people and those who classify themselves as gender non-binary.
Researchers collected almost 2,500 images of faces from Instagram which had a hashtag indicating their gender identity, including #women, #men, #transwoman, #transman, #agender, #agenderqueer, #nonbinary.
The images were then analysed by four of the largest providers of facial analysis services: IBM, Amazon, Microsoft and Clarifai.
Researchers found that on average the systems were most accurate with photos of cisgender women, getting their gender right 98.3 per cent of the time. Cisgender men were categorised accurately 97.6 per cent of the time.
However, while the facial recognition software was often accurate with cisgender people, the researchers found that it struggled to identify transgender people: trans men were wrongly identified as women up to 38 per cent of the time.
And those who identified as agender, genderqueer or nonbinary – people who do not identify as either male or female – were mischaracterised 100 per cent of the time.
The human eye/brain combination is really very good at a certain task – pattern recognition. Sure, it goes haywire at times, that silhouette of the vase/two-faces illusion and all that. It's also particularly good at that pattern recognition of human faces thing, and for good evolutionary reason. We're descended from people who knew whether to lash out or legover dependent upon that glance at the face.
AIs and robots are simply attempts to mechanise what we all already know how to do. They're even trained by – initially at least – a human looking at the pattern and assigning an answer. Then again and again and again. That is, we're encoding what people already do, which is the correct way to be training an AI. We do want it, after all, to be trying to describe this universe, not one in which non-existent but fashionable phantasms prevail.
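That human-labels-first training loop can be sketched in a few lines. This is a minimal toy, not how any of the commercial systems above actually work: the feature names, numbers, and nearest-centroid classifier are all hypothetical stand-ins chosen only to show the shape of the idea – humans assign answers to examples, the machine encodes the pattern and repeats it on new inputs.

```python
# Toy supervised classifier: humans label examples, the machine learns
# the pattern. All features and values below are hypothetical.

def centroid(rows):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def train(examples):
    """examples: list of (features, human_label). Returns per-label centroids."""
    by_label = {}
    for features, label in examples:
        by_label.setdefault(label, []).append(features)
    return {label: centroid(rows) for label, rows in by_label.items()}

def classify(model, features):
    """Pick the label whose centroid is nearest (squared Euclidean distance)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(model, key=lambda label: dist(model[label]))

# Hypothetical human-labelled data: [jaw_width, brow_ridge] per face.
labelled = [
    ([0.90, 0.80], "male"), ([0.85, 0.90], "male"),
    ([0.40, 0.30], "female"), ([0.35, 0.25], "female"),
]
model = train(labelled)
print(classify(model, [0.88, 0.85]))  # near the "male" centroid
print(classify(model, [0.38, 0.30]))  # near the "female" centroid
```

The point of the sketch: the machine has no opinions of its own. Its answers are a compressed replay of whatever the human labellers saw, which is exactly why it reports what the human eye reports.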
So, why can’t AI recognise trans faces? Because humans don’t either. Sure, we might be polite and use Ms. or Mr. dependent upon social convention but that underlying structure rarely fools the eye. And thus the AI reports as it does.
The answer here is that trans isn’t really trans, not in facial structure. Not that this is going to be the accepted and fashionable interpretation of this research but then we knew that, right?