Scientific Proof, Trans Is Not Really Trans

This study is drawing entirely the wrong conclusions from the research that was done. Or rather, to admit it, there's a mischievous way of interpreting these results: that trans people aren't really trans. Not, at least, to the human or robot eye.

This would actually accord with what we know about the impact of hormones upon looks. Things like jawlines, brow ridges, the perhaps subtle underlying bits and bobs that make up “male” and “female” looks. Not because they ought to, not because it is righteous that they do, but just because that’s the way this universe works. Going through puberty with one set of raging hormones produces a subtly different facial structure than with the other.

Which is why, of course, there are such things as facial feminisation surgery and, without looking it up but being logical, the reverse.

The human face differs, in identifiable ways, dependent upon birth sex. Which is why this:

Facial recognition software is unable to recognise trans people, a university study has suggested.

The University of Colorado Boulder in the US set out to investigate the accuracy of facial analysis technology with transgender people and those who classify themselves as gender non-binary.

Researchers collected almost 2,500 images of faces from Instagram which had a hashtag indicating their gender identity, including #women, #men, #transwoman, #transman, #agender, #agenderqueer, #nonbinary.

The images were then analysed by four of the largest providers of facial analysis services: IBM, Amazon, Microsoft and Clarifai.

Researchers found that on average the systems were most accurate with photos of cisgender women, getting their gender right 98.3 per cent of the time. Cisgender men were categorised accurately 97.6 per cent of the time.

However, while the facial recognition software is often accurate with cisgender faces, the researchers found that it struggled to identify transgender people. Trans men were wrongly identified as women up to 38 per cent of the time.

And those who identified as agender, genderqueer or nonbinary – people who do not identify as either male or female – were mischaracterised 100 per cent of the time.

The human eye/brain combination is really very good at a certain task – pattern recognition. Sure, it goes haywire at times, that silhouette of the candle/two faces illusion and all that. It's also particularly good at pattern recognition of human faces, and for good evolutionary reason. We're descended from people who knew whether to lash out or legover dependent upon that glance at the face.

AIs and robots are simply attempts at mechanisation of what we all already know how to do. They're even trained by – initially at least – a human looking at the pattern and assigning an answer. Then again and again and again. That is, we're encoding what people do, which is the correct way to be training an AI. We do want it, after all, to be trying to describe this universe, not one in which non-existent but fashionable phantasms prevail.

So, why can’t AI recognise trans faces? Because humans don’t either. Sure, we might be polite and use Ms. or Mr. dependent upon social convention but that underlying structure rarely fools the eye. And thus the AI reports as it does.

The answer here is that trans isn’t really trans, not in facial structure. Not that this is going to be the accepted and fashionable interpretation of this research but then we knew that, right?

Comments
jgh

Well, as identity is on the inside, of course any examination of the outside is not going to be able to read somebody’s mind.

Arthur the Cat

I've known half a dozen trans people in my lifetime, all M2F, and would say that in their case the Adam's apple and wrists are even more of a giveaway than the facial structure.

Mike Finn

This is to fall for the AI bias problem. The algorithm has been trained without (many) trans faces, so it’s no surprise it gets them wrong. If you retrain it with *anything* people want to be identified as from looks then it will start to align.

Quentin Vole

I’m afraid I’m behind the curve here. I can recognise a chick with a dick at 20 paces, but is xe a #transman or #transwoman?