The artificial intelligence that runs Apple’s new credit card – in association with Goldman Sachs – is apparently sexist. Good, it should be.
For the point about AIs, bots and all the rest is that they're an attempt to describe the world around us. They're only useful or effective insofar as they illuminate something of the reality surrounding us.
Take an example: should Burmese children be stunted through malnutrition? No, in fact we'd all hope rather hard that they won't be, and as soon as capitalism and free markets roll in there they'll stop being so. But should an AI looking at the heights of children acknowledge that Burmese children are stunted?
For we’re trying to describe reality, not what should be.
So, the Apple credit card, it gives lower credit limits to women:
As it turns out, the card seems to be as allergic to women as it is to leather. Last week, David Heinemeier Hansson, a high-profile tech entrepreneur, tweeted that the card was “sexist” because it gave him 20 times more credit than his wife – seemingly for the sole reason of his gender. Hansson’s tweet went viral and Apple’s co-founder Steve Wozniak chimed in to say that he had also been given a much higher credit limit than his wife, even though the pair have no separate cards, accounts or assets. New York regulators are now investigating the claims of discrimination.
Are those two more creditworthy than their wives? Apparently so.
As a little side issue, these strong, independent women do want to be treated as individuals, yes? Not merely as accoutrements their husbands have managed to acquire along the way? Therefore they should have independent credit ratings, yes?
Reality might not be what some insist it should be. But do women, in general, have lower earning capabilities than men in our currently malformed society? Yes. So should women, in general, have lower credit ratings? From the point of view of anyone lending money, yes, obviously. And who uses credit-rating AIs and bots? People who lend money. Therefore the AIs should be sexist about credit allocation.
Which brings us to this deeper point:
This statement seems somewhat disingenuous. Like God, algorithms often work in mysterious ways, making opaque decisions not even the program’s creators can understand. Machine-learning algorithms don’t need to be told to take factors such as gender or race into account to make sexist or racist assumptions based on the biased data fed in. And this has worrying ramifications. Increasingly, every aspect of our lives, from our employability to our trustworthiness and our creditworthiness, is being influenced by these opaque algorithms. This isn’t just a PR problem for Apple and Goldman Sachs, it’s a nightmare for us all.
No, machine learning is describing what currently is. That’s the only reason it’s useful too. Which is why the AIs should be exactly as biased and sexist as current society is. That’s actually the whole damn point of them.
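The mechanism the quoted piece gestures at is worth making concrete. A minimal sketch, on entirely invented toy data: a lending model that never sees gender at all can still produce gender-skewed credit limits, simply because a feature it does see (here, income) is correlated with gender in the data it learns from. All names, numbers and distributions below are assumptions for illustration, not anything Apple or Goldman actually do.

```python
# Toy illustration: bias in, bias out. The model below is "gender-blind" -
# gender is never an input - yet its outputs reproduce the earnings gap
# baked into the (invented) training data.
import random

random.seed(0)

# Hypothetical applicant pool: women drawn from a lower income distribution,
# mimicking an earnings gap in the society the data describes.
applicants = (
    [{"gender": "F", "income": random.gauss(32_000, 6_000)} for _ in range(500)]
    + [{"gender": "M", "income": random.gauss(45_000, 6_000)} for _ in range(500)]
)

# "Training": fit limit = k * income by averaging limit/income over historic
# lending decisions (here, past limits were simply 40% of income).
past = [(a["income"], 0.4 * a["income"]) for a in applicants]
k = sum(limit / income for income, limit in past) / len(past)

def credit_limit(income):
    """Model output: a function of income alone, never of gender."""
    return k * income

def mean_limit(gender):
    limits = [credit_limit(a["income"]) for a in applicants if a["gender"] == gender]
    return sum(limits) / len(limits)

print(f"mean limit, women: {mean_limit('F'):,.0f}")
print(f"mean limit, men:   {mean_limit('M'):,.0f}")
```

Run it and the "gender-blind" model hands women a lower average limit than men, because the income gap in the data flows straight through. Whether that is the model being sexist or the model accurately describing a sexist society is exactly the argument being had above.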