
Come And See The Bias Inherent In The System

There’s a logical failing in the current preoccupation with AI, ‘bots and the possibility that they contain bias. Because, of course, the very thing we’re trying to encode is bias – that’s the point and purpose.

Take a general view of the world. We’re going to automate some set of human decisions. You can borrow money, you can’t, say. OK. Does the real world out there offer us some evidence of who may usefully borrow and who may not? Are we being biased or sensible when we deny Dominic Chappell a few hundred million? Or when we change the price of car insurance by postcode?

The entire point of a ‘bot is to try and encode the things we already know. The point of AI is to uncover things in the data that are true but that we don’t as yet know. In both cases the aim is the same: to apply the correct biases to our decision making.
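For concreteness, here is a minimal sketch of that distinction. The lending rule, the figures and the names are invented for illustration, not taken from the article: a hand-written rule that encodes a bias we already hold, next to a trivially “learned” rule that digs a similar bias out of past data.

```python
# Minimal sketch: a hand-coded rule (the 'bot) versus a rule fitted to data (the "AI").
# All names and numbers are hypothetical, purely for illustration.

# 1. The 'bot: encode the bias we already hold.
def can_borrow_rule(annual_income: float, prior_defaults: int) -> bool:
    """Hand-written rule: lend only to people we already believe are good risks."""
    return annual_income > 30_000 and prior_defaults == 0

# 2. The "AI": let past data pick the cut-off instead of us.
#    (A one-parameter stand-in for model training.)
past_loans = [
    # (annual_income, repaid?) -- made-up records
    (18_000, False), (22_000, False), (35_000, True),
    (41_000, True), (27_000, False), (52_000, True),
]

def learn_income_threshold(records):
    """Pick the income cut-off that best separates repayers from defaulters."""
    candidates = sorted(income for income, _ in records)

    def accuracy(threshold):
        return sum((income >= threshold) == repaid for income, repaid in records)

    return max(candidates, key=accuracy)

if __name__ == "__main__":
    print("Hand-coded rule, 40k income, no defaults:", can_borrow_rule(40_000, 0))
    print("Income threshold learned from the data:", learn_income_threshold(past_loans))
```

Either way the output is a discrimination, a bias; the only difference is whether we wrote it down ourselves or let the data write it for us.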

At which point we’ve some dingbat at the UN:

Robotic lie detector tests at European airports, eye scans for refugees and voice-imprinting software for use in asylum applications are among new technologies flagged as “troubling” in a UN report.

The UN’s special rapporteur on racism, racial discrimination, xenophobia and related intolerance, Prof Tendayi Achiume, said digital technologies can be unfair and regularly breach human rights. In her new report, she has called for a moratorium on the use of certain surveillance technologies.

Achiume, who is from Zambia, told the Guardian she was concerned about the rights of displaced people being compromised. She said there was a misconception that such technologies, often considered “a humane” option in border enforcement, are without bias.

“One of the key messages of the report is that we have to be paying very close attention to the disparate impact of this technology and not just assuming that because it’s technology, it’s going to be fair or be neutral or objective in some way.”

Border control technologies are biased because the very point of a border is bias. These peeps can come in, these can’t. That’s bias. So, why’s she complaining?

7 Comments
Ltw
3 years ago

I could understand a problem with false positives barring/hindering people with a right to enter. But Prof Achiume’s problem seems to be with true positives. So her complaint is about racial profiling updated with new technology.

Spike
3 years ago

Hatred for dark skin ought to be resisted. “Disparate impact” means measuring the success rate of people with dark skin and, if disadvantageous, concluding that it is essentially hatred. Likewise Kamala Harris’s vile pre-election video insisting we should all “end up at the same place.” Prof. Achiume explicitly does not want software that is “fair or neutral or objective.”

John Galt
3 years ago

“Border control technologies are biased because the very point of a border is bias. These peeps can come in, these can’t. That’s bias. So, why’s she complaining?”

“‘cos computa says no iz wacist, init?”. In fairness, if you’re going to call the diverse officers of the UK Border Force “racists” when they validly point out that you are a foreigner without a valid visa who has no right to enter the UK, then you’re going to say the same of a computer. Not that it makes any sense, but that seldom seems to matter in these days when accusations of…

Boganboy
3 years ago

One assumes she simply means that black privilege over-rules any preference at all of the wicked white Westerners.

James
3 years ago

The problem is that the literal point of equality legislation is to make statistical discrimination illegal. Statistics tell us that women live longer than men, but the law requires that you charge them the same insurance premium for pensions.

A properly functioning AI program will give you an answer that matches reality rather than one that conforms to the law.

It is very difficult to fix this problem because businesses want economic optimisation and the equality industry wants cross-subsidy.
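A back-of-the-envelope sketch of that pensions example may help. The life expectancies and cohort sizes below are invented for illustration; the point is only to show where the cross-subsidy comes from once the law forces a single unisex price.

```python
# Hypothetical annuity pricing: actuarial (matches reality) vs unisex (matches the law).
# Figures are invented for illustration only.

PAYOUT_PER_YEAR = 1_000  # the annuity pays this each year until death (interest ignored)

expected_years = {"men": 20, "women": 23}   # hypothetical years of payments after retirement
cohort = {"men": 50, "women": 50}           # hypothetical mix of annuity buyers

# Actuarial pricing: charge each group its own expected cost.
actuarial_price = {g: expected_years[g] * PAYOUT_PER_YEAR for g in expected_years}

# Unisex pricing: one price for everyone, set so the pool still breaks even overall.
total_cost = sum(cohort[g] * actuarial_price[g] for g in cohort)
unisex_price = total_cost / sum(cohort.values())

for g in expected_years:
    subsidy = unisex_price - actuarial_price[g]
    print(f"{g}: actuarial {actuarial_price[g]:,.0f}, unisex {unisex_price:,.0f}, "
          f"overpayment {subsidy:+,.0f}")
```

With these made-up numbers the unisex price sits between the two actuarial prices, so the shorter-lived group pays more than its expected cost and the longer-lived group pays less: the cross-subsidy the comment describes.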

NDReader
3 years ago

“The entire point of a ‘bot is to try and encode the things we already know.”
I think that’s ‘knowledge-based systems’.
But I want more than that from AI – I want AI to learn and use the stuff we don’t already know, or that we don’t know that we know, or that we dare not tell it.
Again, though, AI is going to be blamed for knowing stuff.

Quentin Vole
3 years ago
Reply to  NDReader

That might be nice (or it might produce a dystopia), but there’s no chance of it occurring during the next few decades, at least. Artificial General Intelligence has proved so difficult (we’re basically no nearer than in the 60s) that it’s been abandoned in favour of machine learning and better solutions to the ‘travelling salesman problem’.
