As The Police Are Finding Out, Facial Recognition Isn’t Very Good Yet – 80% Failure Rate


This brave new world where we are all surveyed from remote cameras is taking longer to arrive than many thought it would. The reason should be obvious enough. We humans are intensely interested in being able to recognise other human beings. Maybe they’re someone about to attack us, maybe they’re about to offer us food, or sex, or summat. We’re really very interested indeed and we’re descended from hundreds of thousands of years’ worth of those who were also very interested. Thus we’re good at it.

Machines not so much:

Facial recognition technology used by Scotland Yard is wrong in the vast majority of cases and probably illegal, according to the first independent analysis of the system. Scotland Yard has been trialling Live Facial Recognition technology, in which cameras scan the faces of members of the public to compare them with faces on a list of wanted individuals. However, researchers from the University of Essex who were given access to six of ten trials in Soho, Romford and at the Westfield shopping centre in east London, found that the technology picked out faces that were not on a wanted list in 80 per cent of cases.  

It could be that not all scrotes are in fact on a wanted list but to insist upon that would be to fall into flippancy. The correct answer is just that we’re still in the early days of this technology. Google’s image search, as is well known, has had great difficulty in distinguishing between darker-skinned humans and our evolutionary cousins among the apes. This is not a great advertisement for the detailed accuracy of facial recognition:

The force maintains its technology only makes a mistake in one in 1,000 cases – but it uses a different measurement to arrive at this conclusion. The report, exclusively revealed by Sky News and The Guardian, raises “significant concerns” about Scotland Yard’s use of the technology, and calls for the facial recognition programme to be halted.

That’s really very silly as an insistence though. As long as we know how inaccurate the system is we can live with it. Anything that’s wrong 80 per cent of the time isn’t going to get used in court as evidence – not against anything like a decent lawyer, anyway.
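The two headline figures – wrong in 80 per cent of cases, wrong in one in 1,000 – aren’t actually in conflict; they’re the same sort of trial data divided by different denominators. A sketch with entirely made-up numbers (the report doesn’t give us the raw counts) shows how:

```python
# Illustrative, invented numbers for one facial-recognition trial.
faces_scanned = 40_000   # everyone who walked past the cameras
alerts = 40              # faces the system flagged as possible matches
true_matches = 8         # alerts that really were on the watch list
false_matches = alerts - true_matches  # 32 wrong alerts

# The researchers' measure: of the alerts raised, what share were wrong?
error_per_alert = false_matches / alerts        # 0.8, i.e. 80 per cent

# The force's measure: of every face scanned, what share
# produced a wrong alert?
error_per_face = false_matches / faces_scanned  # 0.0008, under 1 in 1,000

print(f"Wrong alerts as a share of alerts: {error_per_alert:.0%}")
print(f"Wrong alerts as a share of faces scanned: "
      f"1 in {faces_scanned // false_matches:,}")
```

Both statements can be true at once; which one matters depends on whether you’re the person wrongly stopped or the statistician counting the crowd.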

To halt it all now would, though, stop all development. It would be like stopping cars when Daimler himself was just polishing his first bumper. Technologies just do take time to mature into actually being useful.

The Neoface system used by the Met and South Wales police is supplied by Japanese company NEC, which markets the same technology to retailers and casinos to spot regular customers, and to stadium and concert operators to scan crowds for “potential troublemakers”. Scotland Yard insisted its deployments were legal and successful in identifying wanted offenders, and that the public would expect it to trial emerging technology.

Yes, quite, it’s got to be tried out otherwise it never will get any better.

There is a different area where we really should have some concern though. The systems used to check passport photos aren’t all that much better, despite operating in rather more controlled circumstances. We should, of course, still be using them so as to experiment with such systems, but relying upon them is a bit premature.