Facial recognition software delivers farcical results

The Information Commissioner has expressed grave doubts about the accuracy of facial recognition software, which has failed miserably in tests, flagging up a 'staggering number' of innocent people, according to campaign group Big Brother Watch.

South Wales Police tested a system that made 2,685 'matches' against its database of known offenders, but 2,541 of them were incorrect, an error rate of over 94%. When the Metropolitan Police Service used the technology at the Notting Hill Carnival in 2016 and 2017, it incorrectly identified 102 people as potential suspects. Leicestershire Police abandoned use of the technology in 2015.

Some forces have defended the system, arguing that checks and balances, including human intervention, prevent false positives from resulting in further action. But Big Brother Watch is concerned that images of innocent people are sometimes stored on databases for an unknown length of time, which would contravene the GDPR, due to come into force later this month.

However, Information Commissioner Elizabeth Denham said police had to demonstrate that facial recognition is "effective" and that no less intrusive methods are available to achieve the same result, adding: "Should my concerns not be addressed I will consider what legal action is needed to ensure the right protections are in place for the public."

This also calls into question the business model of Facewatch, a company that promotes the use of its facial recognition system on private premises. The company's website boasts that it offers the only shared national facial recognition watchlist, giving subscribers full access to its database of 'persons of interest', which is shared geographically with other subscribers in the same area. But if its system is no more accurate than the one employed by the police, and the 'sharing' of images is not deemed proportionate, Facewatch too could fall foul of the Information Commissioner.