Amit Shah said 'facial recognition' identified 1,900 Delhi rioters. How does it work?

The accuracy of the facial recognition software is a worrying factor

Union Home Minister Amit Shah speaks at a foundation stone laying ceremony, in New Delhi | PTI

In a Rajya Sabha debate on Thursday, Home Minister Amit Shah asserted that those responsible for the recent riots in Delhi will be brought to book irrespective of their caste, religion and political affiliations. He said over 700 FIRs have been registered and more than 2,600 people arrested based on evidence. "Those who are responsible for the violence and those who conspired to trigger the riots will be punished irrespective of their caste, religious and political affiliations." Shah said 1,922 faces have been identified using facial identification software. 

However, soon after his statement, a controversy broke out on social media. Some claimed that Aadhaar data was being used in the facial identification software to crack down on the rioters in the street. However, Shah said no Supreme Court guideline on privacy had been violated and no Aadhaar data had been used. He said a detailed scrutiny of videos of the riots was being done, and that only driving licence and voter ID data were being used in the software.

How does Facial Recognition software work?

In 2018, the home ministry had announced the Automated Facial Recognition System (AFRS) project to help law enforcement agencies identify criminals, missing people and unidentified bodies in a scientific and speedy manner. The AFRS, under the aegis of the National Crime Records Bureau (NCRB), is a component of the Crime and Criminal Tracking Network and Systems (CCTNS), a national database of crimes and criminals. The software would be used only in respect of persons who figure in the CCTNS database—accused persons, prisoners, missing persons, unidentified found persons including children, and unidentified dead persons—and not on any other database, the ministry had earlier said.

Just as the police use fingerprint matching in investigations—comparing fingerprints found at a crime scene with a fingerprint database—the AFRS will add another information layer to investigations by matching the photograph of a suspect or missing person against the photo database of the CCTNS.

Tarun Wig, founder of INNEFU Labs which built the system now being implemented by the Delhi police, said, “What AFRS does is detect and extract the faces out of an image. Every face is then converted to a vector of 512 values. Then, the software calculates the shortest distance between two vectors in a chosen database, and the closest matches are typically the accurate recognition of a face.” 
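The matching step Wig describes—comparing 512-value face vectors by distance—can be sketched roughly as follows. This is an illustrative outline, not INNEFU's actual code: the `nearest_match` function, the Euclidean distance metric and the `threshold` value are all assumptions, and the embeddings are assumed to have already been extracted from images by a face-detection model.

```python
import numpy as np

def nearest_match(query, database, threshold=1.0):
    """Return (index, distance) of the closest enrolled face embedding.

    query:    (512,) vector for the probe face
    database: (N, 512) array of enrolled face vectors
    threshold: illustrative cut-off; a distance above it means
               no confident match (value is an assumption).
    """
    # Distance from the query to every enrolled vector
    dists = np.linalg.norm(database - query, axis=1)
    idx = int(np.argmin(dists))          # closest enrolled face
    if dists[idx] <= threshold:
        return idx, float(dists[idx])
    return None, float(dists[idx])       # nothing close enough

# Toy usage: a database of two enrolled faces
db = np.stack([np.zeros(512), np.ones(512)])
probe = np.zeros(512)                    # identical to entry 0
idx, dist = nearest_match(probe, db)
```

In this toy example the probe is identical to the first enrolled vector, so the shortest distance is zero and entry 0 is returned as the match.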

Worrying accuracy statistics

In 2018, before the Delhi High Court, Delhi police counsel Rahul Mehra had submitted that the accuracy of the facial recognition software—then used to trace women who went missing from an illegal placement agency—was only 2 per cent and "not good". In 2019, the Ministry of Women and Child Development (MWCD), represented by the central government's standing counsel Ripudaman Singh Bhardwaj, told the court that the then-current version of the FRS was so poor it sometimes matched pictures of missing boys with those of girls, with an accuracy of less than 1 per cent.

Questions of privacy

Advocacy organisations like the Internet Freedom Foundation (IFF) have questioned the legal authority under which this facial recognition system is being implemented. They claimed there was scant regard for the Supreme Court's nine-judge bench judgement in the case of K.S. Puttaswamy vs Union of India (2017), which reaffirmed the fundamental right to privacy.

The organisation said there was no legal basis for establishing the AFRS. "As per orders of the Delhi High Court in the case of Sadhan Haldar v The State NCT of Delhi, the court noticed the use of the AFRS only for the purpose of tracking and re-uniting missing children. Whereas these orders of the Hon'ble High Court do not constitute as any basis of legality for the AFRS since no directions are made to establish any AFRS," IFF had said in an earlier notice to the NCRB.

They had claimed the AFRS deployment would produce a mass surveillance system that lacks any legitimate state aim and violates the principles of proportionality and necessity articulated in the SC judgement on the fundamental right to privacy. The IFF stated in the notice: "The mass surveillance is conducted in two principal ways: (a) To populate the underlying data set on the basis of video surveillance in public spaces. Such databases are not limited by any legal framework and additional ones are being built to populate the data set; (b) Matching and deployment of the AFRS to the underlying data set in public spaces not restricted or defined in any manner whatsoever."
