Analysis of the Facial Recognition Technology-enabled Surveillance Landscape in India

In the previous article of this two-part series, we conducted a cross-sectoral mapping exercise of how facial recognition technology (FRT) is being deployed in public services across India. Its vast and pervasive deployment across States, industries and sectors, and for varying purposes, is a testament to its versatility and effectiveness. However, the absence of any regulation governing FRT, or even AI more broadly, creates murky territory, as it raises several risks and hazards. To understand the issues arising out of FRT-enabled state surveillance, it is important to assess the three layers where regulatory intervention is most urgently needed: the development of the software and hardware, its implementation and deployment, and the existing legislation and regulation.

Development of the Software and Hardware

Since such deployment is effectuated through a tendering process, private parties play a significant role in the development of ethical technology. Despite being commissioned by government agencies and departments, private parties bear a moral and ethical responsibility to develop technology that is not inherently dangerous or susceptible to misuse. In addition, they must ensure that the technology is the least intrusive available and adopts maximum security measures and safeguards. For example, in several of the instances of deployment discussed in the previous article, internet protocol (IP) cameras have been installed for public mass surveillance. Despite being faster, more efficient and wireless, IP cameras are more susceptible to unauthorized break-ins and hacking, as they are connected to open networks rather than a closed circuit. Several incidents and reports across the world have shown that IP cameras are surprisingly easy to hack, with those installed in public spaces being generally unsecured and exposed to multiple vulnerabilities. [1] However, the bargaining power and control exercised by private parties in the tendering process is minimal, resting the authority, liability and responsibility solely with the State.

Further, in the development phase, principles of privacy and data protection, such as privacy by design, must be embedded in the technology's infrastructure and data architecture. Clear guidelines determining the storage, retention and transfer of data must be laid out. Since the technology and intellectual property behind FRT-enabled CCTV cameras are licensed by private parties, few to no limitations restrict them from storing or retaining this data; the data can even be retrieved and traced back by those private parties. Where the cameras are sourced from abroad, as with KSR Bangalore Station sourcing its hardware from Belgium, would the foreign supplier have access to this data? What would be the implications of such cross-border data transfer in the absence of any regulation or data protection regime? Sadly, the opacity of these processes raises mounting questions of necessity, proportionality and legal basis.

Implementation and Deployment

During the stage of implementation and deployment, the bidding and supporting documents of the tenders unfortunately cite arbitrary accuracy rates, with no acknowledgment of misidentification through false positives or false negatives. The assessment in the Indian landscape overlooks the technical limitations of such systems, which lead to inaccurate results. It also evades any mention of possible AI bias and of the measures or recourses the State may adopt to counter it.

Further, no technical compliance or regulatory standards have been imposed on the State to conduct training and testing before deployment. Nor is there any requirement for an ethical analysis of the Automated Facial Recognition System's impact on minority and vulnerable communities.

Moreover, there is no uniform definition of the term 'criminal' across the numerous Central and State law enforcement agencies working on crime detection at various levels. In certain situations, it is stated that the term would apply only to previous offenders and history-sheeters, whereas in others it would extend to warranted criminals, pre-trial accused persons and even alleged offenders. Such discretionary criteria raise concerns about the arbitrary and discriminatory application of FRT systems.

Legislation and Regulations

Recently, the Digital Personal Data Protection Act 2023 was enacted to govern the collection, handling, processing and transfer of personal data. However, its application to state-enabled FRT systems would be murky, given the vast exemptions provided to the State and government entities. Under Section 17(1)(c) of the Act, if the processing of personal data is in the interest of prevention, detection, investigation or prosecution of any offence, most obligations of a data fiduciary and rights of a data principal stand suspended. Going further, under Section 17(2)(a), if personal data is processed by an instrumentality of the State notified by the Central Government in the interests of the sovereignty and integrity of India, security of the State, friendly relations with foreign States, maintenance of public order or preventing incitement to any cognizable offence, the entire Act is inapplicable.

Hence, in the absence of legal recourse under the Digital Personal Data Protection Act 2023, the sole legal authority and direction would be provided by the judgment in Justice K.S. Puttaswamy and Anr. v. Union of India [AIR 2017 SC 4161] and by the safeguards under the Information Technology Act 2000. Under the Puttaswamy principles, each of these projects must meet the tests of necessity, proportionality and least restrictive use. Beyond this, state surveillance mechanisms would evade the application of the existing Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, as those Rules apply solely to body corporates.

The absence of internal guidelines, policies or self-regulatory standards on collecting, processing, handling, storing and transferring sensitive personal data in these multi-crore FRT projects gives State bodies and agencies free rein to build and monetize mass datasets of India's 1.4 billion citizens. The risks such systems pose to privacy, civil liberties and human rights are, sadly, in no way mitigated by their pervasive mass deployment.


The exploration of facial recognition technology in India reveals a landscape where technological progress confronts critical concerns regarding privacy, civil liberties and the potential for misuse. The extensive adoption of FRT across diverse sectors, as outlined in this series, underscores the urgent need for regulatory frameworks that keep pace with technological advancement. Its integration into law enforcement for crime detection, its deployment in smart city initiatives, and its use in contactless transportation highlight its multifaceted applications alongside equally prevalent risks and challenges. Paramount among these is the overarching issue of privacy infringement, compounded by the absence of robust legislation to safeguard individuals from unwarranted surveillance and data misuse.

The absence of specific regulations governing FRT leaves a void in accountability, transparency and oversight, posing a significant challenge to ethical implementation, particularly given the potential biases and inaccuracies in FRT that disproportionately impact marginalized communities. Data security also emerges as a critical concern, with the potential for unauthorized access, data breaches and unregulated storage of sensitive personal information heightening the risks associated with FRT.

A crucial policy and legal analysis reveals the limitations of the recently enacted Digital Personal Data Protection Act 2023, as it grants broad exemptions to the state, limiting its effectiveness in addressing FRT-related concerns. Judicial oversight, primarily anchored in the Puttaswamy judgment, underscores the necessity for rigorous scrutiny to ensure that FRT projects adhere to the principles of necessity, proportionality, and the least restrictive use. Moving forward, India must enact comprehensive legislation, engage in transparent public consultations, and prioritize ethical standards to strike a balance between technological innovation and individual rights. International collaboration and the incorporation of best practices will further aid in developing a balanced and effective regulatory framework for FRT in India. In essence, navigating the evolving landscape of state surveillance responsibly requires proactive policymaking, ongoing scrutiny, and a commitment to ethical and legal considerations.
