Protecting people rather than data with Data Protection

An article co-authored by Mishi Choudhary and Eben Moglen (Chairman, Software Freedom Law Center, New York) was published on ET Tech on January 29, 2018. In the context of the Government’s ongoing public consultation on data protection, the article argues that the fundamental premise of any data protection framework must be that people have control over their own data.
A committee of experts appointed by the Ministry of Electronics and Information Technology (MeitY) is discussing a data protection framework, even as the Supreme Court is discussing Aadhaar and the entire world is discussing Facebook’s admission that it doesn’t know whether social networking has good or bad overall effects on democracy.
The constant flow of news about data breaches, whether at Equifax or UIDAI, is normalizing massive losses of personal data.
Services we all rely upon because they are so “convenient” are imposing on us the overwhelming inconveniences of constant surveillance, identity theft, and other cybercrime, and the constant dread of a system we know can hurt us in ways that we do not understand. This is not the 21st century digital society we wanted to live in.
What we call “data protection” law must be our guarantee of digital safety against mass accidents and destruction of individual and social welfare. But even the name we give to the subject helps to deceive us: we are trying to protect people, not data.
To work correctly, this form of law must state its objectives properly, regulate effectively to achieve those objectives, and measure and apportion liability swiftly when harm occurs.
The objectives of data protection legislation must be described in terms of people, not data. All persons are entitled to control the collection of information about them: about their bodies, their behavior, and their thoughts.
This is a conclusion reinforced by the Supreme Court’s judgment that we have a constitutional right to privacy. But personal information is information about our relationships as well, so information collected about us also affects our families, our friends, and our communities.
This is why “consent” is not a sufficient basis for determining the responsibilities to protect data about people. If I consent to allow a platform company to read all my email in return for giving me free email service, I am also giving away information about the people I correspond with, without regard to their wishes.
Even if I think the chance of harm to me from loss of my information by the platform company is a risk worth taking in order to have free email service, that may not be acceptable or appropriate for my children or my friends. Nor can I possibly know how future events in my life, or changes in technology or society, might affect my understanding of my safety or my best interests. Today’s easy consent might very well be tomorrow’s disastrous mistake.
So, the law we need is not about getting, managing, or automating consent. The objective is not consent, but control. People should be able to control access to information about them, as the DigiLocker allows citizens to control access to certain government-related documents.
Rules requiring parties processing data about individuals to return to the individuals what their processing has produced (“copyleft” rules for personal data) also further the objective of control.
The purpose of data legislation is not to “unleash innovation” or to subsidize start-ups with favorable legal rules. Because the objective is safety and control, the regulations the law should advance will affect technology by requiring what data subjects, the people, need to be safe and in control.
Regulations should require that people *know* who is requesting data about them, not that they should consent not to know and let everything operate in the dark.
Logging of requests for personal data should produce logs that individuals can access and analyze with publicly available tools, so that they can understand who is seeking information about them, for what purpose, and under what safeguards, in real time.
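As an illustration only, and not anything the article prescribes, here is a minimal sketch of what analyzing such an access log with freely available software could look like. It assumes a hypothetical JSON-lines log in which each request for a person’s data records a requester, a stated purpose, and a claimed safeguard; the field names, the file name my_data_access.log, and the helper summarize_access_log are inventions for this sketch.

```python
# Illustration only. The article prescribes no log format; the JSON-lines
# layout, field names, and file name below are assumptions for this sketch.
import json
from collections import Counter

def summarize_access_log(path):
    """Tally who requested data about the log's subject, why, and under what safeguards."""
    requesters, purposes, safeguards = Counter(), Counter(), Counter()
    with open(path, encoding="utf-8") as log:
        for line in log:
            line = line.strip()
            if not line:
                continue
            # One request per line, e.g. {"requester": ..., "purpose": ..., "safeguard": ...}
            entry = json.loads(line)
            requesters[entry.get("requester", "unknown")] += 1
            purposes[entry.get("purpose", "unspecified")] += 1
            safeguards[entry.get("safeguard", "none stated")] += 1
    return requesters, purposes, safeguards

if __name__ == "__main__":
    requesters, purposes, safeguards = summarize_access_log("my_data_access.log")
    for title, counts in [("Requests by party", requesters),
                          ("Requests by stated purpose", purposes),
                          ("Safeguards claimed", safeguards)]:
        print(f"{title}:")
        for key, count in counts.most_common():
            print(f"  {key}: {count}")
```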
Accountability built into the system will enable us to teach students, workers, and citizens about safe data practices, allowing them to build skills in managing data while increasing their own data safety and the safety of their families.
In addition to regulations requiring accountability, so that people can measure their own safety and control their own data, appropriate data safety law should provide rapid recourse for people when their safety or welfare is violated by data breaches.
Parties engaged in the large-scale processing of personal data should post performance bonds with a public regulator, so that citizens who have probably been harmed by those parties’ failure to meet the law’s requirements can obtain recourse through that regulator, avoiding multiple years of litigation.
By defining the objectives of data safety law in terms of people, not data; by replacing the concept of “consent” with socially determined general standards of care; by regulating technologies in the market to assure safety and personal control “designed in,” rather than “bolted on”; by providing rapid and effective recourse to compensate persons suffering actual harm – in these ways, Indian data safety law can meet our actual, human needs.
We want to live in Digital India. Just not the digital India we are living in at the moment.
(Mishi Choudhary is the Managing Partner of the legal policy firm Mishi Choudhary & Associates. Eben Moglen is a professor of law and legal history at Columbia University, and is the founder, Director-Counsel and Chairman of the Software Freedom Law Center. Views expressed above are their own.)