Policy Watch
Data Dictatorship Disguised as Data 'Protection' Bill

IN the perfectly justified outrage over the Citizenship (Amendment) Act, which was passed by both houses of Parliament in December 2019, another upcoming piece of legislation has escaped the attention it deserves. In December 2019, the Ministry of Electronics and Information Technology (MEITY) tabled the Personal Data Protection Bill (DPB), and this bill is currently being scrutinized by a Joint Parliamentary Committee.

The nomenclature of this legislation suggests that it seeks to ‘protect’ data that is generated by and belongs to individual persons. In a sense, this legislation is meant to be the (much needed) Indian version of globally accepted legal safeguards to ensure individual privacy. The European Union, for instance, has the General Data Protection Regulation (GDPR). The DPB, along with its counterparts in countries across the world, exists at a moment in history when data (including very personal data) has emerged as one of the most highly monetized and coveted ‘goods’ in the market, even as this data remains one of the main ways through which governments control and manage their citizens. We will discuss the ways in which personal data lends itself to wide-ranging social engineering later on in this piece; let us first understand the various provisions of the DPB.

Trajectory of the DPB

As a piece of legislation meant to protect personal data (from both businesses as well as governments), the DPB’s self-defined central task is fairly well articulated. It has to reach a balance between three important functions and needs – the need to foster trade and industry in an increasingly data-driven world, the need to ensure that the State can meet its responsibility towards providing welfare and goods for its citizens, and the need to ensure civil personal rights. We shall see that the DPB, in its present form, has taken great care not to allow for substantive provisions to ensure individual rights.

Debates around this Bill began in July 2017, when the MEITY set up a committee to study issues related to data protection. This committee, chaired by retired Supreme Court judge Justice B. N. Srikrishna, submitted the draft Personal Data Protection Bill, 2018 in July 2018. Several provisions of this draft Bill were contested by a range of actors, including civil society bodies, protection of privacy activists and companies in the IT sector. The government received several depositions, and has since reworked some of the provisions of the original draft. What we have now is the Personal Data Protection Bill 2019 (which I shall refer to as DPB 2019 in this piece). What we need to understand at this point is that the DPB 2019 has actually diluted some of the crucial data protection and privacy provisions which were included in the draft DPB 2018. Essentially, the government has done two things in DPB 2019: it has given itself even more power to intrude on and utilize our personal data, and it has provided some leeway to IT companies thus blunting the sharp dissent from the IT sector.

Key Concerns and How the DPB 2019 Addresses Them

(Mis)Use of Personal Data

The primary concern around data usage has been to restrict the collection and processing of personal data. This concern is all the more important at a time when modern technology has created a data dystopia of sorts. As cyber-intelligence analyst Pukhraj Singh argues, we are currently in a bizarre situation where ownership, possession and control of data do not necessarily overlap (https://www.medianama.com/2018/07/223-data-protection-bill-comment/). The DPB 2019 states that the processing (collecting, recording, analysis and disclosure) of personal data should be done only for “clear, specific and lawful” purposes. The problem lies in the very definition of what is clear, specific and lawful. What qualifies as ‘necessary’ data is a matter which is wide open to interpretation. As of now, the DPB envisages an authority, the Data Protection Authority (DPA), which will be tasked with clearing the air about the definition of ‘necessary’ data. Chances are, unfortunately, that this definition will be kept loose and vague so as to allow for flagrant violations of privacy by public and private actors.

Privacy provisions in DPB 2018 have already been diluted in DPB 2019; an individual has literally no right to ask for a reasonable explanation for decisions/acts that are committed on the basis of data collected from her. There is a ‘right to erasure’ – a right to demand that specific data generated by an individual be forgotten/erased from the records after the purpose for which it was collected has been served. There is also a provision which allows an individual to ask for ‘correction’ of data related to her. However, the efficacy of this right will be tested only when the regulations for DPB 2019 are notified. The devil, after all, is in the details and the fine print of the final regulations drafted by the DPA. The protection against data breaches is also quite weak. In the case of data breaches (such as the leak of Aadhaar data to unauthorized users), the individual whose data has been exposed need not even be warned. Only the DPA will receive this information.

Overarching Powers of the State

The DPB 2019 says that personal data can be processed by the State without consent, if it is necessary for the State to perform any of its “functions”. In other words, this allows various agencies of the State sweeping powers of access over our personal data. The State can collect any data it wants, and can even deny rights if this data is not made available to it. Moreover, in its latest avatar, the DPB 2019 gives the State the right to exempt any of its agencies from the ambit of the Act, in the course of performing its duties and protecting ‘national security’. This problematic exemption can, for instance, be provided by the government in the name of “interests of prevention of crime”. This will lead to a regime of anticipatory surveillance, where data can be collected and processed without having to cite a specific investigation (criminal or otherwise). The requisition and use of WhatsApp data to profile supposed “stone pelters” in Kashmir is a case in point here. Anything and everything can be justified as a “preventive” measure, such as membership of an organization or participation in an event. The weak provisions under DPB 2019 also make it possible for governments to use the data collected from private citizens for their own political purposes, to create profiles of communities, areas, regions, individuals etc.

Data Localization

Where can personal data be collected, processed or used? The DPB 2018 had recommended that all personal data – including data which is generated, processed or used in India – would need to be stored in servers located in India. Also, DPB 2018 had said that ‘critical’ personal data would only be processed in India. Moreover, the processing and use of ‘sensitive’ personal data such as passwords, financial data, sexual orientation, biometric data, religion or caste would require explicit consent based on the data processor declaring the purpose of processing. The DPB 2018 argued that Indian authorities should have control over data that is managed inside the country, especially when it comes to crucial data; Indian authorities should not have to ask for (and possibly be denied) access to data from, say, businesses such as Facebook located in the US or Europe. For businesses whose operations depend on processing personal data, this was a veritable economic blow, since huge amounts of investment would now be required to keep copies of all the data they use on Indian servers. This clause would effectively mean that smaller companies and startups would be pushed out of the market, and even large companies would struggle to maintain current levels of profitability. Business interests lobbied hard with the government, and it looks like they have extracted some concessions. According to the DPB 2019, only ‘critical’ and ‘sensitive’ data now needs to be stored in Indian servers, and only such data would be subject to stringent provisions of explicit consent and the like. As of now, the definition of ‘critical’ data has been left wide open, and it is up to the Data Protection Authority (DPA) to define it.

An All-powerful Data Protection Authority (DPA)?

The DPB 2019, as it creates a Data Protection Authority (DPA), allocates a whopping 40+ crucial jobs to this authority. The DPA now has multiple tasks: settling disputes over data protection and leakages, framing rules and regulations (many of which have been mentioned in this piece), advising the government as well as other data processors on data protection matters, and executing the regulations it drafts. There have been two central concerns regarding the DPA – the first concerning its composition and the second concerning its technical competence. If the DPA is to perform its role well, the first concern is crucial. Structures have to be put in place to ensure that the DPA is independent of government and business interference. This should have been a given, a basic minimum condition. However, the DPB 2019 has further diluted the independence of the DPA compared to the provisions in the DPB 2018, even as it has vastly expanded the DPA’s powers and scope. The structure and composition of the DPA now have to get the green signal from the government, in essence curtailing its independence. We are likely to see the DPA becoming an extension of the government, with no teeth to oppose flagrant violations of privacy.

What we have now is a draft legislation which will place huge powers in the hands of government bodies, even as it will not be able to control non-State actors effectively. It does not, for instance, ensure accountability for intelligence agencies, nor does it prevent the use of illegally obtained ‘evidence’. Coming close on the heels of the Cambridge Analytica scandal, the DPB was an opportunity to put some safeguards in place. It now seems like a lost opportunity.

The Chinese Model: Data Dystopia and Surveillance State as the New Normal?

The DPB, as we can see, has the potential to allow an extensive programme of mass, state-sponsored profiling and surveillance. We do not have to look too far, in fact, to realize that this could lend itself to a programme of social engineering of vast proportions. The Chinese ‘Social Credit’ system has some lessons for us here. Developed by the Chinese government, this system subjects every single citizen to elaborate surveillance by the State. At the end, each citizen is awarded a ‘score’, which then becomes (or does not become) the citizen’s ticket to availing of a variety of rights and services. In other words, the social credit system is the government’s way of punishing certain kinds of behavior and rewarding others. Nothing could possibly be more authoritarian, more symptomatic of the kind of ‘Big Brother Is Watching You’ dystopia imagined by George Orwell in the novel 1984.

In theory, the social credit system tracks and evaluates citizens for ‘trustworthiness’. It uses facial recognition systems and big data analytics for this purpose. And how is ‘trustworthiness’ mapped? One could receive negative ratings for a range of (state-defined) deviations: playing loud music, eating inside metros, violating traffic rules, making reservations at restaurants or hotels but not showing up, fraudulently using other people's metro cards, etc. Lists of ‘good’ behavior have also been made; donating blood and volunteering for community services can earn positive points.

Clearly, the State can (and will) use this mechanism to punish any dissent, any contrary opinion, any public display of protest against itself. In June 2019, some 6 million Chinese citizens were put on a blacklist and denied train and bus tickets for being “untrustworthy”. Blacklisted citizens are publicly named and shamed and deliberately exposed to mass punishment from prospective employers, neighbours and the like. There are reports that even children have been punished under this regime of mass control. Unless democratic opinion in India forces the government to backtrack on the DPB and instead work on a better, more substantive model of data protection, we are in danger of following the Chinese route of social control.

Liberation Archive