Complexity will be the privacy law’s undoing
The Justice Srikrishna Committee’s report on data protection proposes a user-centric framework, emphasizing data portability and privacy by design. However, its approach to consent, which applies product liability principles and creates a complex, multilayered consent framework, may prove impractical and burdensome for businesses, particularly startups. These measures, while intended to enhance privacy, could introduce additional friction for users and businesses alike, potentially exacerbating consent fatigue.
This article was first published in The Mint.
The Justice Srikrishna Committee report is finally out and initial reactions are mixed. The introductory chapter promised a modern data-protection regime that would address the challenges of a data-rich world and deliver a personal data protection law that ensures personal privacy and autonomy while still allowing for data flows and the creation of a free and fair digital economy. Having spent most of the weekend poring over the report and the accompanying draft Bill, I am sorry to say it flatters to deceive.
The proposed privacy framework suggests the creation of a user-centric design, recasting the relationship between the data subject and the controller as one of a data principal with its fiduciary. This design choice of making the data subject the central focus of the legislation has my wholehearted endorsement. I am particularly glad to see the right to data portability that will allow users access to their personal data across silos. Similarly, the principles of “privacy by design” will force organizations to think about incorporating privacy from the ground up.
While the committee has acknowledged that the preponderance of evidence indicates that the operation of notice and consent on the internet is broken, it has explicitly declined to adopt my solution of putting in place an accountability framework to address this. Instead, the committee has chosen to fix how consent works by articulating a modified framework for operationalizing it. On the face of it, I have no argument with this approach. It is, after all, the way much of the world addresses privacy concerns. However, given the manner in which it has chosen to implement this framework, I believe the committee might have made things worse.
The committee has taken the offbeat approach of applying product liability principles to data-protection law, treating the consent form as a product. It believes that this will enable holding the data fiduciary liable for a failure to adhere to the notice terms, as well as for a notice whose form is inconsistent with the data protection law. The data-protection authority is supposed to issue model consent forms that fiduciaries can use as templates. But given the range and diversity of data collection that takes place, I can’t imagine how any such set of templates can be comprehensive.
In addition, the committee intends to put in place a data trust score framework that will apply to all significant data fiduciaries. It is also looking to implement a dynamic consent-renewal obligation, forcing data principals to periodically refresh any consent they have previously provided. There is also talk of reducing consent fatigue by deploying consent dashboards that will frequently ping us with short, easy-to-understand and just-in-time notices.
As innovative as all of this sounds, the practical and technical challenges of implementing a multilayered consent framework are non-trivial. Businesses will have to entirely rejig their processes to accommodate these new requirements. As much as we might say that this is the necessary cost of privacy, the burden of compliance will hit early-stage businesses the hardest and could well have a chilling effect on innovation. I am, as much as the next man, in favour of greater safeguards on user privacy. However, the committee’s hydra-headed approach to operationalizing consent will demand so much from the average user that its benefit is unlikely to accrue to any but the most privacy-aware among us.
In addition, businesses will have to grapple with new data localization obligations that require a live mirror of personal data to be stored on servers in India. This means that the digital economy, which is at present free to leverage the global cloud as it chooses, will be forced to maintain servers in India to comply with this requirement. Apart from the technical challenge of building live mirrors at internet scale, maintaining additional redundant layers of infrastructure is an unnecessary expense.
The committee has also proposed separate regulations to deal with the treatment of children’s personal data. Websites and online services directed at children below the age of 18 that have been notified by the authority cannot profile or track them, monitor their behaviour or target advertising at them. Given the number of teenagers who spend much of their time online, this obligation could well affect the design of most internet businesses, all of which will have to implement age verification so that they can appropriately switch off the profiling that is central to their business models. While I have no issue with applying stringent measures to protect the very young, using the age of majority as a cutoff is lazy and unreflective of the reality of internet use.
As much as businesses will struggle to adjust to the new framework, it is we, its users, who will have to deal with the additional friction that this introduces into our online lives. The law requires that consent be free, informed, specific and clear. Service providers will, as a result, no longer allow us to simply accept new terms of service. They will instead insist on interspersing additional steps in the process to demonstrate that they have our full and informed consent. These measures will likely perpetuate the very fatigue that they were designed to obviate.