Appealing Moderation
The draft amendments to the Information Technology Rules, 2021, will require intermediaries to align community standards with Indian law and create a Grievance Appellate Committee for “problematic content.” Critics view this as a tool of government censorship, while others see a need for balance between government control and private enterprise.
This article was first published in The Mint.
Last week the Government of India released a set of draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. Once the amendments are enacted, intermediaries will have to ensure that the community standards to which they hold their users answerable comply with Indian law and its constitutional principles. This, the government clarified, has become necessary because a number of intermediaries have taken it upon themselves to act in violation of the rights of Indian citizens.
The new amendments also propose the constitution of a Grievance Appellate Committee that will be tasked with dealing with “problematic content” in an expeditious manner. Users unsatisfied with how their complaint to an intermediary has been handled will now be able to appeal the decision to this body. And have it resolved within 30 days:
It is proposed to create an appellate body called ‘Grievance Appellate Committee’ under rule 3(3) of the IT Rules 2021 by invoking section 79 of the IT Act having regard to additional guidelines as may be prescribed by the Central Government. Users will have the option to appeal against the grievance redressal process of the intermediaries before this new appellate body. The Committee will endeavor to address the user’s appeal within a period of 30 days. This is made necessary because currently there is no appellate mechanism provided by intermediaries nor is there any credible self-regulatory mechanism in place.
The proposal has been met with varying degrees of consternation. Various newspaper articles have called this yet another attempt by the government to curtail free speech. The remark that some intermediaries have acted in violation of the rights of Indian citizens is being read as a snide reference to instances when intermediaries refused to take down content that did not violate their community guidelines - despite pressure from the government to do so. What the government sees as an escalation mechanism to provide redress to users against unfair decisions of the social media platforms they subscribe to, civil society views as just another tool of government censorship.
Both sides are right. And a wee bit wrong.
A Wicked Problem
Content moderation is a wicked problem. It calls for every piece of suspicious content to be evaluated against a number of different legal standards - authorship, the harm it might cause to a person’s reputation, the legality of that content given the age of its intended audience, and many more. As much as social media platforms are designed to enable free speech, they must also eliminate - or at least mitigate - the harms that could arise from unfettered speech. They need to strike a balance between the rights of those who post and the rights of those they will offend.
These are the sorts of decisions that content moderators have to make. They have to decide what stays up and what must be taken down. Where to draw the line between speech that is acceptable and speech that is not. Most often the issues are so clear-cut that even lightly trained content moderators will get it right. Every now and then, even the most experienced among their ranks will not.
Some of the decisions they have to make in the course of a day’s work are so gnarly that even the finest legal minds would be left scratching their heads: are disparaging remarks posted about an individual defamatory as alleged - do they malign his character with falsehood or are they in fact based in truth; is this remix of an existing song original enough to qualify as a novel work or does it need a license from the copyright holder before it can be posted; is a given statement expressing angst about a decision just normal human frustration or an attempt to foment a violent agitation?
Grievance Appellate Committee
Civil society is concerned that if the proposed Grievance Appellate Committee allows the government the last word on questions like this last one, it will use that power to quell dissent.
As much as I share this concern, I have similar reservations about leaving all these decisions entirely in the hands of private enterprise.
After all, not all appeals to the Grievance Appellate Committee will be about government take-downs. Some will address illegal content - like violations of copyright. What if an artist’s original composition is marked for take-down over some imagined infringement, and the decision is not reversed even on appeal? For most up-and-coming artists, the only path to commercial success lies in being able to impress the audiences that these social networks have to offer. If they are forced, without recourse, to take their content down, that could be the end of their careers.
That said, it is impossible to ignore the concern that civil society raises. When an appeal is preferred by a government agency whose take-down notice has been rebuffed, is it not likely that a government-appointed appellate committee will find in favour of its own agency?
Self-Regulatory Body
How do we guard against this eventuality? And still preserve the right to appeal?
One solution could be for the industry to establish a self-regulatory appellate body to which appeals from all content moderation decisions can be preferred. It could be staffed with a cross-section of experts from industry and the law so that its decisions are sufficiently robust - informed both by industry context and by applicable laws and judicial precedent.
Ideally, this body should operate as an appellate forum for all content moderation decisions, regardless of the platform from which the appeal originates. This will take it outside the reporting hierarchy of the platforms themselves, offering the process a measure of independence that is absent in internal grievance redressal systems. Since it is not operated by the government, it will, hopefully, have the neutrality required to remain impartial while deciding on take-down notices issued by the government.
The government has already indicated that it is open to considering self-regulatory alternatives. Now the industry has to get the design right.