The TikTok Case: What we need to know

Last month, the Madurai Bench of the Madras High Court passed an ex parte order banning, inter alia, downloads of TikTok, citing concerns of social morality, the risk of child pornography, and alleged suicides arising out of content hosted on the App. The order was challenged before the Supreme Court, where TikTok argued that it is a mere “intermediary” under Indian cyber-security law and therefore cannot be held liable for the actions of third parties on the platform.

The Supreme Court directed the High Court to consider the objections. The High Court subsequently heard TikTok’s arguments and lifted the ban on TikTok, subject to the condition that the platform should not be used to host obscene videos.

Analysis

The TikTok case highlights how important it is for an intermediary to be able to safely claim the “safe harbor” defence. TikTok was clearly an “intermediary” in terms of India’s Information Technology Act, 2000 (“IT Act”) and the Information Technology (Intermediaries Guidelines) Rules, 2011 (“Intermediary Rules”). This was not a contested point.

S. 79 of the IT Act grants a conditional safe harbor to an intermediary, i.e., an intermediary is not liable for any third-party content hosted or made available through it when:

  1. the function of the intermediary is limited to providing access to the system; or
  2. the intermediary does not initiate, select the receiver of or select/ modify the information contained in a transmission; and
  3. the intermediary observes due diligence and abides by other guidelines prescribed by the Government.

The Intermediary Rules set out a due diligence framework that an intermediary must observe to claim the safe harbor under S. 79. Key requirements of this diligence standard are:

  1. the intermediary must publish rules, regulations, user agreement and privacy policy for access or usage of its computer resource by the user [Rule 3(1)];
  2. the intermediary must inform its users not to transmit any information that is grossly harmful, harassing, obscene, pornographic, pedophilic, libelous or invasive of another’s privacy (“Prohibited Content”) [Rule 3(2)];
  3. the intermediary must “act within 36 hours” of receiving knowledge of the transmission of any prohibited information [Rule 3(4)][1];
  4. the intermediary must disable information that contravenes the Act and the Rules [Rule 3(5)];
  5. the intermediary must appoint a grievance officer to redress complaints of users within one month from the date of receipt of the complaint [Rule 3(11)].

This case, while high-profile, does not create new law in this evolving jurisprudence. It nevertheless remains relevant for the role of due diligence defences in claiming the intermediary exemption. It was TikTok’s demonstration of due diligence that made the Court comfortable extending to TikTok the benefit of being an intermediary, i.e., no liability for third-party content.

It is therefore worth assessing the cluster of mechanisms deployed by TikTok, as it was these which led the Court to deem the “due diligence” requirement under S. 79 of the IT Act satisfied. TikTok argued that the following constituted “industry standard solutions to address problems relating to pornography and illegal behaviour”:

  • User-accepted “community guidelines” that educate users not to post, share or promote any of the following: (i) harmful or dangerous content; (ii) graphic or shocking content; (iii) discrimination or hate speech; (iv) nudity or sexual activity; (v) child safety infringement; (vi) harassment or cyber-bullying; (vii) impersonation, spam, or other misleading content; (viii) intellectual property and workplace content; (ix) other malicious activity such as viruses, etc.;
  • An in-app report-content feature, through which users can instantly report any objectionable content and have it taken down; the average response time is 15 minutes, though the law provides a window of 36 hours;
  • A grievance officer located in India, duly appointed as per the IT Act, to handle complaints or other issues faced by a user of the App, even if the user is not logged in to the app;
  • A private, dedicated channel for local government inquiries, which prioritises content take-down requests from government authorities, including law enforcement, and expeditiously removes content that violates its community guidelines or local laws;
  • A highly efficient and trained content moderation team that is also located in India with proficiency in 16 Indian languages, which exercises control over content on the platform in line with international best practices, policies and local laws;
  • A safety centre in ten Indian languages, to ensure that the resources and guidelines are as inclusive as possible. These safety centres guide users, especially parents, through the app;
  • Terms of use, privacy statement, community guidelines and appropriate policies that are publicized to all users with regard to conduct on and usage of the platform;
  • Industry-standard automated tools that can detect pornographic content that might be posted and immediately remove it;
  • A number of measures in place to protect users from misuse. These include:
    1. Giving users the choice to make their account private so they can restrict content to approved followers only.
    2. Giving users the choice to block other users.
    3. Giving users the choice to filter comments by keywords.
    4. Blocking a number of potentially problematic terms from search and discovery.
    5. Disabling the ability to receive private messages from other users.
    6. Making an individual video post private.
    7. Shutting down the React/Duet functions for a given video (the React/Duet functions are in-app features that allow users to create new videos based on existing videos on TikTok).
    8. Disabling the downloading of a video by other users.
    9. Pop-up warnings for high-risk content.
  • Specific measures to curb the use of the platform by children, including an age gate that permits only teenagers and above to access the platform.
  • Password protected parental controls and restricted mode to enable parents and guardians to exercise supervision over the use of the app; and
  • Advanced privacy features that allow users to, inter alia,
    1. control who they interact with on TikTok;
    2. decide the visibility of their specific videos before and after posting of those videos; and
    3. control or disable comments on their videos, both before and after uploading.

It is useful to recall that the “knowledge” requirement under S. 79 of the IT Act has been raised to a standard under which an intermediary is deemed to have actual knowledge of content only on the basis of an order of a court or the government. This doctrine was established by the Supreme Court in Shreya Singhal[2] and marked a notable shift from the position of law at that time.

It is relevant to note that the much-publicized Christian Louboutin case[3] is of limited relevance to intermediaries, because the court there held that the defendant was an online marketplace and not an intermediary.

Conclusion

Two broad requirements have to be fulfilled to claim intermediary exemption:

  1. An entity has to be an intermediary to begin with; and
  2. The conditional exceptions, i.e. the safe harbor requirements under S. 79, must be met.

TikTok is a vivid example of the importance of implementing measures that help establish the due diligence defence under the IT Act.

[1] The constitutionality of Rules 3(2) and 3(4) has been challenged in Mouthshut.com v. Union of India, W.P. (C) No. 217 (2013) (Supreme Court of India)

[2] Shreya Singhal v. Union of India, W.P. (Cri.) No. 167 of 2012 (Supreme Court of India)

[3] Christian Louboutin SAS v. Nakul Bajaj & Ors., CS (Comm) 344/2018 (Delhi High Court)