The Copyright (Amendment) Act (2019) (the Act) introduced a host of reforms to the national copyright regime. One of the more notable changes was the introduction of a Notice and Takedown (NTD) procedure through which Internet Service Providers (ISPs) can obtain safe harbour upon compliance. This piece follows up on the discussion introduced in an earlier post on the digital rights implications of the NTD structure of the Act.

This piece carries on from the conclusion of the earlier blog by analyzing the implications of ISPs' use of automation in the discharge of their statutory duties. The discussion will centre on the growing use of automated tools and processes and their consequent impact on the copyright landscape.


As Penney declares, the age of automated legal enforcement is at hand.[1] Technological progress opens a broad range of possibilities for automated tools to make legal processes more efficient and effective. Tools such as predictive policing, cognitive computing and AI-based surveillance have been reported to significantly improve the efficiency of enforcement tasks.

Copyright enforcement has traditionally been an arduous task. Logistical, financial and practical constraints mean that an individual rights holder often faces numerous challenges in attempting to protect his/her original work from being copied, reproduced or otherwise distributed without his/her consent. This difficulty was exacerbated by the dawn of the digital age, which raised infringing capabilities significantly. The use of automated tools has been touted as a panacea to these issues.

Ordinarily, in a case of copyright infringement, a rights holder must prove:

  1. That he/she is the valid owner of the work and has authority to bring a suit;
  2. That the Defendant reproduced, adapted, distributed, leased or broadcast the copyrighted work without the owner’s authorization;
  3. That the Defendant’s actions do not fall under the defence of fair dealing.

While this process is relatively straightforward for offline infringement, certain elements of cyberspace introduce difficulties not only for rights holders but also for enforcement procedures generally.[2] In this regard, algorithmic processes have the potential to transform the task of copyright administration and enforcement. It is far more efficient for ISPs to embed these tools into their operations than to hire a dedicated workforce to deal with takedown notices. These tools can also be more effective than human decision-making in certain instances due to their capacity to ensure a degree of consistency in responses to notices. Additionally, their ability to respond to infringement in real time, coupled with the immense volumes of takedown requests that platforms receive, makes it increasingly difficult for human actors to play this role. To some, automation may be an inevitable course of action to address the intractable challenges of the digital environment.[3]

As a result, the use of automated takedown systems has proliferated in recent times. The popularity of these systems arose from the difficulties posed by the architecture of the internet. The cloak of anonymity offered by cyberspace and the general ease of access to protected works have created suitable conditions for rampant online piracy. In response, copyright owners and ISPs have turned to automation in a bid to curb the vast amounts of infringement, in the form of automated notice and takedown systems.


There are two main forms of automated systems. The first is used by copyright holders, particularly large film or music companies, who develop these tools to scan the internet for any snippets of their copyrighted material. Once the system identifies content it deems to be copyrighted, it will either inform the rights holder of the positive match or, more frequently, issue an automated takedown request directly to the host site.[4]

The second form of automation, usually developed in response to the first, comes from ISPs. Certain ISPs, particularly those hosting user-generated content, have sought to prevent conflicts with rights holders and to implement NTD procedures by developing Content Recognition Technology. These tools carry out an analysis or screening of content uploaded by users to determine whether it contains any possibly infringing material.[5] If uploaded content is deemed by the system to contain material subject to copyright, it executes its pre-programmed instructions, usually to take down the content.[6] These ‘robotic’ takedown procedures have become the standard for ISPs dealing primarily with user-generated content. Due to the algorithmic nature of their operations, such systems are entirely automated; in most instances, there is no human actor reviewing the content at any stage to verify the substance of the takedown notice.[7]
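The screening logic described above can be reduced, in very simplified form, to a pipeline that fingerprints an upload, compares it against a database of protected works, and executes a pre-programmed action when a match is found. The sketch below is an illustrative assumption, not any ISP's actual implementation: real systems fingerprint audio and video features with proprietary algorithms, and the names (`fingerprint`, `screen_upload`) and the 0.3 threshold are invented for the example.

```python
import hashlib

def fingerprint(content: str, k: int = 4) -> set:
    """Hash overlapping k-word 'shingles' of the content.

    A crude text stand-in for the audio/video fingerprints real
    content recognition systems compute."""
    words = content.lower().split()
    return {
        hashlib.sha1(" ".join(words[i:i + k]).encode()).hexdigest()
        for i in range(len(words) - k + 1)
    }

def similarity(upload_fp: set, reference_fp: set) -> float:
    """Jaccard similarity between two fingerprint sets."""
    if not upload_fp or not reference_fp:
        return 0.0
    return len(upload_fp & reference_fp) / len(upload_fp | reference_fp)

THRESHOLD = 0.3  # hypothetical; real thresholds are undisclosed

def screen_upload(upload_text: str, reference_db: dict) -> str:
    """Return the pre-programmed action for an upload.

    Note that no human reviews the decision at any stage."""
    fp = fingerprint(upload_text)
    for work, ref_fp in reference_db.items():
        if similarity(fp, ref_fp) >= THRESHOLD:
            return f"TAKEDOWN (matched '{work}')"
    return "ALLOW"
```

The key design feature, for present purposes, is that the only inputs are the upload and the reference database: the system's output is fully determined by content similarity.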

Such systems usually operate in reverse of the sequence mandated by law and constitute a proactive approach to notice and takedown.[8] Given that the relevant laws allow ISPs only a very limited time to respond, and given the impracticality of responding to each notice manually,[9] ISPs also turn to automation to assess these claims.


The use of automated systems in the legal sphere raises significant concerns about the potential impact on due process and human rights. As Hartzog notes, ‘the use of these technologies without careful consideration poses distinct dangers to our civil liberties’.

The inherent danger in the mass adoption of algorithmic processes is their capacity to take down content that does not violate anyone’s copyright. Automated systems usually lack the capacity to differentiate between ‘fair use’ and actual copyright infringement. As computerized processes, these systems are programmed through content recognition technologies and rely on fingerprinting and watermarking tools that use samples of protected work to scan for similar material. Once a positive match is found, the system executes its programmed instructions.[10] Therefore, once a certain threshold of similarity is achieved (in its assessment), the system simply takes down the content regardless of the circumstances.

Fair dealing/use is an exception to a copyright holder’s exclusive rights that allows others to make limited uses of the protected work. It is based on the principle that certain ‘privileged’ uses fall within the purview of public interest and as such should not be categorized as infringement. Under the Copyright Act (2001), copyright does not include the right to control acts done “by way of fair dealing for the purposes of scientific research, private use, criticism or review, or the reporting of current events subject to acknowledgement of the source.”[11]

Automated systems use software tools to compare the “fingerprints” of protected works to user files uploaded on the platform and take down the detected matches irrespective of variations or context of use. As such, they often block content falling within the fair use/dealing category. Consequently, while attempting to prevent infringement, these systems run counter to the fundamental goal of copyright law: striking a balance between the interests of owners and the public.
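The context-blindness described above can be made concrete with a minimal sketch: a matcher receives only the content itself, so a quotation made for criticism or review matches exactly like wholesale copying. The function names, the five-word shingle size and the threshold of two shared segments are illustrative assumptions, not any real system's parameters.

```python
def shingles(text: str, k: int = 5) -> set:
    """Break text into overlapping k-word segments, a crude stand-in
    for the audio/video fingerprints real systems use."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def flags_match(upload: str, protected: str, min_shared: int = 2) -> bool:
    """Flag an upload that shares segments with the protected work.

    Nothing in the inputs represents the *purpose* of the use
    (criticism, review, reporting of current events), so the fair
    dealing exceptions in Section 26(1)(a) are invisible to the
    matcher by construction."""
    return len(shingles(upload) & shingles(protected)) >= min_shared
```

On this logic, a review quoting a line of a song for criticism is flagged just as surely as a verbatim re-upload, which is the source of the over-blocking problem.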

These systems are also ripe for abuse. They can be used as a “sword” rather than as a “shield,” for instance where automated takedown notices are used to censor criticism or stifle competition.[12] Jennifer Urban and Laura Quilter published a landmark study containing empirical evidence that a considerable proportion of automated takedown notices target content that is not actually infringing.[13] The study found that a large percentage of Google Search notices were directed at competitors. Further, a 2011 article noted that scammers use YouTube’s copyright management tools to steal ad revenue from content creators. YouTube’s Content ID system comprises content recognition technology that allows rights holders to upload music and videos they own to a “fingerprinting” database. When account holders upload their videos, Content ID scans the new uploads against its copyright database for matches.[14] Upon finding a match, the copyright holder can have the video removed or choose to place advertisements on the video and earn revenue from it, i.e. monetization.[15] While this system is highly effective in resolving the vast multitude of copyright complaints on the platform,[16] it is easily susceptible to abuse. Scammers may take advantage of it to claim copyright in material they do not own.[17] The net effect is that they obtain ad revenue belonging to content creators.
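The claim-handling step that scammers exploit can be sketched as follows. Everything here (the `ClaimPolicy` options, the `Claim` fields) is a hypothetical simplification of a Content ID-style system, not YouTube's actual code; the point it illustrates is that the system applies whatever policy the claimant chose, without itself verifying ownership of the matched work.

```python
from dataclasses import dataclass
from enum import Enum

class ClaimPolicy(Enum):
    BLOCK = "block"        # have the matched video taken down
    MONETIZE = "monetize"  # run ads; revenue goes to the claimant

@dataclass
class Claim:
    video_id: str
    claimant: str  # whoever registered the reference file; the system
                   # assumes, but does not verify, that this is the owner
    policy: ClaimPolicy

def resolve_claim(claim: Claim) -> str:
    """Apply the claimant's chosen policy to a matched upload."""
    if claim.policy is ClaimPolicy.BLOCK:
        return f"video {claim.video_id} removed"
    # Monetization routes the uploader's ad revenue to the claimant,
    # which is exactly what a false claim exploits.
    return f"ad revenue for {claim.video_id} paid to {claim.claimant}"
```

A scammer who registers someone else's work as a reference file becomes the "claimant" for every match, and the monetization branch pays them the uploader's revenue.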

Additionally, while automation fundamentally changes the nature of copyright administration and enforcement, it may result in other issues such as automation bias. Automation bias is the tendency of human actors to over-rely on automated tools, together with the commission and omission errors resulting from such over-reliance.[18] The issue here is that using automated tools in copyright enforcement to reduce instances of human error may simply change the types of errors made rather than increase efficiency. The pertinent question then becomes, ‘what degree of accountability should be placed on the users of such processes in order to prevent instances of such bias?’

The use of automated aids also introduces a level of opacity to the decision-making process. As noted in the previous piece, NTD processes, in their capacity as private adjudication mechanisms, already produce little public information.[19] Automated enforcement takes this opacity to greater levels, since these systems make choices to block access based on an undisclosed, self-determined threshold.[20] While dominant intermediaries such as Google and Facebook detail the general working of their automated systems, in individual takedown instances users are not told which particular aspect(s) of their content triggered the system’s enforcement mechanism.[21]

Further, given that copyright and related rights are administered in Kenya mainly through Collective Management Organisations (CMOs),[22] it seems reasonable to assume that these bodies would have an interest in the development of such systems.[23] The Act implies this interest by requiring that any takedown notice be copied to the Kenya Copyright Board, which is responsible for licensing and supervising these organisations.[24] As corporate bodies, CMOs have greater bargaining power when negotiating with ISPs. As such, another question that arises is whether these bodies should develop such systems and what mechanisms could be used to ensure accountability in this regard.


Recent years have seen an increase in ISPs’ use of automated copyright enforcement tools. Their early promise has, however, been tempered by their inability to decipher the particular contours of copyright law and by their potential for abuse by threat actors. In light of these considerations, further development and optimization must be undertaken before these systems can be definitively labelled the industry standard. The transparency of such systems should also be a priority, as should measures to curb their propensity for abuse. Only then will automation assume its rightful place in copyright enforcement.

[1] J.W. Penney, ‘Privacy and Legal Automation: The DMCA as a Case Study’ 22 STAN. TECH. L. REV. 412 (2019) at 

[2] Some of these challenges include ascertaining the direct infringer, i.e. is it the account holder or a subscriber, or is it the platform which facilitates such infringement? Enforcement poses its own challenges as well. Traditionally, cease and desist letters would be the first stage in IP infringement proceedings before the actual suit. However, with regard to online infringement, one may not be able to verify the contact address of the particular account holder.

[3] Advisory Committee on Enforcement, WIPO, ‘Study on IP Enforcement Measures, Especially Anti-Piracy measures in the Digital Environment’ at

[4] Z. Carpou, ‘Robots, Pirates, and the Rise of the Automated Takedown Regime: Using the DMCA to Fight Piracy and Protect End-Users’ Columbia Journal of Law and the Arts, at

[5] B. Depoorter, P. Menell, D. Schwartz, ‘Research Handbook on the Economics of Intellectual Property Law, Volume 1’ at

[6] K. Erickson, M. Kretshmer, ‘“This Video is Unavailable”: Analyzing Copyright Takedown of User-Generated Content on YouTube’ at

[7] A Berkeley study noted that a category of Online Service Providers (OSPs) who receive large numbers of automated notices have, due to resource and time pressures, implemented what the authors term ‘DMCA Auto’ measures to handle large-scale notice processing, which the study notes take down content with little or no substantive human review. See J. Urban, J. Karaganis, B. Schofield, ‘Notice and Takedown in Everyday Practice’

[8] The EU’s Electronic Commerce Directive does not define so called notice and action procedures under Article 14 of the Directive. Member states thus implement diverging approaches on the duty to act expeditiously.

Additionally, DMCA Procedures do not consider the existence of content recognition technology (See Perfect 10 Inc. v. Inc, a US Court of Appeals’ decision regarding ISP’s ability to detect infringing content at

[9] S. 512(c) of the DMCA mandates ISPs to expeditiously take down or block access to the material. The Amendment Act gives ISPs only 48 business hours to do the same.

[10] F. Romero-Moreno, ‘Notice and staydown’ and social media: amending Article 13 of the Proposed Directive on Copyright’ at

[11] Section 26(1) (a), Copyright Act (2001)

[12] Z. Carpou, ‘Robots, Pirates, and the Rise of the Automated Takedown Regime: Using the DMCA to Fight Piracy and Protect End-Users’ Columbia Journal of Law and the Arts, at

[13] J. Urban & L. Quilter, ‘Efficient Process or “Chilling Effects”? Takedown Notices Under Section 512 of the Digital Millennium Copyright Act’, 22 Santa Clara Computer and High Tech (2006) at 

[14] YouTube Help, ‘What is a Content ID claim?’ at

[15] YouTube Help, Copyright Management Tools at

[16] P. Resnikoff, ‘99.5% of all infringing music videos are resolved by Content ID, YouTube claims’ at

[17] D. Kravets, ‘Rogues Falsely Claim Copyright on YouTube Videos to Hijack Ad Dollars’ at

[18] K. Goddard, A. Roudsari, J.C. Wyatt, ‘Automation bias: a systematic review of frequency, effect mediators, and mitigators’ at

[19] J. Urban, J. Karaganis, B. Schofield, ‘Notice and Takedown in Everyday Practice’ at

[20] M.Perel, N.E Koren, ‘Accountability in Algorithmic Copyright Enforcement’ at

[21] Users lack access to the system’s analysis of their content and which particular aspects were deemed to be infringing i.e. a certain clips in a video or segments of a song. Such an analysis would assist users in avoiding future infringement. Due to this lack of transparency, automated enforcement systems do not allow for public scrutiny that can bolster accountability.

[22] Section 46, Copyright Act (2001)

[23] Collective Management addresses the plight of rights holders by allowing them to administer their rights through such CMOs which then negotiate with users of protected works in order to come up with agreements that facilitate rights clearance in the interest of both parties. Thus, with regard to automated NTD procedures, local CMOs may have a legitimate interest in ensuring that their members’ works are protected by such processes. This also raises the question as to who should be accountable for the efficient functioning of these systems i.e. is it the ISPs, the rights holders themselves or the relevant CMOs which may have greater bargaining power in the development of such processes.

[24] Section 35B (2) (h), Copyright (Amendment) Act (2019)


