KG LEGAL \ INFO

Demonetization of website content in view of contract execution and AI algorithms

Publication date: January 06, 2025

Modern internet platforms such as YouTube have revolutionized the way we create and consume video content. The American platform has over 2 billion monthly active users, and, in Poland for example, an increasing share of its creators are professional journalists.

Such a service has therefore become another distribution channel for video content created by professionals, including material that earns money from advertising. Creators who share their videos on the platform count on income from so-called content traffic and the popularity of their content, as well as from displayed ads. Platforms originally built to let users freely share, edit, broadcast live and comment on videos have now become a place where preventive content-security measures collide with the freedom of press activity.

Demonetization, i.e. the loss of the ability to monetize a video, is one of the most common problems creators encounter. According to information shared by the creators themselves, demonetization decisions are made largely by artificial intelligence (AI) algorithms that analyze video content for compliance with the platforms’ advertising policies. Unfortunately, these algorithms are not without flaws, which sometimes leads to questionable decisions to block monetization of video material that in fact complies with the platform’s standard rules.

Such assessments of video material, and the resulting problems for platform users, are an interesting phenomenon from the perspective of the principles and consequences of proper performance of a contract. It is an open question whether and how such situations could be assessed as improper performance of the contract by internet platforms, and what legal consequences may result from such errors.

What does the decision-making process of an AI algorithm of online platforms look like?

The AI algorithms used by online platforms to detect violations of their advertising policies are highly complex. These systems analyze both audio and video content, looking for elements that could be considered a violation of the platform’s community guidelines or monetization policies: content promoting violence, hatred, disinformation, or other controversial topics. The algorithms are programmed to identify such elements based on specific patterns, allowing them to rate video content automatically and in near real time.

While in theory such a system aims to ensure fairness and eliminate content that violates the platform’s policies, in practice numerous errors result in unjustified demonetization. A typical example is a video containing controversial but lawful material being demonetized even though it does not violate the policies of platforms such as YouTube. The potential for error arises because AI algorithms can have difficulty interpreting context or recognizing nuance in video content. As a result, the algorithms’ decisions can be flawed, which in turn may constitute a breach of the contract between the creator and the platform.
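The context problem described above can be illustrated with a deliberately simplified sketch. The rule list, scores and threshold below are invented for illustration only; real platform systems use machine-learning models over audio, video and metadata, not keyword lists.

```python
# Hypothetical, simplified sketch of an automated advertiser-
# friendliness check. Term weights and threshold are invented.
FLAGGED_TERMS = {"violence": 0.6, "weapons": 0.5, "disinformation": 0.7}

def assess_monetization(transcript: str, threshold: float = 0.5):
    """Return (decision, score) for a video transcript."""
    words = transcript.lower().split()
    score = max((FLAGGED_TERMS[w] for w in words if w in FLAGGED_TERMS),
                default=0.0)
    decision = "demonetize" if score >= threshold else "monetize"
    return decision, score

# A news report *about* violence is flagged the same way as content
# *promoting* violence -- the pattern match cannot see the context.
print(assess_monetization("reporter covers protest violence downtown"))
print(assess_monetization("cooking pasta at home"))
```

The first call is demonetized even though it describes journalistic coverage, which is exactly the kind of false positive that gives rise to the contractual questions discussed in this article.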

How should Artificial Intelligence be defined?

There is no uniform definition of artificial intelligence in either the Polish or the European legal system, but work is underway on a draft resolution of the European Parliament with recommendations to the Commission on a civil liability regime for artificial intelligence (2020/2014(INL), P9_TA(2020)0276). In that document, the European Parliament attempts a definition: an artificial intelligence system is a system that is either software-based or embedded in devices and that displays behaviour simulating intelligence. It operates, among other things, by collecting and processing data, analyzing its environment and taking action autonomously in order to achieve specific goals.

Film Demonetization – Is It Legal?

The question remains open and depends, among other things, on an individual assessment not only of the Polish principles of so-called “proper” performance of the video content publication agreement, but also of the legal status of the material itself as so-called “press material”.

In accordance with Art. 3 sec. 5 of the Act on the principles of participation of foreign entrepreneurs and other foreign persons in economic turnover on the territory of the Republic of Poland (consolidated text, Journal of Laws of 2022, item 470, of 25 February 2022), the entity administering an internet platform theoretically meets the statutory definition of a foreign person [usually being a legal person with its registered office abroad]. Often the operator of such a portal does not have a separate branch in Poland, but there are exceptions. For example, on the Polish market the popular YouTube portal is managed by Google Poland sp. z o.o., which, under the above regulation, has its registered office in Warsaw.

In turn, when it comes to the classification of video materials shared by journalists, it can be stated that, in accordance with Art. 1 in connection with Art. 7 of the Press Law, the press also includes all mass media existing and emerging as a result of technological progress, including broadcasting stations and company television and radio stations, disseminating periodical publications by means of print, vision, sound or other dissemination techniques; the press also includes teams of people and individuals engaged in journalistic activities. Key in this matter seems to be the judgment of the Supreme Court of 24 November 2017, I CSK 73/17, in which the court indicated that under the Polish Press Law the press includes not only newspapers and magazines, but also “all existing and emerging as a result of technological progress periodical publications by means of print, vision, sound or other dissemination techniques”. It is worth emphasizing that the fact that given content is disseminated via the Internet is irrelevant to its legal classification as press. This does not mean, however, that every blog should be considered press and that its creator should register it immediately. The decisive factor is the function a given blog fulfils, not the form of communication.

The legal status of a creator publishing material on a platform depends largely on whether the cooperation with the portal fits within the framework of press law regulations. Importantly, not every creator meets the statutory definition of a journalist. Polish press law rigidly and precisely defines not only the notions important from the platform’s perspective, such as the press, a magazine or press material, but also the status of a journalist, editor and editor-in-chief; only this status, together with all the above elements, creates the possibility of using the protection mechanisms established by press law. For the vast majority of creators, the model of cooperation with an internet platform deviates from the requirements and premises for protection under press law, and this in turn leaves the operator of the platform greater freedom to set, unilaterally, the rules for publishing video material.

The regulations of online platforms are therefore nothing more than a set of contract performance rules, and this contract is, in principle, not negotiated. One party sets out the terms of cooperation in a document it has drafted, and the other party, if it agrees to those terms, must accept them. Acceptance of the regulations is most often tantamount to concluding an agreement with the content specified therein. It can be treated as an element of a civil contract concluded with the platform user, the terms of which must not be contrary to the applicable provisions of Polish law. This means that a platform that establishes its rules globally should not shape its regulations in a way that may violate the principles of the legal order in force in Poland, including the provisions of the Civil Code.

In the case of journalists who run journalistic accounts on online platforms, it is controversial whether they are subject to the provisions of press law at all. Press law in Poland provides journalists with special protection of their work (including freedom of speech and the right to publish). It is therefore an open question whether the regulations of such a platform, even as a civil contract, may limit these rights in a way that is contrary to national law. In fact, only journalists whose materials are journalistic, informative or opinion-forming in nature, in accordance with their professional profile, and are published within the system defined by press law should enjoy its protection, and their cooperation with platforms does not always meet these requirements. Therefore, where an internet platform restricts the possibility of publishing material that complies with the provisions of press law, the question arises whether a journalist could invoke non-performance of a civil contract on the basis of the Civil Code. In such cases, the content creator may, at least in theory, pursue claims based on improper performance of the contract and demand compensation if the platform does not fulfil its contractual obligations (e.g. by blocking or demonetizing content that complies with the law and the regulations).

What are the consequences of an incorrect AI algorithm decision?

Removal of a video on the basis of an AI algorithm’s decision can be compared with Article 13 of the Act on the provision of services by electronic means. Blocking or removing content in the material amounts to a modification of data. Under this provision, the liability of service providers offering caching services can in some cases be excluded: a person who transmits data and ensures its automatic, short-term intermediate storage in order to speed up re-access to it at the request of another entity is not liable for the stored data, provided that they do not modify the data, use the recognized IT techniques customarily employed in this type of activity to define the technical parameters of access to the data, and do not interfere with the use of such customary IT techniques for collecting information on the use of the stored data.

Caching (from English “cache” – cache memory) is a technique used in computing to speed up access to data by storing the most frequently used information in a faster storage layer. The main goal of caching is to reduce data-access latency, which in turn improves the performance of computer systems, applications and websites.
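The mechanism can be sketched in a few lines. This is a minimal illustration only; real caching proxies add expiry, eviction and access-control rules. Note that the cached copy is stored unmodified, mirroring the “does not modify the data” condition of the liability exclusion discussed above.

```python
import time

# Minimal illustration of caching: results of a slow lookup are
# stored so repeated requests are served from fast local memory.
cache = {}

def fetch_resource(key: str) -> str:
    """Return the resource for `key`, caching it on first access."""
    if key in cache:                 # cache hit: no slow fetch needed
        return cache[key]
    time.sleep(0.01)                 # stands in for a slow origin fetch
    value = f"content-for-{key}"     # hypothetical payload
    cache[key] = value               # intermediate storage, unmodified
    return value

fetch_resource("video-123")   # slow: goes to the origin
fetch_resource("video-123")   # fast: served from the cache
```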

The provision in question implements Article 13 of Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce). In addition, the European Union, in its recent regulation on artificial intelligence (the AI Act), introduces a requirement that artificial intelligence systems, especially high-risk ones, be transparent and accountable for their decisions. For internet platforms this means that the algorithms they use (e.g. recommendation algorithms, content moderation, video demonetization) may require greater transparency. Platforms may therefore be required to:

– explain to users how algorithms work: for example, how content recommendation or video rating algorithms operate,

– give users more control, which in some cases could mean the ability to influence how the AI chooses the content it recommends to them.

The AI Act introduces provisions aimed at accountability for decisions made by AI, especially in the case of high-risk systems. Online platforms using AI to moderate content, assess violations of regulations or detect plagiarism may therefore be required to:

– take responsibility for algorithm errors: if a platform’s algorithms make a mistake (e.g. wrongly demonetize material or remove it without justification), the platform may be required to take responsibility and provide redress mechanisms that enable users to rectify the situation,

– implement transparency in automated processes: in the case of decisions made automatically by AI (e.g. video demonetization, account blocking), online platforms may be forced to provide more detailed justification for these decisions, which may reduce the number of false positives.
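One way a platform could satisfy such transparency and redress duties is to attach a machine-readable statement of reasons to each automated decision. The sketch below is hypothetical: the field names are invented for illustration and are not taken from any platform’s actual API or from the text of the AI Act.

```python
import json
from datetime import datetime, timezone

# Hypothetical record a platform might generate for an automated
# demonetization decision, so the creator can see what was decided,
# why, and whether an appeal is available. All field names invented.
def build_decision_record(video_id: str, rule: str, confidence: float,
                          appealable: bool = True) -> str:
    record = {
        "video_id": video_id,
        "decision": "demonetize",
        "automated": True,                # flags the decision as AI-made
        "rule_violated": rule,            # which policy was matched
        "model_confidence": confidence,   # how certain the system was
        "appealable": appealable,         # a redress mechanism exists
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(record, indent=2)

print(build_decision_record("abc123", "advertiser-unfriendly: violence", 0.62))
```

A record like this documents both the justification for the decision and the availability of an appeal, the two elements the obligations listed above revolve around.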

Legal solutions

The rapid development of artificial intelligence and the threats associated with it are a global problem, so it is not surprising that the European Union has undertaken to regulate this area. Although the relevant provisions have not yet entered into force, a draft resolution with recommendations to the Commission on a civil liability regime for artificial intelligence (2020/2014(INL)) can be found on the European Parliament website. The document includes, among other things, a framework for a future regulation on liability for damage caused by AI, the justification for this legal act, and the opinions of the Committee on the Internal Market and Consumer Protection. These proposals provide an initial insight into what liability for the actions of artificial intelligence may look like in the near future. The lack of a clear legal framework, as the example of online platforms discussed in this article shows, leaves room for many discrepancies. Algorithmic errors in the demonetization process may harm creators; more precise algorithms, appeal systems and compensation mechanisms could help resolve disputes over improper contract performance and improve the fairness of the demonetization process.
