Memorandum on the Digital Services Act



Warsaw, 17th September 2021

 


Introduction

The Internet as we know it today has largely been shaped by the Directive on electronic commerce adopted in 2000, also known as the e-Commerce Directive. Since then, online platforms have come into being, e-commerce has developed, and social media have emerged, bringing with them new challenges related to how the Internet is used. To respond to these challenges, the European Commission has recently launched a number of legislative initiatives, including the Digital Services Act, referred to as the DSA. The Commission submitted a draft of the DSA on 15th December 2020, opening the negotiation period. On 3rd September 2021, the Slovenian Presidency of the Council of the European Union presented a compromise text, on which the following analysis is based.

In the foreseeable future, the DSA will amend the e-Commerce Directive and thus modernise the regulatory framework for the various Internet service providers. The Act could improve the conditions of competition in digital markets and may support business growth. At the same time, it is a major reform of the moderation and removal of illegal content on the Internet, introducing far-reaching changes to the way all actors on the digital market operate.

In our view, the draft DSA requires certain modifications that would increase legal certainty, reduce disproportionate burdens on enterprises, and actually support smaller entities. Before we move on to selected aspects of the DSA, however, we will place this new regulation in the context of a broader trend to regulate the digital world. We will then discuss three elements of the Act. First, we will point out that the DSA does not introduce the changes necessary to increase legal certainty around the conditional exemptions from liability established by the e-Commerce Directive. Secondly, we will show that excessively detailed obligations concerning the reporting and removal of illegal content will worsen the ability of smaller players to compete. Thirdly, we will draw attention to the fact that the radical changes to digital advertising proposed by the European Parliament would have a negative impact on European consumers and entrepreneurs.

  1. DSA in the context of digital regulations

Appropriate regulation of the digital economy is a priority on the agenda of the world’s largest organisations, including the World Trade Organization, the International Monetary Fund, the World Bank, the Organisation for Economic Co-operation and Development, and the European Union[1].

Over the past few years, many proposals aimed at regulating enterprises operating in the broadly understood digital world have been made at the level of the European Union. The DSA is just one of a number of new regulations, which also include the GDPR, the P2B Regulation, the Regulation on preventing the dissemination of terrorist content online (the Terrorist Content Online Regulation, or TCO), the Digital Markets Act (DMA), and the proposed Directive on digital services tax.

A large number of new regulations creates a risk of conflicts between legal acts. First, the terms and conditions for the removal of illegal content are elaborated on in the DSA, the TCO, the Directive on Copyright, and the Regulation on explosives precursors. While the latter three pieces of legislation have a narrower scope than the DSA, the essence of all four is the regulation of illegal online content and, importantly, each establishes different obligations and liability thresholds for Internet service providers (ISPs). Secondly, the relationship between platforms and entrepreneurs is the subject not only of the DSA, but also of the P2B Regulation and the DMA. Moreover, both new regulations – the DSA and the DMA – were proposed before it was possible to thoroughly assess the effects of implementing the P2B Regulation.

Furthermore, the multitude of new regulations is a source of definitional and procedural difficulties, which in turn reduces legal certainty. For companies active in the digital sector, this translates into high compliance costs. Paradoxically, such a large number of new regulations may thus become a barrier to entry and growth for European SMEs, while large foreign entities will easily adapt their business models to the new reality.

  2. Conditional exemptions from liability for illegal content

One of the most important principles introduced by the DSA is the mechanism of conditional exemptions from liability for user-published content granted to providers of intermediary services. Since the adoption of the old directive in 2000, however, new digital services have emerged that have changed the way we communicate, connect, consume, and do business, and with them doubts have arisen over the application of the directive. The European Commission has therefore committed itself to updating the existing legislation[2].

The following section will analyse whether the draft DSA adequately addresses the interpretation problems regarding the liability exemptions established under the e-Commerce Directive. To this end, the following topics will be discussed: (i) the terms and conditions of the exemptions from liability for user-published content granted to providers of intermediary services; (ii) the case law of the Court of Justice of the European Union in this respect; and (iii) the changes to the scope of liability of providers of intermediary services in the draft DSA.

  • Terms and conditions of the exemptions from liability for user-published content granted to providers of intermediary services

Articles 12-14 of the e-Commerce Directive provide for conditional liability exemptions (so-called “safe harbours”) for three types of intermediary services: mere conduit, caching, and hosting[3]. Moreover, Art. 15 of the directive prohibits imposing on intermediaries a general obligation to monitor.

Mere conduit

Pursuant to Art. 12 of the directive, mere conduit is a service consisting in the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network. Liability in this case will be avoided if the service provider: (a) does not initiate the transmission; (b) does not select the receiver of the transmission; and (c) does not select or modify the information contained in the transmission. Art. 12 sec. 2 specifies that the acts of transmission and of provision of access referred to in sec. 1 include the automatic, intermediate and transient storage of the information transmitted, in so far as this takes place for the sole purpose of carrying out the transmission in the communication network, and provided that the information is not stored for any period longer than is reasonably necessary for the transmission. The conditional limitation of liability contained in this article does not, however, affect the possibility for a court or administrative authority to require the service provider to terminate or prevent an infringement.

Caching

Art. 13 of the directive regulates “caching”, a service consisting in the transmission in a communication network of information provided by a recipient of the service. The service provider will not be liable for the automatic, intermediate and transient storage of that information, performed for the sole purpose of making more efficient the information’s onward transmission to other recipients of the service upon their request, on condition that the service provider: (a) does not modify the information; (b) complies with conditions on access to the information; (c) complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry; (d) does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and (e) acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement.

Hosting

Article 14 of the e-Commerce Directive defines hosting as a service consisting of the storage of information provided by a recipient of the service. The service provider will not be liable for the information stored at the request of a recipient of the service, on condition that the service provider: (a) does not have actual knowledge of illegal activity or information and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or information is apparent; or (b) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the information.

No general obligation to monitor

The last important element of the mechanism of conditional exemption is the absence of a general monitoring obligation, established under Art. 15. Pursuant to this article, service providers may not be placed under a general obligation to monitor the information which they transmit or store, nor under a general obligation to actively seek facts or circumstances indicating illegal activity. Such an obligation would risk excessive control over content and the removal of content that is legal but controversial, thereby censoring the Internet and hindering freedom of speech.

At the same time, it should be remembered that neither the conditional exemptions nor the absence of a general monitoring obligation prevent intermediaries from being required to take appropriate steps against infringements of third-party rights – by virtue of court orders or due diligence obligations[4].

  • Case law of the Court of Justice of the European Union on the liability of providers of intermediary services for user-published content

There is widespread consensus that the conditional exemption mechanism is incomplete and presents a number of problems in its present form[5]. The following section discusses selected issues in this area, including: the subjective scope of application; the distinction between “active” and “passive” intermediaries; the effects of proactive measures; and the knowledge or awareness threshold for a conditional exemption.

Subjective scope of application

There is some uncertainty as to the subjective scope of application of the “safe harbours”. As has been noted, the conditional liability exemptions under the e-Commerce Directive do not apply to services provided by all intermediaries, but only to services that qualify as “information society services”[6]. This concept is thus a condition that determines the subjective scope of application of the safe harbours and illustrates the incomplete nature of the directive[7].

The Directive on electronic commerce itself does not include a definition of an information society service[8]. The definition in earlier Community legislation covers all services normally provided for remuneration, at a distance, by electronic means (i.e. using electronic equipment for the processing, including digital compression, and storage of data), at the individual request of the recipient. At the same time, recital 18 of the e-Commerce Directive states that “information society services are not solely restricted to services giving rise to on-line contracting but also, in so far as they represent an economic activity, extend to services which are not remunerated”.

In its case law, the CJEU has applied Art. 14 of the directive to a search engine’s advertising service[9], to sales on an online platform[10], and to a social media platform[11]. Art. 12 of the directive has been applied to an ISP[12] and to an open Wi-Fi access service provider[13].

At the same time, the CJEU refused to qualify Uber’s services as information society services, thus delimiting this definition negatively[14]. In the Court’s opinion, a company providing a smartphone application that mediates between passengers and non-professional drivers in booking journeys provides a transport service[15].

The distinction between “active” and “passive” intermediaries

The distinction between active and passive intermediaries is crucial from the perspective of service providers: passive intermediaries qualify for a conditional exemption from liability, but active intermediaries forfeit this privilege[16].

In some cases, determining whether a platform’s activity is active or passive is simple. The Papasavvas case, concerning a newspaper publishing company that also operated an online edition, is a clear example of an active platform[17]. In the opinion of the CJEU, such a company as a rule “has knowledge of the information posted and exercises control over that information”, and therefore cannot rely on the conditional exemption from liability[18]. A good example of a passive intermediary can in turn be found in the Netlog case, where the Court found that a social platform that stores information provided by users on its servers can make use of the “safe harbour”[19].

There is, however, considerable uncertainty as to the extent to which activities such as building rankings, indexing, operating review systems, and managing the infrastructure and content hosted by platforms amount to actual control over content, and thus to an active intermediary role[20].

In this context, the connection to copyright is particularly important. The GS Media case concerned liability for linking to unauthorised content. The Court held that “when the posting of hyperlinks is carried out for profit, it can be expected that the person who posted such a link carries out the necessary checks to ensure that the work concerned is not illegally published on the website to which those hyperlinks lead”[21]. The CJEU thus established a rebuttable presumption of knowledge, which depends on whether or not the links were posted for profit[22].

The effects of proactive measures

The e-Commerce Directive, as well as other legal instruments, urges intermediaries to step up their efforts to combat illegal or harmful content[23]. However, taking into account the aforementioned distinction between active and passive intermediaries and the absence of a general monitoring obligation under Art. 15 of the directive, there are doubts as to how far platforms can go in taking such actions without losing the conditional exemption from liability[24].

A good example of this tension is the due diligence obligation in recital 48. It states that member states have the right to require service providers “to apply duties of care, which can reasonably be expected from them and which are specified by national law, in order to detect and prevent certain types of illegal activities”.

On a narrow understanding, due diligence denotes obligations “imposed by criminal or public law e.g. aid in investigation of crime or security matters, not as extending to duties under private law, e.g., to help prevent copyright infringement”[25]. In other words, due diligence may manifest itself in ex-post obligations, e.g. removing content after gaining knowledge of its illegality, which is unproblematic from the point of view of Art. 14 of the directive. It may also take the form of ex-ante obligations, i.e. measures that platforms must take before they become aware of the illegal nature of the content, which may result in the loss of the conditional exemption from liability[26].

Nevertheless, member states retain the right to impose both types of obligations on platforms, and some ex-ante obligations are promoted by EU instruments, including the 2016 “EU Code of conduct on countering illegal hate speech online”, launched together with and signed by Facebook, Microsoft, Twitter and YouTube, the European Commission’s 2017 “Communication on Tackling Illegal Content Online”, and a Recommendation on measures aimed at the same goal[27].

Knowledge or awareness threshold for a conditional exemption

The case law of the Court of Justice of the European Union (CJEU) has provided criteria for determining when a service provider is aware of the illegal nature of an activity or information. The ruling in the L’Oréal case requires the principles set out in Art. 14 sec. 1 to be interpreted “as covering every situation in which the provider concerned becomes aware (…) of such facts or circumstances”. The Court also emphasised that intermediaries may benefit from the liability exemption when they perform a purely technical, automatic, and passive role[28]. Despite these efforts, undertaken by the CJEU in a very limited number of cases, many interventions or actions, especially in terms of content moderation, remain in a grey area[29].

  • Changes in the scope of liability of providers of intermediary services in the draft DSA

As stated in the Introduction, the purpose of the DSA is to update the Directive on electronic commerce. It could therefore be assumed that resolving the above-mentioned doubts would be among the most important priorities of the authors of the draft.

Unfortunately, Articles 3-5 of the DSA replicate Articles 12-14 of the e-Commerce Directive, thereby preserving the key problems that have arisen around intermediaries’ liability for content[30]. It should also be noted that the DSA uses the method of asymmetric regulation and imposes additional obligations on various entities. New regulatory layers are thus created that will lead to new interpretative doubts in the future.

The DSA seems to solve two minor problems. Firstly, the act was presented in the form of a regulation and not a directive. This means that it will apply directly in the national legal orders of individual member states, thus avoiding fragmentation of the digital single market. Secondly, the DSA includes a new Art. 6 on intermediaries’ liability, which governs voluntary own-initiative investigations. According to it, intermediary service providers will not lose the possibility of exemption from liability due to carrying out “voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content”, or because they “take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation”.

Even though the purpose of Art. 6 was to clarify the terms of liability, it raises a number of other doubts, especially in connection with the additional obligations imposed on very large online platforms.

At the same time, it is worth paying attention to the changes that were included in the compromise text presented by the Slovenian Presidency. Articles 3 and 4 remain unchanged; however, there are certain changes to Articles 5 and 6.

First of all, Art. 5 now makes reference to online sales platforms, i.e. marketplaces. The definition itself, contained in Art. 2(ia), describes marketplaces as online platforms that enable consumers to conclude distance contracts with entrepreneurs.

The proposed definition was intended to improve legal certainty but, in our opinion, leaves much to be desired. The prerequisite for recognising a platform as a marketplace should be the actual conclusion of a transaction between an entrepreneur and a consumer on that platform. For example, a consumer clicking on an advertisement for shoes on a newspaper’s website should not make the newspaper a marketplace. Under the currently proposed definition, however, this newspaper could be considered a marketplace, which would impose certain obligations on it.

Such a faultily constructed definition would be particularly severe in the face of the European Parliament’s attempts to introduce more stringent obligations for marketplaces. One can recall, for example, the amendment introduced by MEP Alex Saliba regarding Art. 14a, according to which “marketplaces deserve special attention due to the large number of illegal activities detected on their Internet interfaces”, or the additional provisions on marketplaces related to illegal offers proposed in Art. 22b by, among others, French MEPs. In our opinion, the above-mentioned amendments are attempts to differentiate obligations according to business model and constitute a departure from the general principle of asymmetric regulation based on the size of the enterprise. As a result, they may reduce legal certainty and increase the burden on entrepreneurs, and should therefore be assessed negatively.

Secondly, an amendment to recital 13 extends the scope of hosting services to comments submitted to platforms by users. In the compromise text proposed by the Slovenian Presidency, we read that “hosting comments on a social network should be considered an online platform service if it is clear that this is the main feature of the service offered, even if it is ancillary to the publishing of posts by service users”. In our opinion, such an amendment is technically impossible to implement. We draw attention to the fact that some user comments under live videos on online platforms are so-called live reactions. It is not feasible to monitor every transmission of this kind and review users’ reactions as they occur. Furthermore, the creators themselves usually have control over the comments on their videos: they are free to disable comments on individual videos or to approve the comments that appear under them. We therefore evaluate the proposed change negatively.

Thirdly, the concepts of acting in good faith and exercising due diligence have been introduced into Art. 6. However, we still do not know what is meant by due diligence. These amendments will therefore not increase legal certainty, although they may allow platforms to better defend their interests in the event of a dispute by proving that they acted in good faith.

Considering all the above conclusions, and with due regard to the efforts of all parties involved in the work on the DSA, the adopted solution seems to be a squandered opportunity to simplify and systematise European law in this specific field.

  3. Changes to the system of reporting and removal of illegal content

Another expectation the DSA was hoped to fulfil was the establishment of appropriate procedures for reporting and removing illegal content. The purpose of the new mechanism is to strike a balance between the protection of freedom of speech and the protection of personal rights and intellectual property rights. We can therefore observe a tension similar to that between the conditional exemption of service providers from liability for user-posted content and the absence of a general obligation to monitor content.

First of all, it should be noted that in the draft DSA the European Commission decided to propose a notice-and-action mechanism. It thus rejected both the narrower obligation to notify and remove illegal content (notice and take down) and the broader obligation on the intermediary to ensure that illegal content does not reappear (notice and stay down)[31].

Secondly, it should be noted that the reporting and removal procedure applies only to illegal content, not to harmful content. The DSA thus maintains the division used so far in the e-Commerce Directive. Furthermore, Art. 2(g) of the compromise text of the DSA defines illegal content as “any information which, in itself or by its reference to an activity, including the sale of products or provision of services is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law”. The DSA therefore does not itself determine which content is illegal, leaving that question to Union law and the national law of the Member States[32].

However, this definition raises serious doubts, in particular the phrase “reference to an activity” that is illegal. Such an unclear definition may in practice mean that legal content describing illegal activities will be removed. As an example that already raises problems at this point, one can cite descriptions of activities in war zones and the materials documenting those activities. Naturally, this creates a risk of excessive removal of content posted on the Internet, with serious consequences not only for freedom of speech but also for the protection of other values.

Thirdly, the DSA defines quite precisely the elements that a notice should include in order for it to be considered a credible source of knowledge, on which the intermediary’s liability for the content depends within the meaning of Art. 5 of the DSA[33]. Accordingly, upon receipt of such a notice, an intermediary that wishes to retain the conditional exemption from liability should proceed without delay to remove the illegal content or disable access to it.

A further doubt arises in this context, related to the risk of excessive removal of content. The EC’s proposal required users to explain the reasons why they consider the content to be illegal. Such explanations may in practice turn out to be largely subjective. The EC did not require the user to prove the unlawfulness, nor did it foresee any consequences for users submitting false reports (except for multiple reports in bad faith). In its previous form, the provision did not provide adequate protection to content creators (personal rights and intellectual property rights) or to intermediaries (liability for illegal content). The Slovenian Presidency, however, introduced an important amendment here, namely the requirement that the justification of the illegality of content be “sufficiently substantiated”. While this is a step in the right direction, the justification is still subject to subjective assessment, and platforms lack clear guidelines for their actions, which may create a risk of excessive removal of content.

Consequently, the EC proposal required the intermediary to perform certain actions immediately after receiving a notice, including: “the provider of hosting services shall promptly send a confirmation of receipt of the notice to that individual or entity”[34]; the provider “shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates”[35], “providing information on the redress possibilities in respect of that decision”[36]. The DSA will apply horizontally to all types of online content. Given the sheer volume and variety of information and the context-dependent nature of its unlawfulness, intermediaries cannot reasonably be expected to make every decision immediately. We therefore welcome the Slovenian Presidency’s replacement of “immediately” with “without undue delay”. In any event, burdening intermediaries with excessive liability would undoubtedly lead to excessive removal of legal content and to infringement of the rights of its authors.

Finally, one should remember that the DSA obliges the intermediary to contact not only the notifier but also the recipient of the service. Art. 15 sec. 1 DSA states that the intermediary “shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision”. The statement of reasons is subject to specific requirements and must include, among others, the following information:

  • whether the decision entails either the removal of, or the disabling of access to, the information and, where relevant, the territorial scope of the disabling of access;
  • the facts and circumstances relied on in taking the decision;
  • where applicable, information on the use made of automated means in taking the decision;
  • where the decision concerns allegedly illegal content, a reference to the legal ground relied on;
  • where the decision is based on the alleged incompatibility of the information with the terms and conditions of the provider, a reference to the contractual ground relied on[37].

The DSA specifies that the statement of reasons must be sufficiently clear and understandable, and as detailed and precise as is reasonably possible under the circumstances. An additional burden imposed on service providers is the requirement to publish decisions and statements of reasons in an anonymised form in a database managed by the European Commission.

There is no doubt that such detailed and extensive requirements will be counterproductive with regard to the goal the DSA was intended to achieve, that is, reducing the competitive advantage of the largest technology companies and levelling the playing field between smaller and larger entities. The largest enterprises in the IT industry have sufficient infrastructure and resources to cope with the requirements described above. However, these obligations will be imposed not only on very large online platforms, but also on smaller ones. Complicated content monitoring systems will become a kind of entry barrier for new enterprises, which may strengthen the competitive advantage of the largest players.

  4. Changes in the context of targeted advertising

Digital advertising plays an important role for Internet users, for enterprises selling their products and services online, and for platforms themselves. It is not surprising, therefore, that it has become an object of interest for regulators. The third and final part of this memorandum examines the changes to digital advertising proposed by the European Commission in the draft DSA, as well as those proposed by the European Parliament, namely by the Committee on the Internal Market and Consumer Protection (IMCO) and the Committee on Civil Liberties, Justice and Home Affairs (LIBE).

The European Commission’s proposal

In its DSA proposal, the European Commission notes the important role that digital advertising plays in the online environment, as well as the risks it entails.

Interestingly, recital 52 of the preamble to the draft DSA identifies the specific risks associated with digital advertising: advertisements that are themselves illegal content, advertising that contributes to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, and the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In the face of these challenges, the EC proposes to make digital advertising more transparent, so that users have the information they need to understand when and on whose behalf an advertisement is displayed. Recipients of services should also have information on the main parameters used to determine whether a particular advertisement is to be displayed to them. Recital 63 of the preamble further provides that very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces, in order to facilitate supervision and research into emerging risks brought about by the distribution of advertising online.

IMCO report

A rather different approach was presented by the European Parliament. The IMCO report states that “pervasive collecting and use of users’ data to provide targeted, micro-targeted and behavioural advertising has spiralled out of control”. Furthermore, it finds the EC’s proposed new transparency requirements insufficient.

According to IMCO, ISPs should by default ensure that recipients of their services are not subjected to targeted, micro-targeted and behavioural advertising unless the recipient has given voluntary, specific, informed and unambiguous consent. In other words, it proposes to introduce a default opt-out and the need to obtain the data subject’s consent before personal data are processed for targeted advertising.

LIBE opinion

The opinion presented by LIBE suggests that the DSA should provide for the right to use and pay for digital services anonymously. LIBE also advocates phasing out targeted and personalised advertising in the case of non-commercial and political advertising. Behavioural advertising and personalised targeting in commercial advertising, in turn, should only be possible if users have voluntarily consented to it, without risking exclusion from services and without tiresome, repetitive consent banners.

Consequences

The proposals to increase transparency seem not only unproblematic but actually desirable. Increasing the transparency of digital services is in the interest of users as well as intermediaries. It is apparent, however, that the European Parliament is proposing a very different approach to digital advertising.

Indeed, if the amendment proposals submitted by IMCO and LIBE are included in the final text of the DSA, they will have a huge impact on the functioning of the online environment. It seems that the rapporteurs did not strike the right balance between protecting users’ privacy and preserving the functionality and economic dimension of the Internet. It is primarily users who will suffer from radical changes to digital advertising, because the model of free use of the Internet will be threatened. European companies that sell their content and services online, and creators who distribute their works in this manner, will suffer as well. Significantly, for small and medium-sized enterprises, targeted advertising offers an opportunity to rationalise promotional costs. The consumer’s perspective is also important: in practice, the absence of profiling means more irrelevant advertising, i.e. spam.

Bearing in mind the above-mentioned arguments, we believe that the EU legislator should take a broader look at the issue of digital advertising, taking due account of the interests of all parties involved.

  5. Conclusions

The Digital Services Act will introduce multiple far-reaching changes to the way all digital market actors operate: providers of online services, the enterprises that use these services, and users. The DSA therefore represents an opportunity to improve the conditions of competition and strengthen the market position of European SMEs. Having analysed the selected changes, we are compelled to conclude that the DSA in its current form does not fulfil this potential. Moreover, many of the proposed changes may even be counterproductive to the goal of improving the competitive situation on the market, leading to the deterioration of the position of smaller entities and the creation of new barriers to entry and expansion. We see several sources of these threats.

First of all, the DSA reproduces the conditional exemption mechanism established in the e-Commerce Directive, and with it all the uncertainties that have built up over the 20 years this directive has been in force. In the compromise text presented by the Slovenian Presidency, we find several amendments that could make the current situation worse. The proposed monitoring of comments is technically impossible and would in any case lead to excessive deletion of content and a restriction of fundamental rights. Moreover, the flawed definition of marketplaces could lead to an unjustified extension of obligations specific to sales platforms to providers of other online intermediary services. Such a definition would be particularly severe in light of the stricter rules that the European Parliament is attempting to introduce for marketplaces. Introducing such stricter rules is undesirable, because it differentiates between service providers on the basis of their business model and departs from the principle of asymmetric regulation adopted throughout the DSA.

We do find minor improvements in the DSA, such as the clarification that own-initiative investigations and acting in good faith with due diligence do not result in the loss of the safe harbour. Relative to all the uncertainties that exist, however, such a change only marginally increases legal certainty. It is also important to remember that the DSA is simultaneously building new regulatory layers, which will lead to new interpretive uncertainties and disputes in the future. We therefore believe that in this respect the DSA does not live up to its potential, and the EU legislator should put more effort into increasing legal certainty for enterprises active in the digital world.

Secondly, the DSA imposes extremely extensive, yet vague, obligations on digital enterprises to report and remove illegal content. Meeting such requirements will demand significant resources from ISPs. As large online platforms have both the infrastructure and the capital to adapt to the new requirements, the DSA may paradoxically strengthen very large online platforms and create new barriers to entry and expansion for smaller companies.

Thirdly and finally, the radical changes proposed by the European Parliament in the context of digital advertising seem to ignore the interests not only of European entrepreneurs, who largely base their business models on digital advertising, but also of users, who thanks to digital advertising can access content for free. If the changes to the DSA are adopted in the form proposed by IMCO or LIBE, we can expect significant losses for European entrepreneurs as well as for users themselves.

 

***

 

[1] Daniil Petrovich Frolov and Anna Victorovna Lavrentyeva, Regulatory Policy for Digital Economy: Holistic Institutional Framework.

[2] Explanatory Memorandum for the draft DSA.

[3] European Commission, Hosting intermediary services and illegal content online – An analysis of the scope of article 14 ECD in light of developments in the online service landscape.

[4] European Commission, Hosting intermediary services and illegal content online – An analysis of the scope of article 14 ECD in light of developments in the online service landscape, page 28.

[5] Ibidem; Sartor, Providers Liability: From the eCommerce Directive to the future; de Streel, Larouche, An Integrated Regulatory Framework for Digital Networks and Services; de Streel, Defreyne, Jacquemin, Ledger, Michel, Innesti, Goubert, Ustowski, Online Platforms’ Moderation of Illegal Content Online; Schulte-Nölke, Ruffer, Nobrega, Wiewiórowska-Domagalska, The legal framework for e-commerce in the Internal Market. State of play, remaining obstacles to the free movement of digital services and ways to improve the current situation.

[6] European Commission, Hosting intermediary services and illegal content online – An analysis of the scope of article 14 ECD in light of developments in the online service landscape.

[7] Ibidem.

[8] It is defined in Directive 98/34/EC of the European Parliament and of the Council of 22nd June 1998 laying down a procedure for the provision of information in the field of technical standards and regulations and of rules on Information Society services, and in Directive 98/84/EC of the European Parliament and of the Council of 20th November 1998 on the legal protection of services based on, or consisting of, conditional access.

[9] Judgment of 23rd March 2010, Google France, C-236/08.

[10] Judgment of 12th July 2011, L’Oréal, C-324/09.

[11] Judgment of 16th February 2012, Netlog, C-360/10.

[12] Judgment of 24th November 2011, Scarlet Extended, C-70/10.

[13] Judgment of 15th September 2016, McFadden, C-484/14.

[14] Judgment of 20th December 2017, Uber Systems Spain SL, C-434/15; Judgment of 10th April 2018, Uber France SAS, C-320/16.

[15] Ibidem.

[16] European Commission, Hosting intermediary services and illegal content online – An analysis of the scope of article 14 ECD in light of developments in the online service landscape.

[17] Judgment of 11th September 2014, Papasavvas, C-291/13.

[18] Ibidem.

[19] Judgment of 16th February 2012, Netlog, C-360/10.

[20] European Parliament, Liability of online platforms.

[21] Judgment of 8th September 2016, GS Media, C-160/15.

[22] Ibidem.

[23] European Parliament, Liability of online platforms.

[24] Ibidem.

[25] Edwards, “Downloading Torts: An English Introduction to On-Line Torts”, in Snijders and Weatherill (eds).

[26] European Parliament, Liability of online platforms.

[27] Joan Barata, The Digital Services Act and the Reproduction of Old Confusions.

[28] Ibidem.

[29] Ibidem.

[30] Ibidem.

[31] TKP, Procedura notice & take action w projekcie Aktu o usługach cyfrowych (Digital Services Act) (Notice and take action procedure in the draft Digital Services Act).

[32] Ibidem.

[33] Art. 14 sec. 2 lists the following elements: (a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content; (b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content; (c) the name and an electronic mail address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU; (d) a statement confirming the good faith belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete.

[34] Art. 14 sec. 4: “Where the notice contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall promptly send a confirmation of receipt of the notice to that individual or entity.”

[35] Art. 14 sec. 5: “The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision.”

[36] Ibidem.

[37] Art. 15 sec. 2 DSA.

 
