22/2/2021

Digital Services Act: the new proposal for a Regulation for platforms

Silvia Martinelli
Strategic Research Manager

Taken from “The Legal Newspaper”, the legal information newspaper of the Wolters Kluwer Italia group, edited by Cedam, Utet Giuridica, Leggi d'Italia and Ipsoa.

On 15 December 2020, the European Commission presented a new proposal for a regulation on digital services and platform liability, called the “Digital Services Act”. The proposed Regulation aims to redefine the rules applicable to online platforms, amending Directive 2000/31/EC and introducing new provisions on transparency, reporting obligations and accountability for content moderation.

European Commission, Proposal for a Regulation on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC, COM(2020) 825 final, 15 December 2020.


The proposal is part of the “Digital Services Act package”, the European Commission's new set of proposals for the Digital Single Market. It amends Directive 2000/31/EC, the so-called e-Commerce Directive, which among other things governs provider liability, an area marked by numerous conflicting judicial and doctrinal positions, and which is now being updated to meet the needs that have emerged since 2000 with the multiplication of online services and the redefinition of the market.

Added to the traditional notions of provider is that of “online platform”, defined as a provider of a hosting service that, at the request of a recipient of the service, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service that, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not merely a means of circumventing the applicability of the Regulation.

The proposal maintains the original structure of Directive 2000/31/EC, i.e. the rule according to which the platform is not required to monitor the content posted by its users, but introduces new rules on transparency, reporting obligations and accountability, largely codifying the guidance and case law that have emerged over the years, now proposed as a European regulation for greater harmonization and legal certainty.

In particular, the proposed Regulation establishes:

(a) a framework for the conditional exemption of intermediary service providers from liability;

(b) due diligence obligations for certain specific categories of intermediary service providers;

(c) new rules for implementation, enforcement, cooperation and coordination between Member States on digital services.

Confining this contribution to the aspects concerning platform liability and the changes the Regulation would introduce, the two fundamental rules on hosting provider liability already contained in Directive 2000/31/EC remain in place.

The provider is not liable for information stored at the request of a recipient of the service, provided that the provider:

(a) does not have actual knowledge of illegal activity or illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or illegal content is apparent;

(b) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content.

The provider must, however, act to remove illegal content or to provide information when so ordered by an administrative or judicial authority.

However, it is specified that the exemption does not apply if the recipient of the service acts under the authority or control of the provider, nor to liability under consumer protection law for online platforms that allow consumers to conclude distance contracts with traders, where the platform presents the specific item of information, or otherwise enables the specific transaction at issue, in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided by the online platform itself or by a recipient of the service acting under its authority or control.

Furthermore, although there is no general monitoring obligation, nor any obligation to actively seek facts or circumstances indicating illegal activity, providers may carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing or disabling access to illegal content, without thereby losing the benefit of the liability exemptions, as already stated in earlier EU guidance.

The original system is therefore confirmed, as it has been interpreted over the years, but with the exceptions specified in greater detail, and is accompanied by further provisions that introduce:

— a detailed regulation on transparency and reporting obligations for all intermediary service providers;

— some additional provisions applicable only to hosting services, concerning notice-and-take-down mechanisms;

— some provisions applicable only to “online platforms” that are not SMEs;

— provisions applicable only to 'very large platforms'.

The first group of provisions includes the obligation to establish a single electronic point of contact for dialogue with the authorities of the Member States; the obligation, for providers established outside the EU that offer services to citizens of the Union, to appoint a legal representative in the territory of the Union; transparency obligations concerning restrictions on the use of the service, content moderation and algorithmic decision-making; and reporting obligations with respect to content moderation, to requests received from Member State authorities, and to the complaint mechanisms that certain providers are required to put in place.

As for the additional measures for hosting services, a detailed and uniform set of rules has been introduced for notice-and-take-down mechanisms, which must be made available so that users can report illegal content. The notice and decision procedure is regulated and requires, in particular, a statement of reasons for the removal decision, with an indication of the information that statement must contain.

For online platforms, it is mandatory to introduce an internal complaint-handling mechanism for platform decisions concerning the removal of content, the suspension or termination of the service, or the suspension or removal of the user's account. The decision reached through the internal mechanism may then be challenged further, both through traditional judicial means and through “out-of-court dispute settlement” before an impartial and independent body certified by the Member State's Digital Services Coordinator.

In addition, measures have been introduced concerning “trusted flaggers” for the reporting of content; the suspension of accounts that frequently publish illegal content; the notifications to the authorities that platforms are required to make; the traceability of users who offer goods or services on the platform; and online advertising.

Finally, for “very large platforms” only, a risk assessment obligation has been introduced with respect to the systemic risks deriving from the operation of the service, concerning, in particular, content moderation, the protection of fundamental rights and the manipulation of the service. The platform will therefore be required to implement reasonable, proportionate and effective measures to mitigate the risks identified, subject to an audit obligation. There are also additional obligations regarding recommender systems, online advertising and data access, and the figure of the compliance officer is introduced to monitor these aspects.

The Commission will also support and promote the development and implementation of standards, codes of conduct and crisis protocols.
