Joint Association Letter on Artificial Intelligence Act

Warsaw, 27th December 2023

We, the undersigned organizations, appreciate the opportunity to share our perspective on the upcoming crucial negotiations during the fifth round of trilogues on the Artificial Intelligence Act (AIA). We would also like to express our gratitude to EU policymakers for their work on the AIA – our community supports its overarching goals of increasing Europeans’ trust in artificial intelligence and advancing innovation within EU Member States. However, as the negotiations near their end, we remain concerned about new proposals that undermine the original risk-based approach by introducing a multi-tiered framework.

We strongly encourage trilogue negotiators to avoid asymmetric regulations and a multi-tier framework for foundation models (FM) and general-purpose AI (GPAI). Asymmetric obligations inappropriately target only selected vendors and certain FM and GPAI models, irrespective of the actual risk or use case of a given AI application. As the Computer & Communications Industry Association (CCIA Europe) points out, arbitrary classification based on size criteria, such as the number of FM and GPAI users and the computing resources used to train them, can effectively stop companies from further expanding and scaling their services.[1] Furthermore, attempts to add new copyright requirements to the AI Act, despite the existing comprehensive EU copyright framework, will only create additional complexity; copyright should not be addressed in the AI Act, which is part of EU product safety legislation. These changes would run counter to the AIA’s original risk-based approach, which was the result of a comprehensive consultation process and struck a fine balance between enabling innovation and protecting users.

We are also concerned about proposals to expand the list of high-risk system use cases in Annex III and the list of prohibitions in Article 5. We strongly encourage negotiators to clarify that the latest Article 6 and Annex III compromise text does not label all profiling systems as high-risk, but only limits the scope of the exemption from the high-risk classification. Without such a clarification, the existing language could result in an excessively wide high-risk classification under Annex III. Associating profiling with a broad negative impact on fundamental rights would also contradict the GDPR. As a result, positive use cases that benefit users – such as increasing accessibility or identifying biased data sets – could be banned.

In brief, while we support the overarching goals of the AIA, the legislation in its current form may hinder innovation and impede the development of artificial intelligence in the EU, eroding Europe’s competitiveness at a time of rising global instability.

***

[1] CCIA Europe letter on AI Asymmetric Regulation – CCIA (ccianet.org)

Signatories:

o Center for Data Innovation
o Computer & Communications Industry Association Europe
o Danish Entrepreneurs
o Digital Poland
o European Enterprise Alliance
o European Union and International Relations of Infobalt
o NL Digital
o SAPIE
o Union of Entrepreneurs and Employers Poland

 

See more: Joint Association Letter on Artificial Intelligence Act

ZPP co-signed industry joint letter on the upcoming revision of the EU liability framework


Dear Commissioner Breton,
Dear Commissioner Reynders,


Our associations represent a broad coalition of startups, SMEs, and technology companies. We are writing to you in the context of the revision of the existing Directive on liability for defective products (PLD) and the proposal for a directive to adapt liability rules to artificial intelligence (AI Liability Directive). We support the underlying objective of ensuring a high level of legal certainty for companies and trust for consumers. We therefore request that the European Commission strive to ensure that the PLD and the AI Liability Directive are balanced and proportionate for all stakeholders, in conjunction with applicable existing and future legislation. As such, we take the liberty of making a few preliminary recommendations:

1. The definition of products should remain fit for purpose

The existing PLD is technology-neutral and already applies to all unsafe products, including those with embedded software. The current PLD is complemented by national tort and contract laws; damages due to defects that occur after a product has been put into circulation are therefore already covered by national legislation. The definition of ‘product’ does not need to be expanded to include intangible products (e.g. digital content and standalone software). Instead, the definition should remain technology-neutral and future-proof. Applying strict liability would be disproportionate and ill-suited to the properties of software: standalone software and software errors cannot physically act upon any person or physical property and would therefore not cause personal injury or property damage; bugs are an inherent feature of software development; and there is no fixed state at which software is “put into circulation”, given that software evolves and improves over time. Moreover, software updates are commonly used to extend the lives of digital products and address software errors. Extending strict liability to software updates could disincentivise software development and maintenance. This would also conflict with EU efforts to encourage sustainability in the circular economy.

2. The scope of damages should not include immaterial damages

Extending the range of damages to non-material damages (e.g. privacy infringements or psychological harm) would significantly increase legal uncertainty relative to other pieces of legislation that already cover non-material damages (e.g. the GDPR). Providing a separate and potentially overlapping basis for compensation would cause confusion, and could eventually lead to forum shopping and double claims for a single harm. Applying strict liability would put a disproportionate burden on providers, as non-material damages are less predictable and more complex to quantify than material damages. This could have a chilling effect on innovation, materially increase the price of software for end-users, and hinder the uptake of useful advanced software applications, including AI.

3. No need to reverse the burden of proof for all AI applications and software

We are not aware of evidence showing that the burden of proof under the current PLD places consumers at a disadvantage. Reducing or reversing the burden of proof is a tool that should only be considered for very specific cases, motivated by the profile of harm and taking into account the degree of opacity of a particular product. The regulatory response to new technologies should not generalize from worst-case scenarios or specific situations. A one-size-fits-all rule for all AI applications would impose an excessive burden on AI developers and users, significantly hamper innovation, and affect the rollout and take-up of AI technologies in the EU.

4. Strict liability for online marketplaces is not appropriate

Consistent with recent legislative initiatives such as the Digital Services Act and the proposed General Product Safety Regulation, we encourage policymakers to regulate intermediating online marketplaces only in a way that recognises their nature and does not undermine the operation of particular business models, as this could have a negative impact on innovation and consumer choice in the EU. It is also worth noting that companies which operate an online marketplace as a hybrid business model (e.g. combining manufacturing with intermediating between traders and consumers) already fall within the scope of the current PLD for these activities.

The liability gap concerning third-country manufacturers identified in the consultation process can be filled more appropriately through other means, e.g. by placing liability on the authorised representative of a manufacturer or on the person responsible for products placed on the European market.

As the European Commission drafts its proposals to review this framework, we would like to emphasise our availability and willingness to work towards a workable and balanced approach. We thank you for your consideration and remain at your disposal to provide additional information.

Signatories (in alphabetical order):

ACT – The App Association
Computer & Communications Industry Association (CCIA Europe)
Confederation of Industry of the Czech Republic
Developers Alliance
DOT Europe
Infobalt
Information Technology Industry Council (ITI)
SAPIE (Slovak Alliance for Innovation Economy)
Związek Przedsiębiorców i Pracodawców / Union of Entrepreneurs and Employers (ZPP)

 

See: Joint Industry Letter on the PLD and AI Directive

Joint Letter on Digital Services Act

We are writing to you on behalf of some of the most innovative tech startups and scaleups who are leading the charge to make Europe one of the leading digital economies in the world.

We have been following the negotiations on the Digital Services Act (DSA) closely and are pleased to see the progress made so far on legislation that will define Europe’s digital economy for decades to come. However, we remain concerned that some of the proposals under discussion may run counter to Europe’s ambitious and innovative digital goals, and we urge negotiators to consider the following:

  • Enable businesses to continue to use targeted advertising as a means to connect with customers across the single market. Targeted advertising is a crucial tool for businesses looking to grow, innovate and sustain themselves, and moves to heavily restrict it will harm their competitiveness.

  • Ensure that online marketplaces are not weighed down by disproportionate obligations. Online marketplaces are critical for ensuring a vibrant and innovative digital European economy. Requiring online marketplaces to undertake random checks of traders’ goods while carving out identification rules for SME traders may incentivise marketplaces to remove some traders from the platform because of the risk of hosting them. This would hinder further innovation and growth.

  • Ensure user redress systems are not overburdened. The high level of obligations required by the DSA means that intermediary service redress mechanisms risk being overwhelmed. If a user receives a notification every time an action affects visibility, ranking or demotion, the volume of notifications will be unmanageable for users to receive and for intermediary services to process. The most effective user redress mechanisms are those that are quick, user-friendly and efficient. Additionally, policymakers should refrain from expanding redress for users flagging content under a platform’s Terms and Conditions. Platforms receive an enormous number of unfounded and erroneous user flags every day. Allowing expansive redress for these flags would be disproportionate; it could impair user safety by requiring online platforms to divert resources to dealing with groundless appeals.

The DSA is a significant change in Europe’s digital rules and will reverberate for many years to come. As we enter the final stretch of the negotiations, we are confident that policymakers will find the right balance to achieve an ambitious and innovative digital future for all.

Signatories:
– Digital Future for Europe
– Developers Alliance
– ZPP Poland
– Confederation of Industry of the Czech Republic
– Finnish Federation for Communications and Teleinformatics
– Digital Poland

www.digitalfutureforeurope.com

 

See more: Joint Letter on Digital Services Act

CDA joint letter on EDPB guidance and its legal consistency with the DSA

Brussels, 28 April 2022

 

European Data Protection Board

As the Coalition for Digital Ads (CDA) of SMEs, we appreciate the European Data Protection Board’s efforts to bring greater clarity and awareness to how social media platform interfaces are designed. We believe that manipulative practices which do not respect the GDPR, and which hinder the ability of users to effectively protect their personal data and make conscious choices, should be minimised. All of these goals should be achieved while avoiding legal uncertainty and mixed signals.

What is paramount is that SMEs get clarity on which practices in targeted advertising are considered “dark patterns”. While the EDPB guidance is helpful for identifying and avoiding dark patterns in social media platform interfaces, it still provides neither a precise definition nor an exhaustive list.

While the EDPB’s guidelines are intended for GDPR compliance, the Digital Services Act (DSA) also addresses dark patterns, so any additional guidelines or regulations will have to be aligned with the DSA wording for clarity. Similarly, the Unfair Commercial Practices Directive (UCPD), which regulates dark patterns for consumer protection, must be considered. It will therefore be important to ensure the text is precise in its description of dark patterns; ideally, it should specify that the term refers to manipulative design choices that materially distort the behaviour of an average user. However, this should not lead to an outright ban on advertising practices, which may be justified in some circumstances.

The EDPB guidance similarly needs to ensure consistency with the DSA text, and the relationship between the two should be clearly outlined. With the proposed guidelines, there is a risk of creating even more incompatibility between the various European regulations on dark patterns and further complications regarding compliance and enforcement. The DSA’s definition of “compliance by design” for online marketplaces might also interfere with the outcomes of the proposed guidelines.

Almost all small businesses in Europe depend on digital channels to find new audiences, market to them and convert them into customers. European economic integration depends on the ability of SMEs to expand, grow and ultimately reach consumers throughout Europe. However, unlike large corporations, SMEs do not have the resources for large-scale marketing campaigns reliant on organic tools. What SMEs need is legal coherence, clarity and certainty, so that they know which practices they must avoid and which they may use. Guidance must be clear on the outlined issues, as legal expertise is costly for SMEs navigating the wealth of different regulations addressing dark patterns.

We hope that the voice of SMEs will be reflected in any upcoming digital communications regulations and guidelines. We remain open for further engagement in the process.

 

Co-signatories

 

The Coalition for Digital Ads (CDA) of SMEs supports thousands of SMEs that power Europe’s economy. Established by members of SME Connect in November 2021, CDA gives a voice to concerns over the EU draft proposals to initiate restrictions on personalised digital advertising across the EU and the impact a ban could have on SMEs, thus providing a necessary balance in an important debate. More about CDA: https://www.smeconnect.eu/cda/
