
AI Projects & Digital Services Act: What CIOs Need to Keep in Mind

Author: Raphaël Peschi

With the upcoming Digital Services Act (DSA), online platforms offering AI-driven services to end-users will have to be transparent about their algorithms. Does that mean companies will be forced to make their AI-related intellectual property publicly available? No: the goal of the DSA is neither to discourage companies from innovating (quite the contrary!) nor to forbid the use of AI.

Transparency is, however, becoming a key focus. Companies using AI-driven recommender systems and targeted advertising will have to adapt to the new rules. In this article, we will help you understand the context and the impact on your company.

What is the DSA about?

The DSA is a proposed European regulation that updates the e-Commerce Directive of 2000. It defines a new set of rules governing digital services in the EU, built around the core principle that "what is illegal offline is illegal online". Together with the Digital Markets Act (DMA), the DSA aims to provide a safer online environment and to harmonize diverging country-specific rules.

More specifically, the DSA focuses on:

  1. The traceability of content, so that illegal content can be identified and removed
  2. The prevention of misinformation and user manipulation, by increasing the transparency of recommender systems - i.e. algorithms that automatically determine the content to be displayed to a particular user based on the characteristics of that user.

On top of that, the DMA defines additional rules to prevent the big players from using unfair competition techniques.

Which companies are impacted?

The DSA and the DMA will apply to all companies that act as providers of online services in the EU (whether or not they are based in the EU), including intermediary services such as hosting platforms - i.e. the providers of the servers on which websites are stored, or "hosted" (for example, BlueHost and SiteGround). While the companies owning those servers do not manage the hosted content, the DSA still foresees obligations for them.

The DSA sets out different obligations for different actors, proportionate to the nature of their activity and their size. A hosting platform or a telecommunications company will, for example, have less regulatory burden than a global social network like Facebook, or a global online marketplace like Amazon.

How should I change or develop my AI models to comply with the DSA?

Companies using or developing recommender systems will have to adapt the way they design these algorithms and the way they present the results to the end-users.

This means that the end-users should:

  • Be informed whenever the content that they are seeing is the result of some sort of AI algorithm
  • Be informed about the type of attributes that are being used (e.g. past activity on the website or on other websites, geographic information, etc.)
  • Have the possibility to opt out of such personalized recommendations, meaning that for every system relying on AI-based personalization, an alternative option that does not use profiling should be offered to the user (a minimal sketch of such a fallback is shown right after this list). Note: as in other EU legislation, profiling here means any situation where a digital representation of an individual is built by collecting information about them specifically.
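
To make the opt-out requirement more tangible, here is a minimal, hypothetical sketch (in Python) of how a service could dispatch between a personalized recommender and a non-profiling fallback. The function and field names (`recommend_personalized`, `recommend_most_recent`, `uses_profiling`, etc.) are illustrative assumptions, not something prescribed by the DSA; the point is simply that every profiling-based recommendation path has a non-profiling alternative, and that the response states which one was used.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Item:
    item_id: str
    published_at: datetime
    popularity: int


@dataclass
class RecommendationResponse:
    items: list            # the content shown to the user
    uses_profiling: bool   # transparency: was the list personalized?
    main_parameters: list  # plain-language description of the attributes used


def recommend_personalized(user_id: str, catalog: list) -> list:
    """Placeholder for an AI-based, profiling recommender."""
    # In a real system this would call a model trained on the user's past activity.
    return sorted(catalog, key=lambda item: item.popularity, reverse=True)


def recommend_most_recent(catalog: list) -> list:
    """Non-profiling fallback: simply show the latest content."""
    return sorted(catalog, key=lambda item: item.published_at, reverse=True)


def get_recommendations(user_id: str, catalog: list, personalization_opt_in: bool) -> RecommendationResponse:
    """Serve personalized results only if the user has not opted out."""
    if personalization_opt_in:
        return RecommendationResponse(
            items=recommend_personalized(user_id, catalog),
            uses_profiling=True,
            main_parameters=["past activity on this website", "geographic information"],
        )
    return RecommendationResponse(
        items=recommend_most_recent(catalog),
        uses_profiling=False,
        main_parameters=["publication date only"],
    )
```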

Two examples below will make this more concrete.

Concrete examples

Example 1: Candidate-job matching platform

Platforms linking job posts with candidates through advanced matching algorithms have become increasingly popular in recent years, allowing them to make relevant recommendations to both candidates and employers. Since the benefit of personalized recommendations is particularly clear in this case, we can expect many users to happily continue using the AI-based recommendations. However, users should be informed about it and should be offered the possibility to browse through the job posts based on non-personalized criteria. The idea is to give users more control over which job postings they see. This can be achieved using, for example, deterministic drop-down filters that behave in a way that is predictable for the users.
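
As an illustration, here is a hypothetical sketch of such a deterministic, non-profiling alternative: job posts are narrowed down purely by the explicit filters the candidate selects (location, contract type, keyword), so the same filters always yield the same results. The data model and field names are assumptions made for the example, not requirements from the DSA.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class JobPost:
    title: str
    location: str
    contract_type: str  # e.g. "full-time", "part-time", "freelance"


def filter_job_posts(
    posts: list,
    location: Optional[str] = None,
    contract_type: Optional[str] = None,
    keyword: Optional[str] = None,
) -> list:
    """Deterministic filtering: only the user's explicit choices decide what is shown."""
    results = posts
    if location:
        results = [p for p in results if p.location == location]
    if contract_type:
        results = [p for p in results if p.contract_type == contract_type]
    if keyword:
        results = [p for p in results if keyword.lower() in p.title.lower()]
    # Sort alphabetically so the ordering is also independent of any user profile.
    return sorted(results, key=lambda p: p.title)


# Example usage: the same filters always return the same, predictable list.
posts = [
    JobPost("Data Scientist", "Brussels", "full-time"),
    JobPost("ML Engineer", "Ghent", "full-time"),
    JobPost("Data Analyst", "Brussels", "part-time"),
]
print(filter_job_posts(posts, location="Brussels", keyword="data"))
```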

Example 2: Social media platforms & user experience

On social media, the content that you see is often based on your previous activity (e.g. posts that you liked, shared, etc.) and is thus based on a form of profiling of the user. An alternative system that does not rely on profiling could simply be based on the latest available content, or the most popular content. In addition, online platforms will have to inform their users when content is sponsored and mention on whose behalf the advertisement is displayed. Users should also be informed about the main parameters used to determine the ads displayed to them.
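
One hypothetical way to surface this information is to attach transparency metadata to every displayed item, so that the front end can render a "sponsored on behalf of X" label together with a short explanation of the main parameters. The field names below are illustrative assumptions, not taken from the text of the regulation.

```python
from dataclasses import dataclass, field


@dataclass
class DisplayedItem:
    content_id: str
    is_sponsored: bool = False
    sponsor: str = ""                              # on whose behalf the ad is displayed
    main_parameters: list = field(default_factory=list)

    def transparency_label(self) -> str:
        """Plain-language label the UI can show next to the item."""
        if self.is_sponsored:
            return (
                f"Sponsored content on behalf of {self.sponsor}. "
                f"Shown to you based on: {', '.join(self.main_parameters)}."
            )
        if self.main_parameters:
            return f"Recommended based on: {', '.join(self.main_parameters)}."
        return "Shown in chronological order (no profiling used)."


# Example usage
ad = DisplayedItem(
    content_id="ad-42",
    is_sponsored=True,
    sponsor="Acme Travel",
    main_parameters=["your approximate location", "pages you visited on this platform"],
)
print(ad.transparency_label())
```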

The DSA will thus have a direct impact on the customer experience of users of online platforms (including social media networks, but also online marketplaces, app stores, online travel and accommodation websites, content-sharing websites, and collaborative economy platforms). UX/UI designers will have their part to play in ensuring compliance with the new legal requirements while still maintaining a smooth user experience. We expect to see new and interesting ways of informing users about how content is presented to them and of allowing them to exercise their rights and freedoms. Luckily, the DSA does not force platforms to proactively ask users for their consent. We can, therefore, breathe easily, knowing that we won't be bombarded by a new myriad of pop-ups and consent banners polluting websites, as happened after the introduction of the GDPR and the ePrivacy Directive.

By when will I need to comply with the DSA?

In short, both the DSA and the DMA are expected to become applicable at the beginning of 2024.

Currently, the DSA and the DMA are still at the proposal stage, but the European Parliament and the Council reached a "provisional agreement" in April 2022, meaning that the final approval should be no more than a formality. That final approval is expected to take place at the end of October 2022. The DSA will only become applicable fifteen months after its entry into force - in January or February 2024 at the earliest.

What’s the risk?

The fines that companies can face for failing to comply with the regulation can reach up to 6% of the annual revenue of the infringing company. This is comparable to other EU regulations such as the GDPR, which foresees fines of up to 4% of annual worldwide turnover.

As a CIO/CTO, what should be my next steps?

Due to the lack of case law at this stage, it is difficult to know for sure how the DSA will be interpreted and applied in practice. It is, however, very clear that transparency is a key requirement of the DSA. Whenever a form of AI is used to make recommendations, the user must be informed and be given the option to opt out of the automatic filtering of content.

As a leader in the technology and innovation field, you should:

  • Start by identifying which of your tools and services rely on (or offer) personalized recommendations (including advertisements). 
  • For those, you need to develop an alternative version that is not based on profiling but still provides a smooth user experience. 
  • At the same time, you should work on the explainability of your algorithms in order to provide basic, high-level explanations of the way the recommendations are made (see the sketch after this list). 
  • If you are not developing everything in-house, that also means working with serious and reliable partners who, in turn, can guarantee the traceability of the data and the interpretability of the results of the algorithms. 
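
As a starting point for that explainability work, here is a hypothetical sketch of a simple mapping from the technical features a recommender uses to the plain-language attribute categories that can be communicated to end-users. The feature names and categories are invented for the example; the actual mapping depends on your own models and data.

```python
# Hypothetical mapping from internal model features to the plain-language
# attribute categories that can be communicated to end-users.
FEATURE_CATEGORIES = {
    "clicks_last_30d": "your past activity on this website",
    "liked_categories": "your past activity on this website",
    "ip_country": "geographic information",
    "device_type": "technical information about your device",
}


def explain_recommendation(features_used: list) -> str:
    """Return a short, high-level explanation of the main parameters used."""
    categories = sorted({FEATURE_CATEGORIES.get(f, "other information") for f in features_used})
    return "This recommendation is based on: " + ", ".join(categories) + "."


# Example usage
print(explain_recommendation(["clicks_last_30d", "ip_country"]))
# -> "This recommendation is based on: geographic information, your past activity on this website."
```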

Keep in mind that every new regulation that brings constraints can also be converted into an opportunity, for example, to become more transparent and trustworthy to your end-users.

Interested in discussing your next steps toward future DSA compliance?

Book a chat with our Account Executive below.

 

 

About The Author

Raphaël Peschi

Raphaël is a Team Lead at Radix. Interested in mathematics from a young age, he holds a Master's degree in Applied Mathematics from the University of Louvain-la-Neuve. After two professional experiences in an international environment, he joined Radix, where he leverages Machine Learning to address clients' challenges and shares his passion for new knowledge with the team. Raphaël is also a Teaching Assistant at the University of Brussels (Solvay).
