April 9, 2024

Regulatory soup of data

Data and the data market have received significant regulatory attention in recent years in the European Union, as the EU seems intent on increasing European data competitiveness through regulatory action. The goal of this article is to give a short summary overview of the “big 5” and an even shorter overview of other data-related legislative acts that can affect your jobs as data governance, data protection and privacy professionals. All this from the point of view of a practicing Data Protection Officer (DPO) with an information management/data governance background rather than a legal background.

Let’s start by clarifying what an “EU act” is compared to an “EU directive” and an “EU regulation”. An “act” is also a “regulation”, meaning that an EU act applies directly, like a regulation would, without needing member state implementing laws. But as with any EU legal acts, there are exceptions and derogations, and the member states need knowledgeable people in the public sector to guide the application of all these legislative acts. This also applies to the aforementioned “big 5”. The lawmakers did not seem to coordinate between the different working groups to make sure that the data terms used to describe concepts in the law are the same across all these different data management acts, which sets new challenges for practitioners in the field, who now have to implement these contradicting requirements. With various stakeholders interpreting the new rules differently, will it be possible to create an elegant and clear data governance system, or will we end up with more of a tangled web? Time will tell.

The “big 5” are:

–          Data Act (DA)

–          Data Governance Act (DGA)

–          Digital Services Act (DSA)

–          Digital Markets Act (DMA)

–          AI Act (AIA)

Missing in action, but discussed briefly when I talk about the DSA and DMA, is the ePrivacy Regulation, the best friend of the GDPR and the culprit behind cookie walls. It appears to have disappeared into a regulatory wormhole. The result is a deafening silence around ePrivacy and the regulation of tracking, and therefore also of digital marketing, as contemporary web marketing is based on tracking you rather than asking what you would prefer to hear about. But more on that later in the article.

Name: Data Act (DA)

Status: adopted in November 2023, entered into force in January 2024

Applies to: manufacturers of connected devices (the Internet of Things, or IoT) and related services, and data holders. IoT means, for example, smart cars, wearables, connected medical devices, etc. The DA focuses on “industrial” and non-personal data.

Significance: The DA establishes a basic cross-sectoral framework for data sharing across the IoT within the EU and gives users of smart devices (including, for example, companies using smart cars) access to the data that these devices collect about them. The DA makes another attempt to bring the concept of “data portability” into EU law, as manufacturers and designers of smart devices have an obligation to share data not only with the users themselves but also with third parties designated by users.

Issues: There are many potential implementation challenges. Are the GDPR’s controller, processor and sub-processor the same as the Data Act’s data holder, data recipient and user, or different? If different, then how? Where is this defined?

Data transfer in the DA is defined rather vaguely. Unlike in the GDPR, there are no derogations in the DA to the data transfer restrictions, except if the receiving country has a ‘conflict of law’ with EU or member state law. What this prohibitive ‘conflict of law’ means is not clear in the DA. This means there could be a greater impediment to companies’ ability to transfer non-personal data than there is to transferring personal data. Most companies hold both non-personal and personal data and, for the sake of simplicity, just apply the GDPR to their data transfer schemes. Would the DA mean creating a duplicate data transfer system within a company? How does the GDPR transfer regime interact with the Data Act transfer regime?

Another issue is that the DA says “access to any data stored in and accessed from terminal equipment is subject to ‘ePrivacy directive’ and requires the consent of the subscriber or user within the meaning of that Directive unless it is strictly necessary for the provision of an information society service explicitly requested by the user or subscriber (or for the sole purpose of the transmission of a communication)”. Most IoT services could be made to fit the purpose of being “explicitly requested by the user”, but those that do not fit that purpose will have to ask for consent. Imagine consent walls on smart devices? Rather not, right?

Name: Data Governance Act (DGA)

Status: entered into force in June 2022

Applies to: public sector bodies including government-financed and -managed organisations and entities that exist to serve the needs of the public.

Significance: aims to establish a regulatory framework for the management, exchange and use of data within the EU, and gives guidelines for the re-use of data held by public bodies within the EU. Data that is subject to legal restrictions, such as personal data, must be re-used within those restrictions. The Act also aims to create a framework for the provision of data intermediation (brokerage) services.

Issues: The DGA’s definition of who is a data intermediary is vague, meaning some organisations that would fall under the DGA may not see themselves as intermediaries. As with the Directive on open data and the re-use of public sector information (‘Open Data Directive’, EU 2019/1024), there is a potential conflict with the GDPR, as the public sector in all member states holds a lot of personal data in its databases. Public data and open data are two different concepts: public data is data in the public domain, while open data can be used for any data analytical purpose under a Creative Commons licence. In a number of member states, public data has been equated with open data when implementing the Open Data Directive into member state law, resulting in personal data in public databases being processed out of compliance with the GDPR. The DGA also fails to define exactly what is and what isn’t open data. So, I feel for DPOs in the public sector who have to make these conflicting laws work on a daily basis.

Lawmakers also introduced a nice, but rather esoteric, principle of “data altruism”. Like the “data portability” requirement in the GDPR, where the lawmakers meant well but the result has not been what was intended, it is likely to remain just a concept. An ongoing problem with many public sector datasets is their rather poor data quality for analytical purposes, and neither the DGA nor the Open Data Directive regulates data standards or quality principles. How are the quality and integrity of altruistically shared data ensured? A large quantity of poor-quality data is not necessarily valuable or useful.

Name: Digital Services Act (DSA)

Status: entered into force in November 2022

Applies to: online intermediaries and platforms such as marketplaces, social networks, content-sharing platforms, app stores and online travel and accommodation platforms.

Significance: The DSA’s main goal is to prevent illegal and harmful activities online and the spread of disinformation by creating new procedures for faster removal of illegal content and implementing additional transparency measures, including on online advertising and on the algorithms used to recommend content to users. The DSA and the Digital Markets Act are supposed to work together to rein in the monopolies and address transparency around tracking, or as the DMA puts it, consent.

Issues: One of the goals of the DSA was to clarify the rules around digital advertising, but when the DSA does talk about advertising, it only does so in connection with ‘very large online search engines’ and ‘very large online platforms’. The DSA doesn’t talk about tracking or tracking technologies, or an individual’s right not to be tracked, as, for example, the California Consumer Privacy Act does. I suppose this was something that was planned for the ePrivacy Regulation. As I mentioned before, ePrivacy is like a family secret: everyone knows it is there, but no one ever talks about it.

Compared to shiny new regulations on fashionable subjects like AI, ePrivacy, with its conflicts and its ambition to change the whole digital marketing business model, is not an easy subject. Yet digital marketing and advertising affect every business in Europe. Neither the DSA nor the ePrivacy Directive addresses the tracking issue or makes end user consent effective and transparent. How can consent be effective when people do not understand what they are consenting to? Perhaps consent, in the form of a user preference, could be moved to the browser, so people can set their privacy and tracking preferences there. I should be able to set my preference on how I am advertised to. Tracking-based digital advertising is, on the one hand, intensely personal: you are tracked across your entire digital space. On the other hand, it is very impersonal, because no one ever thinks of asking what you like instead of tracking you.
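The browser-preference idea above is not pure fantasy: browsers supporting the Global Privacy Control (GPC) specification already send a simple “Sec-GPC: 1” request header expressing the user’s opt-out, and the older Do Not Track signal used “DNT: 1”. A minimal sketch (my own illustration, not something the DSA or ePrivacy mandates; the header names are real, the function is hypothetical) of a server honouring such a signal:

```python
# Illustrative only: honour a browser-level privacy preference.
# "Sec-GPC: 1" (Global Privacy Control) and the legacy "DNT: 1"
# are real header values; everything else here is a hypothetical sketch.

def tracking_allowed(headers: dict) -> bool:
    """Return False when the browser signals an opt-out from tracking.

    Assumes `headers` is a dict of HTTP request headers with lower-cased keys.
    """
    if headers.get("sec-gpc") == "1":   # user opted out of sale/sharing of data
        return False
    if headers.get("dnt") == "1":       # legacy Do Not Track opt-out
        return False
    return True

tracking_allowed({"sec-gpc": "1"})  # -> False: respect the stated preference
tracking_allowed({})                # -> True: no signal was sent
```

The point is how little machinery a browser-side preference needs compared to a consent wall on every site; the hard part is legal recognition of the signal, not the engineering.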

Name: Digital Markets Act (DMA)

Status: in force since November 2022

Applies to: gatekeepers, i.e. large digital platforms providing so-called core platform services, such as online search engines, app stores and messenger services. At the moment there are six gatekeepers (Alphabet, Amazon, Apple, ByteDance, Meta and Microsoft), and in total 22 core platform services provided by gatekeepers have been designated. The six gatekeepers had until the end of March 2024 to ensure full compliance with the obligations and prohibitions listed in the DMA.

Some of the gatekeepers disputed their designation, and after deliberation the European Commission concluded that, although Gmail, Outlook.com and Samsung Internet Browser meet the thresholds under the DMA, Alphabet, Microsoft and Samsung provided sufficiently justified arguments showing that these services do not qualify as gateways for the respective core platform services. Therefore, the Commission decided not to designate Gmail, Outlook.com and Samsung Internet Browser as core platform services. It follows that Samsung is not designated as a gatekeeper with respect to any core platform service.

Significance: As the European Commission press release stated, the “DMA together with the DSA is intended to address the negative consequences arising from certain behaviours by online platforms acting as digital gatekeepers to the EU single market”. In other words, to try to regulate market abuse and the misuse of individuals’ data by very large platforms, search engines and intermediaries.

Issues: The first issue is: how do you define who is a gatekeeper? One of the challenges for the EU is determining a future-proof and commonly agreed methodology, and the quantitative thresholds, for qualifying gatekeepers. The disputes over gatekeeper designations started right after the DMA came into force.

Another big issue is that there are major compatibility and consistency issues between the DMA, the GDPR and the ePrivacy Directive. Also, when it comes to digital marketing, the DMA and DSA pay little heed to the actual technologies used by big market players in digital advertising. When talking about lawful ways of advertising, the lawmakers keep referring to the ePrivacy Directive (EU 2002/58) for when consent is needed and to the GDPR for how it should be asked. Meanwhile, in the 21 years since the ePrivacy Directive was adopted (it was last updated 13 years ago, and the proposed ePrivacy Regulation has lingered in the EU trilogue for 3 years), the world has moved on, with tracking technologies going way beyond cookies. An individual can be tracked by browser fingerprinting, behavioural profiling, ETags, tracking pixels, etc., which begs the question: how effective a tool is consent in managing tracking across these technologies, when the majority of people still struggle to understand what a cookie is? Transparency and accountability are important to individuals and are main principles of the GDPR, so how do the DMA, the DSA and the ePrivacy Directive help companies make the use of trackers more transparent and optional for individuals? I don’t have an answer.
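To see why cookie-centred consent misses the newer techniques, consider ETag tracking, one of the methods listed above. This purely illustrative Python sketch (all names hypothetical) shows the mechanism: the server issues a unique cache validator, and the browser’s own caching behaviour echoes it back on the next visit, re-identifying the visitor without any cookie being set.

```python
# Simplified illustration of ETag-based re-identification. The browser
# sends back a previously issued ETag in the If-None-Match header when
# revalidating its cache, which links the two visits without cookies.

import uuid

def handle_request(if_none_match, known_tags: set):
    """Return (status, etag); a 304 means the visitor was recognised."""
    if if_none_match and if_none_match in known_tags:
        # The cache echoed an ETag we issued earlier: visitor re-identified.
        return 304, if_none_match
    etag = uuid.uuid4().hex  # per-visitor identifier disguised as a cache tag
    known_tags.add(etag)
    return 200, etag

tags: set = set()
status, tag = handle_request(None, tags)    # first visit: 200, fresh "validator"
status2, tag2 = handle_request(tag, tags)   # revalidation: 304, visitor recognised
```

Nothing here looks like a tracker to a cookie banner, which is precisely the problem with tying consent rules to one storage technology.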

In my personal opinion, it seems that the main goal of the DMA is the dismantling of monopolies, as the DMA asks designated gatekeepers to step aside so that non-gatekeepers’ services can also be chosen by users. Gatekeepers should also not self-preference their own services in search results, cross-use data obtained from their subsidiaries, limit business users’ dealings with end users, etc. It is unclear where and how the fundamental rights of the individual, namely the rights to privacy and data protection, fit into the DMA.

Reading the big five data-related acts, I get the feeling that, while all these acts regulate the data market, the lawmakers did not consult each other or existing legislation when creating them. Together, the acts remind me of blind men trying to describe an elephant by touching different parts of it: some think it is a horse, some are convinced it is a snake, and some think it is a hippopotamus.

Name: AI Act (AIA)

Status: adopted by the European Parliament in March 2024 (the trilogue agreement was reached in December 2023).

Applies to: providers and users of AI systems in the EU, and outside the EU if the “output of the AI system” is used in the EU.

Significance: creates new data governance compliance requirements for a large number of organisations.

Issues: AI-based analytical tools are absolutely great and, in some cases, profoundly game-changing, but I would still argue that the game is still the game. If you are planning to use or develop an AI system or tool but do not have decent data governance rules in place, including information security, data quality and data analytics rules, then just applying the AIA’s requirements doesn’t give you an AI tool/system that is useful. Whatever privacy issues you were facing before AI will be amplified by a magnitude when you start using AI.

The AIA creates a requirement for the users of AI tools to conduct a Fundamental Rights Impact Assessment (FRIA). As a privacy practitioner, I find it hard to see the difference between a Data Protection Impact Assessment (DPIA) under the GDPR and a FRIA, and to understand why there is a need to create a separate impact (risk) assessment for an AI system. In practice, as AI tools become more pervasive, we will probably see these two types of risk assessment blending and being used together.

The creators of AI systems must conduct an additional risk assessment covering the following obligatory areas, where companies have to demonstrate that they have established:

–          data governance (organisation has established data governance system),

–          technical documentation (of the system), record keeping (of the development),

–          transparency and provision of information (is the use of data clear to data subjects),

–          human oversight (the system is not released without human inspection),

–          accuracy, robustness, and cybersecurity (means data quality, analytics and cybersecurity rules are implemented).

This requires foundation models, and AI systems built on them, to draw up better documentation to increase transparency, such as what data the model was trained on and what security measures were used.

There are pre-defined prohibited AI practices that can’t be developed, such as social scoring for public and private purposes, the use of subliminal techniques to exploit people’s weaknesses, racial or religious profiling or individual predictive policing, emotion recognition in the workplace and education institutions, and untargeted scraping of the internet (or CCTV) for facial images.

A new EU AI Office will be set up to support implementation, compliance and enforcement. In the EU, we have data protection regulators and now we have an AI regulator; in my opinion, they are “tentacles” attached to “data governance” as the main body. Yet data governance is not regulated as a whole: the lawmakers talk about the single digital market, data protection, artificial intelligence and taming monopolies, but data governance, which overarches all these topics, is not directly addressed, nor does it have any compliance rules applying to it. Perhaps it makes sense that way, but essentially it seems we are trying to introduce good data governance rules by prescribing how a leg and an arm should behave in order to make the whole body move in the desired direction.

We also can’t forget that the AIA is complemented by the AI liability proposal and the Product Liability Directive, which is about to enter into force; the latter brings the product liability regime into line with technological developments, covering digital products like software, including AI, and thus also influencing the use and creation of AI systems.

As the AIA starts being implemented, we will see whether the new compliance rules create fairer and more transparent AI systems and reduce the risks around AI system development. But creating an effective data governance system is a must-have for anyone using or creating AI systems, as the AIA brings with it penalties for non-compliance: there are fines of up to 35 million EUR or 7% of an organisation’s global turnover, whichever is higher.
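As back-of-the-envelope arithmetic, the penalty ceiling for the gravest infringements works out as the higher of the two figures (the function name is my own, purely for illustration):

```python
# Sketch of the AIA penalty ceiling: EUR 35 million or 7% of global
# annual turnover, whichever is higher (for the most serious breaches).

def aia_max_fine(global_turnover_eur: float) -> float:
    """Maximum possible fine in EUR for the gravest AIA infringements."""
    return max(35_000_000, 0.07 * global_turnover_eur)

aia_max_fine(100_000_000)     # -> 35,000,000 (7% would only be 7 million)
aia_max_fine(1_000_000_000)   # -> 70,000,000 (the 7% branch dominates)
```

In other words, the fixed 35 million EUR floor only stops mattering once global turnover exceeds 500 million EUR.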

Other EU legislative acts relevant to data protection, privacy and data governance professionals, depending on the industry you work in, that I won’t cover in this article, are:

– Directive on representative actions for the protection of the collective interests of consumers (EU 2020/1828). The directive aims to ensure that consumers can protect their collective interests in the EU via representative actions, i.e. legal actions brought by representative entities (so-called qualified entities). It provides that all EU countries must have a mechanism for representative actions in place. The Directive improves consumers’ access to justice, while also foreseeing appropriate safeguards to avoid abusive litigation.

–          Directive on open data and the re-use of public sector information (EU 2019/1024), also known as the “Open Data Directive”. The goal of the Open Data Directive is to help the public create value from the data in the hands of the public sector. The public sector has the task of making its data available for value creation while at the same time ensuring that restricted data, such as personal data, is published in accordance with those restrictions. In the case of personal data, this means that publishing personal data in public databases has to be done in accordance with the GDPR and, as a minimum, apply encryption, pseudonymisation or anonymisation.

–          Directive on measures for a high common level of cybersecurity across the Union (EU 2022/2555), or NIS2 Directive, applies to entities with more than 50 employees and an annual turnover greater than 10 million euros, and to providers of essential services.

– Digital Operational Resilience Act also known as DORA, aims at strengthening the IT security of financial entities such as banks, insurance companies and investment firms and making sure that the financial sector in Europe is able to stay resilient in the event of a severe operational disruption. It applies to 20 different types of financial entities and ICT third-party service providers.

– Cyber Resilience Act is a legislative proposal to introduce security requirements for connected devices, from smart toys to industrial machinery. The CRA applies to manufacturers and importers of “products with digital elements” (“PDEs”), a category which is defined broadly to include both hardware and software products. There is a focus on certain “important” or “critical” PDEs. The final list of PDEs in these categories has not yet been published, but it is likely to include items covering both software (such as antivirus software and VPNs) and connected devices such as “smart home” devices, connected toys, and wearables.

–          The Regulation on a framework for Financial Data Access was proposed in June 2023; the first set of measures includes amendments to modernise the payment services regime (PSD2) and to establish a Payment Services Regulation (PSR). These amendments will ensure consumers can continue to make electronic payments and transactions in a safe and secure manner, both nationally and cross-border, in euro and non-euro currencies.

–          Proposal for a Regulation COM(2022) 197, the European Health Data Space (EHDS), is yet to progress, but aims to create a common European health data space enabling the secure and interoperable exchange of medical information between EU member states. The main objective is to facilitate research, innovation and the effectiveness of health policies, and to promote public health. The EHDS would be a legal framework for health data access and exchange, ensuring high data quality and interoperability. Individuals will have access to their own data, and healthcare professionals in member states will be able to access electronic health records across the EU. The EHDS will enable anonymised health data to be downloaded and used for a variety of purposes, including training AI applications in the healthcare sector, subject to specified conditions.
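The Open Data Directive item above mentions encryption, pseudonymisation or anonymisation as minimum safeguards before publication. As a concrete sketch (my own illustration, not a technique prescribed by the directive or the GDPR; the key, identifiers and field names are hypothetical), pseudonymisation can be as simple as replacing a direct identifier with a keyed hash, so the published dataset carries a stable pseudonym while only the key holder can re-link records:

```python
# Minimal pseudonymisation sketch: a keyed HMAC over a direct identifier.
# The key must be stored separately from the published dataset.

import hashlib
import hmac

SECRET_KEY = b"kept-outside-the-published-dataset"  # hypothetical key material

def pseudonymise(identifier: str) -> str:
    """Return a stable 64-hex-character pseudonym for a direct identifier."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"person_id": "38001010000", "municipality": "Tartu"}  # hypothetical row
published = {**record, "person_id": pseudonymise(record["person_id"])}
```

The same input always yields the same pseudonym, so records remain linkable for analysis, while anyone without the key cannot reverse the mapping. Note that under the GDPR, pseudonymised data is still personal data.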
