European Law Journal

Publisher:
Wiley
Publication date:
2021-02-01
ISSN:
1468-0386

Latest documents

  • The public interest dimension of the single market for data: Public undertakings as a model for regulating private data sharing

    Data plays a crucial role for society. Accordingly, building a ‘single market for data’ by increasing the availability of public and private data ranks high on the EU policy agenda. But when advancing legal data sharing regimes, there is an inevitable need to balance public and private interests. While the European Commission continues to push for more binding rules on data sharing between private businesses, public undertakings are already covered by mandatory rules. Exploring how the law addresses their data offers valuable lessons on the reconciliation of market reasoning with the public interest. In particular, this article inquires into the recast Open Data and Public Sector Information Directive, the Data Governance Act, and different national rules which regulate access to and re‐use of public undertakings' data. It identifies five striking characteristics and discusses their potential and limitations for regulating data sharing by private undertakings. The implications serve as a guidepost for advancing the wider debate on building a single market for data in the EU. Some of them are already reflected in the upcoming EU Data Act.

  • The quadrangular shape of the geometry of digital power(s) and the move towards a procedural digital constitutionalism

    The paper explores the evolution of private powers in the digital landscape, developing a quadrangular systematisation of the phenomenon based on four main aspects: space, values, (private) actors, and (digital) constitutional remedies. Taking a trans‐Atlantic approach, the paper shows how these categories, typical of constitutionalism, apply to the context of the Internet and of new digital technologies in both the United States and Europe. On the one hand, the United States has so far maintained the supremacy of the notorious Section 230 of the Communications Decency Act. On the other hand, European legislation has undergone a significant change, moving from a phase of digital liberalism, of which the 2000 E‐Commerce Directive is the emblem, towards a new era of digital constitutionalism, passing through the age of judicial activism of the European courts. In this sense, Europe has increasingly attempted to set limits to private (digital) powers, with a view to better protecting and enforcing (also horizontally) users' fundamental rights. The paper also underscores the evolution of digital constitutionalism from a vertical, sectoral approach to a horizontal, procedure‐based one, showcased most clearly by the recent Digital Services Package and signalling the EU's move into a second phase of digital constitutionalism. In this respect, the paper argues that the great benefit of stressing the procedural dimension, which may be described as a European application of “due (data) process” to the relationship between individuals and private powers, is that it can help consolidate a (necessary) trans‐Atlantic bridge.

  • Manipulation by algorithms. Exploring the triangle of unfair commercial practice, data protection, and privacy law

    The optimisation of sales practices in consumer markets through machine learning not only harbours the potential to better match consumer preferences with products but also risks facilitating the exploitation of consumer weaknesses discovered via data analysis. More specifically, recent technological advances have brought us to the edge of mind‐reading technologies, which automatically analyse mental states and adapt offers accordingly, in potentially manipulative ways. This article shows that, in market contexts, the challenge of manipulation by algorithm necessitates an integrated understanding of unfair commercial practice, data protection, and privacy law. It maps the interactions between these contiguous yet distinct fields of law and draws on economics and computer science to develop a novel framework for dealing with algorithmic influence. Furthermore, it critically discusses the Commission proposals for the Digital Services Act and the Artificial Intelligence Act and suggests complementing them with more broadly applicable measures to mitigate algorithmic manipulation.

  • The collective welfare dimension of dark patterns regulation

    Dark Patterns are interface design elements that can influence users' behaviour in digital environments. They can cause harm not only on an individual but also on a collective level, by creating behavioural market failures, reducing trust in markets and promoting unfair competition and data dominance. We contend that these collective effects of Dark Patterns cannot be tackled by existing laws, and thus call for policy intervention. This article reviews how existing and proposed laws in Europe and the US, namely the EU Digital Services Act and Digital Markets Act as well as the U.S. DETOUR and AICO Acts, address these collective dimensions of welfare and add to existing protection. We find that the novel legislative measures attain that goal to varying degrees. However, the collective welfare perspective may prove useful both to support a risk‐based approach to enforcement and to provide guidance as to which practices should be addressed as a priority.

  • Data retention and the future of large‐scale surveillance: The evolution and contestation of judicial benchmarks

    Recent and upcoming judgments of the Court of Justice of the European Union (CJEU) have revived a much‐debated topic: the limits that EU law places on law enforcement authorities and intelligence services when implementing surveillance operations. In its decisions, the CJEU has reaffirmed and at times remoulded its case‐law on data retention, unearthing a variety of legal issues. This article aims to critically analyse the legal limitations of (indiscriminate) surveillance measures, the role of the private sector in the scheme, and the line between the competence of the Member States and that of the EU on national security matters. It also comments on the latest developments in the reception of the decisions by the Member States and the EU legislator, as well as on the ongoing dialogue between the CJEU and the European Court of Human Rights (ECtHR).

  • Taking fundamental rights seriously in the Digital Services Act's platform liability regime

    This article highlights how the EU fundamental rights framework should inform the liability regime for platforms foreseen in secondary EU law, in particular with regard to the reform of the E‐Commerce Directive by the Digital Services Act. In order to identify the possible tensions between the liability regime for platforms, on the one hand, and fundamental rights, on the other, and to contribute to a well‐balanced and proportionate European legal instrument, this article addresses these potential conflicts from the standpoint of users (those who share content and those who access it), platforms, regulators and other stakeholders involved. Section 2 delves into the intricate landscape of online intermediary liability, interrogating how the E‐Commerce Directive and the emerging Digital Services Act grapple with the delicate equilibrium between shielding intermediaries and upholding the competing rights of other stakeholders. The article then navigates in Section 3 the fraught terrain of fundamental rights as articulated by the European Court of Human Rights (ECtHR) and the Court of Justice of the European Union (CJEU) under the aegis of the European Convention on Human Rights and the EU Charter. This section poses an urgent inquiry: can the DSA's foundational principles reconcile these legal frameworks in a manner that fuels democracy rather than stifles it through inadvertent censorship? Section 4 then delves into the intricate relationship between fundamental rights and the DSA reform. This section conducts a comprehensive analysis of the key provisions of the DSA, emphasising how they underscore the importance of fundamental rights. In addition to mapping out the strengths of the framework, the section also identifies existing limitations within the DSA and suggests potential pathways for further refinement and improvement. The article concludes by outlining key avenues for achieving a balanced and fundamental rights‐compliant regulatory framework for platform liability within the EU.

  • Did the PNR judgment address the core issues raised by mass surveillance?

    This article looks at three main issues raised by the PNR scheme: (i) the base‐rate fallacy and its effect on false positives; (ii) built‐in biases; and (iii) the opacity and unchallengeability of the decisions generated, and asks whether the Court has properly addressed them. It concludes that the AG and the Court failed to address the evidentiary issues, including the base‐rate fallacy, a lethal defect. It also finds that neither the Member States nor the Commission have even tried to assess whether the operation of the PNR Directive has resulted in discriminatory outputs or outcomes, and that the Court should have demanded that they produce serious, verifiable data on this, including on whether the PNR system has led in practice to discrimination. But it also finds that the AG and the Court provided important guidance on the third issue, in that they made clear that the use of unexplainable and hence unreviewable and unchallengeable “black box” machine‐learning artificial intelligence (ML/AI) systems violates the very essence of the right to an effective remedy. This means that any EU Member State that still uses such opaque ML/AI systems in its PNR screening will be in violation of the law.

  • Issue Information

    No abstract is available for this article.

  • Democracy through law: The Transatlantic Reflection Group and its manifesto in defence of democracy and the rule of law in the age of “artificial intelligence”

  • Passenger name record (PNR) data: How the EU is promoting (virtual) security by actually limiting passengers' fundamental rights

    The use of airline passenger name record (PNR) data for security purposes has gradually come to the fore, especially in EU‐US relations, because of the tension between those who consider the use of PNR data an effective tool in the fight against terrorism and those who consider the interference with citizens' privacy disproportionate. The Court of Justice intervened decisively on the issue in June 2022 with its “Ligue des Droits Humains” judgment in Case C‐817/19. This ruling should have been followed by a review of the national legislation transposing the Directive; instead, the Member States are still moving in the opposite direction to that indicated by the Court.
