
FUNDAMENTAL RIGHTS IMPACT ASSESSMENT - A PRACTICAL TOOL FOR PROTECTING FUNDAMENTAL RIGHTS
Chapter 4 illustrated the extent to which using AI affects different fundamental
rights. This chapter analyses how fundamental rights impact assessments
(FRIA) could reduce the negative impacts that using AI can have on
fundamental rights.
Section 5.1 provides a brief overview of the current discussion on the need
for fundamental rights impact assessments in this field. Section 5.2 analyses
current practices in addressing fundamental rights implications, based on the
interviews conducted for this report. Interviewees were asked what sort of
testing was done before the system was used, and who controls the tasks
affected by the use of the technology.
The chapter ends with suggestions on how to assess the fundamental rights
impact when using AI and related technologies.
5.1. CALLING FOR A FUNDAMENTAL RIGHTS IMPACT ASSESSMENT - AVAILABLE GUIDANCE AND TOOLS
International organisations,1 academics2 and civil society3 have called for
fundamental rights impact assessments to be conducted when using AI or
related technologies.
For example, the Committee of Ministers of the Council of Europe’s guidelines
on addressing the human rights impacts of algorithmic systems recommend
that states should conduct “impact assessments prior to public procurement,
during development, at regular milestones, and throughout their context-
specific deployment in order to identify the risks of rights-adverse outcomes”.4
There is a need for flexible impact assessments that can adapt to different
situations, given that fundamental rights violations are always contextual.
Scholars illustrate this with reference to EU anti-discrimination law, where
equality is always assessed in context and depends on the case at hand.5
Fundamental rights compliance cannot be automated and hard-coded into
computer software. Rather, each use case needs separate examination
to determine whether any fundamental rights issue arises. Nevertheless,
assessments can follow a systematic approach and provide similar information.
Existing standards provide guidance on how to do a fundamental rights impact
assessment of AI and related technology. These include hard law, soft law
