Goods and services (Directive 2004/113)

Author: Jenny Julen Votinius
9 Goods and services (Directive 2004/113) [49]
9.1 General (legal) context
9.1.1 Surveys and reports about the difficulties linked to equal access to and supply of
goods and services
No surveys have been conducted concerning the difficulties linked to equal access to and
supply of goods and services.
9.1.2 Specific problems of discrimination in the online environment/digital
market/collaborative economy
There is no national research on the frequency of gender-based discrimination resulting
from the use of algorithms or technologies using artificial intelligence. However, in 2018,
the Equality Ombudsman initiated a project to map and estimate the risks of discrimination
in algorithms, automated data processing and artificial intelligence. Based on existing
international scholarship, the project identified a number of typical risks of discrimination
in automated decision-making. These risks may relate to: the data (non-relevant, biased
or of low quality); the programmer (intentional discrimination, unconscious prejudices, or
a poor understanding of the effects of certain types of programming); or the organisational
context in which the algorithms appear and are used (one example is that programs from
different producers which are not discriminatory when used separately may have a
discriminatory effect when combined in a program structure; another is when the
context in which the programs are used does not provide for checks or follow-ups). The
risk may also relate to machine learning and black-box systems.
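The risk that facially neutral program logic nevertheless produces gender-skewed outcomes can be sketched in a few lines. The data, feature names and rule below are entirely hypothetical and serve only to illustrate the proxy mechanism described above, not any system mentioned in this report:

```python
# Illustrative sketch with invented data: a selection rule that never
# looks at gender can still disadvantage one group when it relies on a
# feature that happens to correlate with gender in the underlying data.

# Hypothetical records: (gender, works_part_time). In this made-up
# sample, part-time work correlates strongly with gender.
records = [
    ("F", True), ("F", True), ("F", True), ("F", False),
    ("M", False), ("M", False), ("M", False), ("M", True),
]

def flag_for_control(record):
    """A facially neutral rule: flag part-time workers for extra checks."""
    _gender, part_time = record
    return part_time

def flag_rate(gender):
    """Share of one gender group flagged by the neutral rule."""
    group = [r for r in records if r[0] == gender]
    return sum(flag_for_control(r) for r in group) / len(group)

print(f"flag rate, women: {flag_rate('F'):.2f}")
print(f"flag rate, men:   {flag_rate('M'):.2f}")
```

Even though gender is never consulted, the flag rate differs sharply between the groups, which is the data-related risk the Ombudsman's project points to.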
The aim of the project on algorithmic discrimination carried out by the Equality
Ombudsman was to increase the level of knowledge in this field, mainly at the authority
level. The project had the character of a mapping exercise and concluded that there is a
need to increase public awareness of artificial intelligence and the risk of algorithmic
discrimination, a need for competence in this field within authorities and organisations,
and a need for increased cooperation and exchange of knowledge between authorities.
The opinion of the national expert is that these observations are relevant; the need for
research and increased competence on algorithmic discrimination appears to be urgent.
In 2017, the Government tasked the Swedish Social Insurance Inspectorate (Inspektionen
för socialförsäkringen, ISF) to analyse the use of ‘selection profiles’ at the Swedish social
insurance agency. The ISF has delivered two reports on the matter. [50] An important
aspect of the analysis carried out by the ISF is the matter of equal treatment of men and
women. The ISF explains how results are generated by machine learning algorithms, and why such
results run the risk of being biased on the ground of, inter alia, gender. The conclusion is
that the use of ‘selection profiles’ displays deficits in the area of equal treatment, but the
ISF also underlines that this does not necessarily mean a violation of the Discrimination
Act (2008:567). The reports concern the statutory social security scheme, but the results
are equally relevant for similar products on the private market.
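The ISF's point about machine-learned selection profiles can likewise be illustrated with a deliberately simplified sketch. Nothing here reflects the ISF's actual model; the 'training' step, benefit types and figures are invented to show how a profile learned from skewed historical control data can reproduce that skew without ever receiving gender as an input:

```python
# Hypothetical sketch: a naive risk profile 'learned' from historical
# control decisions. If past controls were skewed towards benefit types
# mostly claimed by women, the learned profile inherits that skew.

# Invented history: (benefit_type, was_flagged). Benefit "A" (in this
# made-up example mostly claimed by women) was over-controlled.
history = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", False), ("B", False), ("B", False), ("B", True),
]

def learned_risk(benefit_type):
    """'Train' the profile: the past flag frequency for this benefit type."""
    flags = [flagged for btype, flagged in history if btype == benefit_type]
    return sum(flags) / len(flags)

# New claimants; gender is shown only to make the disparate effect
# visible -- the profile itself never uses it.
claimants = [("F", "A"), ("F", "A"), ("M", "B"), ("M", "B")]
selected = [(g, b) for g, b in claimants if learned_risk(b) > 0.5]
print(selected)
```

Here only the women's benefit type exceeds the risk threshold, so the historical skew is carried straight into the new selections, which is the kind of deficit in equal treatment the ISF reports describe.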
[49] See e.g. Caracciolo di Torella, E. and McLellan, B. (2018), Gender equality and the collaborative
economy, European network of legal experts in gender equality and non-discrimination, available at:
[50] Swedish Social Insurance Inspectorate, Profilering som urvalsmetod för riktade kontroller (Profiling
as a selection method for targeted controls), ISF Report 2018:5, and Riskbaserade urvalsprofiler och
likabehandling (Risk-based selection profiles and equal treatment), Working Report 2018:1AR, available
(in Swedish only) at: and
