The right to explanation

Dirutigliano, Jacopo (2023) The right to explanation, [Dissertation thesis], Alma Mater Studiorum Università di Bologna. PhD programme in Law, Science and Technology, 35th cycle.
Full-text documents available:
PDF document (English) - Restricted access until 12 May 2026
Available under licence: Unless broader permissions have been granted by the author, the thesis may be freely consulted, and a copy may be saved and printed strictly for personal study, research, and teaching purposes; any direct or indirect commercial use is expressly prohibited. All other rights to the material are reserved.

Abstract

This research investigates the use of Artificial Intelligence (AI) systems for profiling and decision-making, and the consequences this poses for the rights and freedoms of individuals. In particular, the research considers that automated decision-making systems (ADMs) are opaque, can be biased, and rely on correlation-based logic. For these reasons, ADMs do not take decisions as human beings do. Against this background, the risks to individuals' rights, combined with the demand for algorithmic transparency, have generated a debate on the need for a new 'right to explanation'. Assuming that, except in cases provided for by law, a decision made by a human does not give rise to a right to explanation, the question arises whether, when the decision is made by an algorithm, a right to explanation should be established for the decision subject. The research therefore addresses a right to explanation of automated decision-making, examining the relation between today's technology and the legal concepts of explanation, reasoning, and transparency. In particular, it focuses on the existence and scope of the right to explanation, considering the legal and technical issues surrounding the use of ADMs. The research analyses the use of AI and the problems arising from it from a legal perspective, studying the EU legal framework, especially in the field of data protection. In this context, part of the research focuses on the transparency requirements of the GDPR (namely Articles 13–15 and 22, as well as Recital 71). The research aims to outline an interpretative framework for such a right and to make recommendations about its development, providing guidelines for an adequate explanation of automated decisions. Hence, the thesis analyses what an explanation might consist of and the benefits of explainable AI, examined from both legal and technical perspectives.
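As a purely illustrative aside on what a technical "explanation" of an automated decision might consist of, the following minimal sketch (not drawn from the thesis; the model, features, data, and decision scenario are hypothetical) shows a simple feature-attribution explanation for an automated credit decision made by a linear classifier.

# Minimal, hypothetical sketch of a feature-attribution explanation for an
# automated decision (e.g. a credit application). The model, features, and
# data below are illustrative only and do not come from the thesis.
import numpy as np
from sklearn.linear_model import LogisticRegression

features = ["income_keur", "debt_ratio", "years_employed", "late_payments"]

# Tiny synthetic training set: past applications and their outcomes.
X_train = np.array([
    [55, 0.20, 8, 0],
    [22, 0.65, 1, 3],
    [40, 0.35, 5, 1],
    [18, 0.80, 0, 4],
    [60, 0.25, 10, 0],
    [30, 0.50, 2, 2],
])
y_train = np.array([1, 0, 1, 0, 1, 0])  # 1 = approved, 0 = rejected

model = LogisticRegression().fit(X_train, y_train)

# The decision subject's application.
applicant = np.array([28, 0.55, 3, 2])
decision = model.predict(applicant.reshape(1, -1))[0]

# For a linear model, coefficient * feature value gives a per-feature
# contribution to the decision score: one candidate form of "explanation".
contributions = model.coef_[0] * applicant

print("Decision:", "approved" if decision == 1 else "rejected")
for name, contrib in sorted(zip(features, contributions),
                            key=lambda t: abs(t[1]), reverse=True):
    print(f"  {name}: {contrib:+.2f}")

Per-feature contributions of this kind are among the simplest forms of explanation discussed in the XAI literature; the legal question examined in the thesis is whether, and in what form, such information can satisfy the transparency requirements of the GDPR.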

Document type: Doctoral thesis
Author: Dirutigliano, Jacopo
Supervisor:
Co-supervisor:
PhD programme: Law, Science and Technology
Cycle: 35
Coordinator:
Disciplinary sector:
Competition sector:
Keywords: right to explanation, explanation, personal data, data protection, GDPR, automated decision-making, profiling, artificial intelligence, machine learning, explainable AI, XAI, interpretable machine learning
URN:NBN:
Defence date: 4 July 2023
URI:
