Label Propagation in Machine Learning Systems: Providing End-to-End Traceability with Explainable Artificial Intelligence
Permanent link
https://hdl.handle.net/10037/33828
Date
2024-05-15
Type
Master thesis
Author
Ingebrigtsen, Marius Johan
Abstract
Artificial Intelligence (AI) and the underlying Machine Learning (ML) technology are seeing increasing application across many areas. Training ML models requires significant amounts of data, and data may carry restrictions on their permitted usage. High-performing models are often called black boxes because of their complex decision-making processes. ML applications therefore threaten compliance with data restrictions, since this technology lacks explainability.
Data labels can enforce data restrictions in a system’s computational pipeline by being propagated from inputs to procedure outputs. A Label Propagation Mechanism (LPM) can employ an influence-based policy, propagating the labels of input data that contribute to the computation of the output. However, applying influence-based label propagation to ML is challenging because information is completely cross-tainted inside these models. This thesis proposes an influence-based LPM that employs explanations from Explainable Artificial Intelligence (XAI) to propagate input labels to ML outputs.
This thesis presents a proof of concept for using XAI to propagate to the output of a black-box ML model only the labels of inputs that have a high influence on that output. We first bridge the gap between conventional label propagation and its problematic application in ML. We then detail how LPMs can use XAI explanations to inform their label propagation. Next, we design and execute experiments with different XAI methods, models, and data. We evaluate the results based on the propagated labels and the faithfulness of the explanations for the model output.
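The core idea of the abstract, selecting which input labels reach the output based on explanation scores, can be sketched as follows. This is a minimal illustration, not the thesis implementation: the function name `propagate_labels`, the normalization, and the fixed threshold are all assumptions; in practice the attribution scores would come from an XAI method such as SHAP or LIME.

```python
# Hedged sketch of influence-based label propagation driven by XAI attributions.
# All identifiers and the thresholding policy are illustrative assumptions.

def propagate_labels(input_labels, attributions, threshold=0.1):
    """Propagate only the labels of inputs whose normalized attribution
    magnitude meets `threshold`.

    input_labels: dict mapping input name -> set of restriction labels
    attributions: dict mapping input name -> attribution score for one output
    """
    # Normalize by total absolute attribution so the threshold is relative.
    total = sum(abs(a) for a in attributions.values()) or 1.0
    output_labels = set()
    for name, score in attributions.items():
        if abs(score) / total >= threshold:
            # High-influence input: its restriction labels taint the output.
            output_labels |= input_labels.get(name, set())
    return output_labels

# Hypothetical example: two influential features, one negligible one.
labels = {"age": {"GDPR"}, "zip": {"internal"}, "noise": set()}
attrs = {"age": 0.8, "zip": 0.15, "noise": 0.01}
print(sorted(propagate_labels(labels, attrs)))  # ['GDPR', 'internal']
```

A fixed relative threshold is only one possible policy; the thesis evaluates propagation against the faithfulness of the underlying explanations, which this sketch does not model.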
Publisher
UiT The Arctic University of Norway
Copyright 2024 The Author(s)