Summary: Differential Privacy (DP) is a technology that allows one to gather aggregate information without compromising individual privacy. Over the last few years, it has become the state-of-the-art privacy-enhancing technology. DP has been implemented by several Big Tech companies as well as governmental bodies, but research in applied contexts is still at a very early stage. Because differentially private algorithms have an inherent accuracy-privacy trade-off, with no guarantee of an equal accuracy loss across different dataset subgroups, the accuracy drop in practical settings could have significant impacts on consumers.
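To make that trade-off concrete (this sketch is illustrative and not taken from the thesis), consider the classic Laplace mechanism applied to a counting query: satisfying ε-differential privacy requires noise with scale 1/ε, so a smaller ε (stronger privacy) yields a noisier, less accurate answer. The data, query, and ε values below are hypothetical; only NumPy is assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_count(data, predicate, epsilon):
    """Release an epsilon-DP count via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon suffices for epsilon-DP.
    """
    true_count = sum(predicate(x) for x in data)
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Illustrative data: 1 = applicant defaulted, 0 = repaid.
data = rng.integers(0, 2, size=1000)

# Smaller epsilon (stronger privacy) -> larger average error.
for epsilon in (0.01, 0.1, 1.0, 10.0):
    estimates = [laplace_count(data, lambda x: x == 1, epsilon)
                 for _ in range(100)]
    err = np.mean([abs(e - data.sum()) for e in estimates])
    print(f"epsilon={epsilon:5.2f}  mean absolute error = {err:.1f}")
```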
This thesis aims to understand the social and technical repercussions of implementing DP in credit risk assessment models in the UK consumer credit industry from a consumer-centred perspective. To achieve this, a sociotechnical approach was employed, combining qualitative and technical work. The qualitative studies comprised exploratory user interviews about the loan application process and an interview-based industry stakeholder consultation. The technical element was the implementation and comparison of several differentially private decision tree-based algorithms. The thesis culminates in an interactive game study gathering consumers' attitudes towards the implementation.
The technical study found that the DP algorithms exhibited a negligible accuracy drop at certain privacy levels when compared with a non-private algorithm, and only rare instances of disparate accuracy loss across subgroups. Triangulating these findings with the knowledge of the workings of the consumer credit industry gained from the industry consultation, we can deduce that if DP were implemented, the majority of consumers would not be significantly affected, with the exception of consumers close to the threshold of being denied credit.
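As a hedged sketch of how such a subgroup-level comparison can be set up (the dataset, subgroup indicator, ε value, and model choices below are all placeholders; IBM's open-source diffprivlib is assumed, with its differentially private random forest standing in for the DP decision tree-based algorithms the thesis compared):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from diffprivlib.models import RandomForestClassifier as DPRandomForest

# Synthetic stand-in for a credit dataset; `group` is a placeholder
# subgroup indicator (e.g. a demographic attribute).
X, y = make_classification(n_samples=4000, n_features=10, random_state=0)
group = (X[:, 0] > 0).astype(int)

X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(
    X, y, group, test_size=0.25, random_state=0)

# Non-private baseline vs. a DP tree ensemble at an illustrative epsilon.
# With no explicit data bounds, diffprivlib infers them from the data and
# emits a privacy warning; real deployments should supply domain bounds.
baseline = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)
dp_model = DPRandomForest(n_estimators=10, epsilon=1.0).fit(X_tr, y_tr)

# Disparate accuracy loss check: compare the accuracy drop per subgroup.
for g in (0, 1):
    mask = g_te == g
    acc_base = (baseline.predict(X_te[mask]) == y_te[mask]).mean()
    acc_dp = (dp_model.predict(X_te[mask]) == y_te[mask]).mean()
    print(f"subgroup {g}: baseline {acc_base:.3f}, "
          f"DP {acc_dp:.3f}, drop {acc_base - acc_dp:.3f}")
```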
According to the industry consultation findings, any implementation of DP would depend on the amount of accuracy loss and on regulatory encouragement. To compensate for the implementation, lenders could change their credit policy to account for the small increase in uncertainty in the risk scores. This could make credit less accessible, which goes against regulatory aims and is therefore unlikely to gain regulatory support.
Consumers also had very mixed views on the implementation of DP, with many preferring better financial options over the protection of their personal data. These findings are based on the interactive game study, which communicated potential scenarios of implementing DP in the risk assessment model used in the loan application process, in order to gather consumers' attitudes towards the technology.
Based on these findings, DP is unlikely to be implemented: lenders would require some regulatory encouragement, which seems unlikely unless there is a shift in public opinion. This work contributes to the underrepresented areas of usable DP and of consumers' requirements and attitudes towards the loan application process in the UK consumer credit industry.