Runtime Verification for Algorithmic Fairness

Tobias Wagenpfeil

Throughout our lives, we are increasingly subjected to decisions made by algorithms. Insurance eligibility, employment screening, and criminal justice sentencing are just some of the areas in which they operate. Since they are deployed in such important fields of society, these algorithms have to make their decisions, or suggest decisions, in a way that guarantees fair and balanced outcomes. However, subsequent analyses of deployed algorithms have uncovered cases of significant bias with respect to certain demographic subgroups. This thesis presents an approach for monitoring the fairness of an algorithm during its execution. Translating fairness conditions into an RTLola specification makes it possible to generate such a monitor. The monitor then allows us to check in real time whether the fairness criteria are fulfilled during an execution of a given algorithm. Furthermore, we investigate the practical feasibility of the approach by comparing several algorithms on the example of assigning seminars to students.
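
The abstract does not reproduce the specifications used in the thesis; purely as an illustration of the general idea, a demographic-parity-style condition could be phrased in RTLola roughly as sketched below. The stream names, the Boolean group encoding, the one-hour window, and the 0.2 tolerance are assumptions made for this sketch, not taken from the thesis.

    // One input event per decision made by the monitored algorithm.
    // The Boolean group encoding is an assumption of this sketch.
    input group    : Bool   // true iff the applicant belongs to the protected subgroup
    input accepted : Bool   // decision produced by the algorithm

    // Per-event indicator streams (1.0 if the event matches, 0.0 otherwise).
    output prot_total : Float64 := if group then 1.0 else 0.0
    output prot_acc   : Float64 := if group && accepted then 1.0 else 0.0
    output base_total : Float64 := if !group then 1.0 else 0.0
    output base_acc   : Float64 := if !group && accepted then 1.0 else 0.0

    // Acceptance rates over a sliding one-hour window, evaluated once per second.
    output prot_rate @1Hz := if prot_total.aggregate(over: 1h, using: sum) > 0.0
      then prot_acc.aggregate(over: 1h, using: sum) / prot_total.aggregate(over: 1h, using: sum)
      else 0.0
    output base_rate @1Hz := if base_total.aggregate(over: 1h, using: sum) > 0.0
      then base_acc.aggregate(over: 1h, using: sum) / base_total.aggregate(over: 1h, using: sum)
      else 0.0

    // Report a violation when the acceptance rates diverge by more than the
    // (arbitrarily chosen) tolerance of 0.2 in either direction.
    trigger prot_rate - base_rate > 0.2 "protected group favoured beyond tolerance"
    trigger base_rate - prot_rate > 0.2 "protected group disadvantaged beyond tolerance"

A monitor generated from such a specification consumes the algorithm's decisions as a stream of events and raises the triggers as soon as the windowed acceptance rates drift too far apart, which is the real-time check described above.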

Bachelor Thesis.

(pdf)