Please use this identifier to cite or link to this item: http://gnanaganga.inflibnet.ac.in:8080/jspui/handle/123456789/632
Title: Responsible AI in automated credit scoring systems
Authors: Itapu, Srikanth
Keywords: Credit scoring
Disparate Impact Analysis
Explainable AI
Responsible AI
Issue Date: 7-Jun-2022
Publisher: Springer Link
Abstract: In recent years, Artificial Intelligence (AI) has gained notable momentum and may deliver on expectations across many application sectors. For this to occur, expert systems and rule-based models need to overcome the limitations of fairness and interpretability; the paradigms addressing this problem fall within the so-called explainable AI (XAI) field. This report presents work on the German credit dataset to address the challenges of fairness and bias and, in turn, hold machine learning models to a responsible standard. This is defined as responsible AI in practice. Since the task is to classify a user's credit score as good or bad using a fair ML modelling approach, the key metric of interest is the F1-score, which reduces the share of misclassifications. It is observed that the hyperparameter-tuned XGBoost model (GC2) gives optimal performance in terms of F1-score, accuracy, and fairness for both gender and age as the protected variable, using Disparate Impact Remover, a pre-processing bias mitigation technique. The model is deployed on Heroku through a Flask API (for age). Disparate Impact Analysis (DIA) using H2O.AI helped identify the optimum threshold levels at which the fairness metrics reach legally acceptable/permissible levels for both age and gender. Overall, fairness, bias mitigation, responsibility, and explainability have been established for the dataset considered.
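The fairness notion underpinning the abstract's Disparate Impact Analysis is the disparate impact ratio. As a minimal illustrative sketch (not the authors' code; group labels and predictions are hypothetical), it can be computed as the favorable-outcome rate of the unprivileged group divided by that of the privileged group:

```python
import numpy as np

def disparate_impact(y_pred, protected):
    """Disparate impact ratio: P(favorable | unprivileged) / P(favorable | privileged).

    y_pred: binary predictions, 1 = favorable outcome (e.g. 'good credit').
    protected: group membership, 1 = privileged group, 0 = unprivileged group.
    A ratio below 0.8 is commonly treated as evidence of adverse impact
    (the 'four-fifths rule' often used as the legal threshold).
    """
    y_pred = np.asarray(y_pred)
    protected = np.asarray(protected)
    rate_unpriv = y_pred[protected == 0].mean()  # favorable rate, unprivileged group
    rate_priv = y_pred[protected == 1].mean()    # favorable rate, privileged group
    return rate_unpriv / rate_priv

# Toy example with equal favorable rates in both groups:
y_pred = [1, 0, 1, 1, 0, 1, 1, 1]
protected = [0, 0, 0, 0, 1, 1, 1, 1]
print(disparate_impact(y_pred, protected))  # 0.75 / 0.75 = 1.0
```

A ratio of 1.0 indicates parity between groups; pre-processing techniques such as Disparate Impact Remover transform the features so that downstream classifiers move this ratio toward the acceptable range.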
URI: https://doi.org/10.1007/s43681-022-00175-3
http://gnanaganga.inflibnet.ac.in:8080/jspui/handle/123456789/632
Appears in Collections:Journal Articles
