Counterfactual Explanations for Data-Driven Decisions. Fortieth International Conference on Information Systems, Munich 2019.

Nonetheless, not all features shown in Figure 1 and Tables 1-2 are relevant for loan applicants seeking recommendations to get their credit approved, so SHAP may be adjusted to compute importance weights only for a subset of features. Since SHAP handles missing features by imputing default values, one can restrict the set of relevant features by setting the default values of the irrelevant features equal to the instance's current values. SHAP then computes nonzero importance weights only for the features whose values differ from their defaults. We do this for Loan 4, defining loan amount and annual income as the only relevant features. This is sensible in our context under the assumption that most customers can only request a smaller loan or document additional sources of income to get their credit approved.
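The mechanism can be sketched with exact (brute-force) Shapley values, where a "missing" feature is imputed with its default value. The scoring model, feature values, and the choice of age as the irrelevant feature below are illustrative assumptions, not the paper's actual model or data; they only demonstrate that a feature whose default equals its current value receives zero weight.

```python
from itertools import combinations
from math import factorial

def shapley_values(f, x, background):
    """Exact Shapley values where a 'missing' feature takes its background
    (default) value. Setting a feature's background equal to its current
    value in x forces its Shapley weight to zero, so the explanation is
    restricted to the remaining (relevant) features."""
    n = len(x)
    idx = list(range(n))
    phi = [0.0] * n
    for i in idx:
        others = [j for j in idx if j != i]
        for k in range(n):  # subset sizes 0 .. n-1
            for S in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                with_i = [x[j] if (j in S or j == i) else background[j] for j in idx]
                without_i = [x[j] if j in S else background[j] for j in idx]
                phi[i] += weight * (f(with_i) - f(without_i))
    return phi

# Hypothetical linear credit-scoring model (illustrative coefficients).
def credit_score(v):
    loan_amount, annual_income, age = v
    return -0.002 * loan_amount + 0.004 * annual_income + 0.01 * age

x = [10000.0, 40000.0, 35.0]           # current applicant (made-up values)
background = [5000.0, 50000.0, 35.0]   # age's default == its current value
phi = shapley_values(credit_score, x, background)
# phi[2] (age) is exactly 0: its value never differs from the default,
# so all importance is distributed over loan amount and annual income.
```

By the efficiency property, the weights still sum to the difference between the score of the instance and the score of the default point, so fixing irrelevant defaults simply redistributes the explanation over the chosen subset.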