Published online by Cambridge University Press: 30 March 2021
This article examines the growth of algorithmic credit scoring and its implications for the regulation of consumer credit markets in the UK. It constructs a frame of analysis for the regulation of algorithmic credit scoring, bound by the core norms underpinning UK consumer credit and data protection regulation: allocative efficiency, distributional fairness and consumer privacy (as autonomy). Examining the normative trade-offs that arise within this frame, the article argues that existing data protection and consumer credit frameworks do not achieve an appropriate normative balance in the regulation of algorithmic credit scoring. In particular, lenders’ growing reliance on consumers’ personal data due to algorithmic credit scoring, coupled with the ineffectiveness of existing data protection remedies, has created a data protection gap in consumer credit markets that presents a significant threat to consumer privacy and autonomy. The article makes recommendations for filling this gap through institutional and substantive regulatory reforms.
DPhil candidate, University of Oxford.
I would like to thank the following for their very helpful comments on earlier versions of this article: John Armour, Dan Awrey, Ryan Calo, Ignacio Cofone, Hugh Collins, Horst Eidenmüller, Mitu Gulati, Geneviève Helleringer, Ian Kerr, Bettina Lange, Tom Melham, Jeremias Prassl, Srini Sundaram, David Watson, participants in the Virtual Workshop on ML and Consumer Credit and workshops at the University of Montreal, McGill University, Singapore Management University, University of Luxembourg, Sciences Po, European University Institute, and University of Oxford (Faculty of Law and Oxford Internet Institute).
1 I. Goodfellow, Y. Bengio and A. Courville, Deep Learning (Cambridge, MA 2016), 2–8; Y. LeCun, Y. Bengio and G. Hinton, “Deep Learning” (2015) 521 Nature 436.
2 V. Mayer-Schönberger and K. Cukier, Big Data: A Revolution that Will Transform How We Live, Work and Think (London 2013), 73–97.
3 The phenomenon of analysing large and complex data sets is also commonly called “Big Data” (ibid.).
4 R. O'Dwyer, “Are You Creditworthy? The Algorithm Will Decide”, available at https://bit.ly/34WoTEs (last accessed 5 November 2020). Alternatively referred to as “big data scoring” or “alternative data scoring”: M. Hurley and J. Adebayo, “Credit Scoring in the Era of Big Data” (2016) 18 Yale Journal of Law and Technology 148.
5 E.g. A.E. Khandani, A.J. Kim and A.W. Lo, “Consumer Credit Risk Models via Machine Learning Algorithms” (2010) 34 Journal of Banking and Finance 2767.
6 E.g. D. Citron and F.A. Pasquale, “The Scored Society: Due Process for Automated Predictions” (2014) 89 Washington Law Review 1; C. O'Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (London 2016), 179–202.
7 N. Capon, “Credit Scoring Systems: A Critical Analysis” (1982) 46(2) Journal of Marketing 82; J. Lauer, Creditworthy: A History of Consumer Surveillance and Financial Identity in America (New York 2017).
8 E. Goffman, Frame Analysis: An Essay on the Organization of Experience (New York 1974).
9 Financial Services and Markets Act 2000, s. 1C.
10 Directive (EC) No 2008/48 (OJ 2008 L 133 p.66) (“Consumer Credit Directive”) and Regulation (EU) No 2016/679 (OJ 2016 L 119 p.1), respectively.
11 European Union (Withdrawal) Act 2018, s. 3; The Consumer Credit (Amendment) (EU Exit) Regulations 2018.
12 FCA Handbook, Consumer Credit Sourcebook (CONC) sections 5.2A (Creditworthiness assessment) and 5.5A (Creditworthiness assessment: P2P agreements) and Consumer Credit Directive, art. 8. See also FCA, “Assessing Creditworthiness in Consumer Credit – Feedback on CP 17/27 and Final Rules and Guidance”, available at https://bit.ly/2SS9ijA (last accessed 5 November 2020).
13 Capon, “Credit Scoring Systems”, 82.
14 L.C. Thomas, Consumer Credit Models: Pricing, Profit and Portfolios (Oxford 2009), 5–9.
15 Pursuant to the “Principles of Reciprocity”, available at https://scoronline.co.uk/key-documents/ (last accessed 5 November 2020).
16 Thomas, Consumer Credit Models, 63ff.
17 Equifax, “How are Credit Scores Calculated?”, available at https://bit.ly/2I7ppm6 (last accessed 5 November 2020).
18 W. Dobbie and P.M. Skiba, “Information Asymmetries in Consumer Credit Markets: Evidence from Payday Lending” (2013) 5(4) American Economic Journal: Applied Economics 256.
19 See BoE-FCA, “Machine Learning in UK Financial Services”, available at https://bit.ly/3l3YITa (last accessed 5 November 2020). However, note the methodological constraints (at 7).
20 T. O'Neill, “The Birth of Predictor – Machine Learning at Zopa”, available at https://perma.cc/8EXJ-JETA; J. Deville, “Leaky Data: How Wonga Makes Lending Decisions”, available at https://perma.cc/D9SB-TXDX; CGFS and FSB, “FinTech Credit: Market Structure, Business Models and Financial Stability Implications”, available at https://bit.ly/2W6yGF5, 3–6 (all last accessed 5 November 2020).
21 M. Bazarbash, “FinTech in Financial Inclusion: Machine Learning Applications in Assessing Credit Risk”, available at https://bit.ly/2U2zckG, 13–23 (last accessed 5 November 2020).
22 M.A. Bruckner, “The Promise and Perils of Algorithmic Lenders’ Use of Big Data” (2018) 93 Chicago-Kent Law Review 3, 11–17 (distinguishing between two phases of “algorithmic lending”).
23 S. Rahman, “Combining Machine Learning with Credit Risk Scorecards”, available at https://bit.ly/2JDKObw (last accessed 5 November 2020).
24 T. Berg et al., “On the Rise of FinTechs – Credit Scoring Using Digital Footprints”, available at https://ssrn.com/abstract=3163781 (last accessed 5 November 2020); D. Björkegren and D. Grissen, “Behaviour Revealed in Mobile Phone Usage Predicts Credit Repayment” (2020) 34(3) The World Bank Economic Review 618.
25 Hurley and Adebayo, “Credit Scoring”, 168–83.
26 J.A. Sirignano, A. Sadwhani and K. Giesecke, “Deep Learning for Mortgage Risk”, available at https://arxiv.org/abs/1607.02470 (last accessed 5 November 2020).
27 G. Morgenson, “Held Captive by Flawed Credit Reports”, New York Times, available at https://nyti.ms/2QaYxrG (last accessed 5 November 2020).
28 I. Berlin, “Two Concepts of Liberty” in Four Essays on Liberty (Oxford 1969).
29 CONC 5.2A.10ff. and 5.5A.11ff. (for p2p agreements); FCA, “Preventing Financial Distress by Predicting Unaffordable Consumer Credit Agreements: An Applied Framework”, available at https://bit.ly/33eVrs3 (last accessed 5 November 2020).
30 J. Armour et al., Principles of Financial Regulation (Oxford 2016), 53–54.
31 FCA, “Guidance for Firms on the Fair Treatment of Vulnerable Consumers”, available at https://bit.ly/351bcEc (last accessed 5 November 2020).
32 FCA, “Preventing Financial Distress”, 13–14.
33 O. Bar-Gill, Seduction by Contract (Oxford 2012), 51ff.
34 Armour et al., Principles, 222–23.
35 J. Rawls, “Justice as Fairness: Political not Metaphysical” (1985) 14 Philosophy and Public Affairs 223.
36 Armour et al., Principles, 51–80; J. Stiglitz, “Regulation and Failure” in D. Moss and J. Cisternino (eds.), New Perspectives on Regulation (Cambridge, MA 2009).
37 J.Y. Campbell et al., “Consumer Financial Protection” (2011) 25(1) Journal of Economic Perspectives 91.
38 Charter of Fundamental Rights of the European Union (OJ 2012 C 326 p.391) (EU Charter), arts. 7, 8; Treaty on the Functioning of the European Union (OJ 2016 C 202 p. 1) (Consolidated), art. 16.
39 Regulation (EU) 2016/679 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data (OJ 2016 L 119 p.1) (GDPR), art. 5(1).
40 Ibid., art. 6.
41 Ibid., art. 25.
42 Ibid., art. 35.
43 Consumer Credit Act 1974 (CCA), ss. 157–159; GDPR, arts. 14–16.
44 GDPR, recital 71, arts. 21, 22.
45 Ibid., arts. 13(2)(f), 14(2)(g), 15(1)(h).
46 L. Floridi, “The Informational Nature of Personal Identity” (2011) 21 Minds and Machines 549; J. Cheney-Lippold, We Are Data: Algorithms and the Making of Our Digital Selves (New York 2017).
47 A. Rouvroy and Y. Poullet, “The Right to Informational Self-determination and the Value of Self-development: Reassessing the Importance of Privacy for Democracy” in S. Gutwirth et al. (eds.), Reinventing Data Protection? (Dordrecht and London 2009).
48 GDPR, art. 9.
49 O. Lynskey, The Foundations of EU Data Protection Law (Oxford 2015), 89–130.
50 S. Warren and L. Brandeis, “The Right to Privacy” (1890) 4 Harv.L.Rev. 193.
51 As conceived in much of the (law and) economics literature on privacy: e.g. R. Posner, “The Economics of Privacy” (1981) 71(2) American Economic Review 405.
52 Judgment of 15 December 1983, 1 BvR 209/83, BVerfGE 65, 1.
53 Berlin, “Two Concepts of Liberty”; J.E. Cohen, “What Privacy Is For” (2013) 126 Harv.L.Rev. 1904.
54 EU Charter, art. 8(2) and GDPR art. 5(1)(a) (the fairness principle); D. Clifford and J. Ausloos, “Data Protection and the Role of Fairness”, available at https://ssrn.com/abstract=3013139 (last accessed 5 November 2020).
55 GDPR, art. 1(3) and recitals 2–6, 13; Lynskey, Foundations, 46–88.
56 E. Posner and R.M. Hynes, “The Law and Economics of Consumer Finance”, available at https://ssrn.com/abstract=261109 (last accessed 5 November 2020).
57 J. Stiglitz and A. Weiss, “Credit Rationing in Markets with Imperfect Information” (1981) 71(3) American Economic Review 393; L. Einav, M. Jenkins and J. Levin, “The Impact of Credit Scoring on Consumer Lending” (2013) 44 RAND Journal of Economics 249.
58 J. Stiglitz and A. Weiss, “Asymmetric Information in Credit Markets and Its Implications for Macro-economics” (1992) 44 Oxford Economic Papers 694.
59 W. Adams, L. Einav and J. Levin, “Liquidity Constraints and Imperfect Information in Subprime Lending” (2009) 99(1) American Economic Review 49.
60 Experian, “5.8m Are Credit Invisible, and 2.5m Are Excluded from Finance by Inaccurate Data. How Data and Analytics Can Include All”, available at https://bit.ly/38AS9QQ (last accessed 5 November 2020).
61 A. Fuster et al., “Predictably Unequal? The Effects of Machine Learning on Credit Markets”, available at https://ssrn.com/abstract=3072038; J. Jagtiani and C. Lemieux, “The Roles of Alternative Data and Machine Learning in Fintech Lending: Evidence from the Lending Club Consumer Platform”, available at https://ssrn.com/abstract=3178461 (both last accessed 5 November 2020).
62 Berg et al., “On the Rise of Fintechs”; J. Jagtiani and C. Lemieux, “Do Fintech Lenders Penetrate Areas that Are Underserved by Traditional Banks?”, available at https://ssrn.com/abstract=3178459 (last accessed 5 November 2020).
63 S. Barocas and A. Selbst, “Big Data's Disparate Impact” (2016) 104 Calif.L.Rev. 671, 677–93; J. Kleinberg et al., “Human Decisions and Machine Predictions” (2018) 133 Quarterly Journal of Economics 237.
64 S. Regan et al., “Model Behaviour: Nothing Artificial – Emerging Trends in the Validation of Machine Learning and Artificial Intelligence Models”, available at https://accntu.re/2HQcFzi; P. Bracke et al., “Machine Learning Explainability in Finance: An Application to Default Risk Analysis”, available at https://bit.ly/2TyIk0d (both last accessed 5 November 2020).
65 J. Danielsson, R. Macrae and A. Uthemann, “Artificial Intelligence and Systemic Risk”, available at http://dx.doi.org/10.2139/ssrn.3410948 (last accessed 5 November 2020).
66 M. Adelson, “A Journey to the Alt-A Zone: A Brief Primer on Alt-A Mortgage Loans”, available at https://bit.ly/2U5O2af (last accessed 5 November 2020).
67 J. Jagtiani, L. Lambie-Hanson and T. Lambie-Hanson, “Fintech Lending and Mortgage Credit Access”, available at https://doi.org/10.21799/frbp.wp.2019.47 (last accessed 5 November 2020), 1.
68 J. Hirshleifer, “The Private and Social Value of Information and the Reward to Inventive Activity” (1971) 61(4) American Economic Review 561.
69 Posner and Hynes, “Law and Economics of Consumer Finance”; Armour et al., Principles, 207–12.
70 O. Bar-Gill, “Algorithmic Price Discrimination: When Demand Is a Function of Both Preferences and (Mis)Perceptions” (2019) 86 U.Chi.L.Rev. 217; FCA, “Price Discrimination in Financial Services: How Should We Deal With Questions of Fairness?”, available at https://bit.ly/2W783jl (last accessed 5 November 2020).
71 R. Calo, “Digital Market Manipulation” (2014) 82 George Washington Law Review 995; G. Wagner and H. Eidenmüller, “Down by Algorithms? Siphoning Rents, Exploiting Biases and Shaping Preferences: The Dark Side of Personalized Transactions” (2019) 86 U.Chi.L.Rev. 581.
72 A. Kurakin, I. Goodfellow and S. Bengio, “Adversarial Machine Learning at Scale”, available at https://arxiv.org/abs/1611.01236 (last accessed 5 November 2020).
73 A. Acquisti, “The Economics and Behavioural Economics of Privacy” in J. Lane et al. (eds.), Privacy, Big Data, and the Public Good: Frameworks for Engagement (Cambridge 2014), 83–84.
74 M.S. Gal and O. Aviv, “The Competitive Effects of the GDPR” (2020) 16 Journal of Competition Law and Economics 349.
75 Berg et al., “On the Rise of Fintechs”, 34–35.
76 O. Khan, “Financial Exclusion and Ethnicity: An Agenda for Research and Policy Action”, available at https://bit.ly/31aJofv (last accessed 5 November 2020).
77 A. Roussi, “Kenyan Borrowers Shamed by Debt Collectors Chasing Silicon Valley Loans”, Financial Times, available at https://on.ft.com/2FtPY95 (last accessed 5 November 2020).
78 S. Deku, A. Kara and P. Molyneux, “Exclusion and Discrimination in the Market for Consumer Credit” (2016) 22 European Journal of Finance 941 (finding evidence of discrimination in consumer credit against non-White households in the UK).
79 Equality Act 2010, s. 13 (prohibition on direct discrimination).
80 J. Kleinberg et al., “Discrimination in the Age of Algorithms” (2018) 10 Journal of Legal Analysis 113.
81 Equality Act 2010, s. 19 (prohibition on indirect discrimination).
82 Barocas and Selbst, “Big Data's Disparate Impact”, 681–87.
83 US Bureau of Consumer Financial Protection, “Request for Information Regarding Use of Alternative Data and Modeling Techniques in the Credit Process”, available at https://bit.ly/2IMH7NK (last accessed 5 November 2020), 1186.
84 R. Bartlett et al., “Consumer Lending Discrimination in the Fintech Era”, available at https://ssrn.com/abstract=3063448 (last accessed 5 November 2020).
85 Fuster et al., “Predictably Unequal?”.
86 See also T.B. Gillis, “False Dreams of Algorithmic Fairness: The Case of Credit Pricing”, available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3571266 (last accessed 5 November 2020), 37–40.
87 P. Swire, “Financial Privacy and the Theory of High-Tech Government Surveillance” (1999) 77 Washington University Law Quarterly 461, 473–75.
88 On the subjective-objective dichotomy, see R. Calo, “The Boundaries of Privacy Harm” (2011) 86 Indiana Law Journal 1131.
89 Douglas Merrill, CEO of ZestAI, in Q. Hardy, “Just the Facts: Yes, All of Them” (2012), New York Times, available at https://nyti.ms/37QQmuj (last accessed 5 November 2020).
90 N.G. Packin and Y. Lev Aretz, “On Social Credit and the Right to Be Unnetworked” (2016) Columbia Business Law Review 339.
91 N. Aggarwal, “Big Data and the Obsolescence of Consumer Credit Reports”, available at https://www.law.ox.ac.uk/business-law-blog/blog/2019/07/big-data-and-obsolescence-consumer-credit-reports (last accessed 5 November 2020); S. Wachter and B. Mittelstadt, “A Right to Reasonable Inferences: Re-thinking Data Protection Law in the Age of Big Data and AI” (2019) 2 Columbia Business Law Review 494, 505–14.
92 F.A. Pasquale, The Black Box Society: The Secret Algorithms that Control Money and Information (Cambridge, MA 2015); S. Zuboff, The Age of Surveillance Capitalism (London 2019).
93 R. Calo, “Privacy and Markets: A Love Story” (2016) 91 Notre Dame Law Review 649; Bar-Gill, “Algorithmic Price Discrimination”.
94 R.B. Avery, K.P. Brevoort and G.B. Canner, “Does Credit Scoring Produce a Disparate Impact?” (2012) 40 Real Estate Economics S65, 2.
95 For related quantitative approaches to resolving value trade-offs in ML, see e.g. J. Kleinberg, S. Mullainathan and M. Raghavan, “Inherent Trade-offs in the Fair Determination of Risk Scores”, available at https://arxiv.org/abs/1609.05807 (last accessed 5 November 2020); E. Rolf et al., “Balancing Competing Objectives with Noisy Data: Score-based Classifiers for Welfare-aware Machine Learning”, available at https://arxiv.org/abs/2003.06740 (last accessed 5 November 2020).
96 I. Goldberg, “Privacy Enhancing Technologies for the Internet III: Ten Years Later”, in A. Acquisti et al. (eds.), Digital Privacy: Theory, Technologies and Practices (New York 2007).
97 “Tor Project”, available at https://www.torproject.org/ (last accessed 5 November 2020).
98 M. Gal and N. Elkin-Koren, “Algorithmic Consumers” (2017) 30(2) Harvard Journal of Law and Technology 309; FCA, “Applying Behavioural Economics at the Financial Conduct Authority”, available at https://bit.ly/33ghiit (last accessed 5 November 2020).
99 “Bloom”, available at https://bloom.co/ and “Mydex”, available at https://mydex.org/ (both last accessed 5 November 2020).
100 D.J. Solove, “Privacy Self-management and the Consent Dilemma” (2013) 126 Harv.L.Rev. 1880.
101 “Why Google Collects Data”, available at https://policies.google.com/privacy?hl=en-US#whycollect (last accessed 5 November 2020); K.J. Strandburg, “Monitoring, Datafication, and Consent: Legal Approaches to Privacy in the Big Data Context” in Lane et al., Privacy, 30.
102 O. Ben-Shahar and C. Schneider, More Than You Wanted to Know: The Failure of Mandated Disclosure (Princeton 2014).
103 A. Acquisti, “The Economics of Personal Data and the Economics of Privacy”, available at https://bit.ly/32JAaX6 (last accessed 5 November 2020), 25ff.
104 A. Kahn, “The Tyranny of Small Decisions: Market Failures, Imperfections and the Limits of Economics” (1966) 19 International Review for Social Sciences 23.
105 I.N. Cofone, “Nothing to Hide, but Something to Lose” (2020) 70 U.T.L.J. 64.
106 A. Seelye et al., “Computer Mouse Movement Patterns: A Potential Marker of Mild Cognitive Impairment” (2015) 1 Alzheimers Dement (Amst) 472.
107 O. Ben-Shahar, “Data Pollution” (2019) 11 Journal of Legal Analysis 104; S. Barocas and H. Nissenbaum, “Big Data's End Run Around Anonymity and Consent” in Lane et al., Privacy, 44–75.
108 B. Mittelstadt, “From Individual to Group Privacy in Big Data Analytics” (2017) 30 Philosophy & Technology 475.
109 K.J. Strandburg, “Free Fall: The Online Market's Consumer Preference Disconnect” (2013) University of Chicago Legal Forum 95.
110 L. Edwards and M. Veale, “Slave to the Algorithm” (2017) 16 Duke Law and Technology Review 18, 67 (discussing the “transparency fallacy”).
111 Gal and Elkin-Koren, “Algorithmic Consumers”, 329; Wagner and Eidenmüller, “Down by Algorithms?”, 588–89.
112 R. Binns and V. Gallo, “Data Minimization and Privacy Preserving Techniques in AI Systems”, available at https://bit.ly/31cVftq (last accessed 5 November 2020).
113 See, for example, “Hazy”, available at https://hazy.com/industries (last accessed 5 November 2020).
114 N. Statt, “Apple Updates Safari's Anti-tracking Tech with Full Third-party Cookie Blocking”, The Verge, available at https://bit.ly/2GXWTZ0 (last accessed 5 November 2020).
115 On policy trade-offs in the regulation of fintech more generally, see Y. Yadav and C. Brummer, “Fintech and the Innovation Trilemma” (2019) 107 Georgetown Law Journal 235.
116 Lynskey, Foundations, 76ff.
117 GDPR, arts. 35, 36.
118 GDPR, arts. 25, 28(1) (indirectly extending the obligation to data processors).
119 GDPR, art. 6(1)(f) and recital 47.
120 ICO, “How Do We Apply Legitimate Interests in Practice?”, available at https://bit.ly/32a8gEt (last accessed 5 November 2020).
121 A. Mantelero, “Comment to Article 35 and 36” in M. Cole and F. Boehm (eds.), Commentary on the General Data Protection Regulation (Cheltenham 2019); Article 29 Data Protection Working Party (A29), “Statement on the Role of a Risk-based Approach in Data Protection Legal Frameworks”, available at https://bit.ly/3nUYqzu (last accessed 5 November 2020).
122 Edwards and Veale, “Slave to the Algorithm”, 80.
123 GDPR, art. 35(7).
124 L. Bygrave, “Minding the Machine v2.0: The EU General Data Protection Regulation and Automated Decision-Making” in K. Yeung and M. Lodge (eds.), Algorithmic Regulation (Oxford 2019), 257.
125 J. Black, “Forms and Paradoxes of Principles-based Regulation” (2008) 3(4) Capital Markets Law Journal 425.
126 Article 29 Data Protection Working Party, “Guidelines on Data Protection Impact Assessment (DPIA) and Determining Whether Processing Is ‘Likely to Result in High Risk’ for the Purposes of Regulation 2016/679”, available at https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=611236; ICO, “Data Protection Impact Assessments”, available at https://bit.ly/3nJlPUH and ICO, “Guidance on AI and Data Protection”, available at https://bit.ly/35ZckJ3 (all last accessed 5 November 2020).
127 See Surveillance Camera Commissioner, “Guidance: Data Protection Impact Assessments for Surveillance Cameras”, https://bit.ly/2SNQeTk (last accessed 5 November 2020).
128 See GDPR, Recital 75 and art. 35(3)(a); A29, “Guidelines on DPIA”, 8–12.
129 ICO, “Guidance on AI and Data Protection” (discussing different metrics of statistical accuracy and model performance).
130 C. Dwork et al., “Calibrating Noise to Sensitivity in Private Data Analysis” in S. Halevi and T. Rabin (eds.), Theory of Cryptography: Third Theory of Cryptography Conference (Berlin and New York 2006); M. Kearns and A. Roth, The Ethical Algorithm (Oxford 2020), 22–47.
131 For initial guidance, see ICO, “General Data Protection Regulation (GDPR) FAQs for Small Financial Service Providers”, available at https://bit.ly/3iRWAf9 (last accessed 5 November 2020).
132 CONC 5.2A.20R to CONC 5.2A.25G; CONC 5.5A.21R to CONC 5.5A.26G.
133 ICO, “ICO Takes Enforcement Action against Experian after Data Broking Investigation”, available at https://bit.ly/3oBu2um (last accessed 5 November 2020).
134 ICO, “Action We've Taken”, available at https://ico.org.uk/action-weve-taken/ (last accessed 5 November 2020).
135 For a related discussion in the US context, see E. Janger, “Locating the Regulation of Data Privacy and Data Security” (2010) 5 Brooklyn Journal of Corporate, Financial and Commercial Law 97.
136 The FCA can impose fines of up to 20 per cent of a firm's revenue (plus disgorgement); the ICO can impose fines up to 4 per cent of revenue (or EUR 20 million, whichever is higher).
137 Building on ICO and FCA, “Memorandum of Understanding Between the Information Commissioner and the Financial Conduct Authority”, available at https://bit.ly/2H2ujVS (last accessed 5 November 2020).
138 D. Hirsch, “The Law and Policy of Online Privacy: Regulation, Self-regulation or Co-regulation?” (2010) 34 Seattle University Law Review 439.
139 GDPR, art. 40.
140 Lending Standards Board, “The Standards of Lending Practice”, available at https://bit.ly/2STkgW8 (last accessed 5 November 2020).
141 L. Gambacorta et al., “How Do Machine Learning and Non-traditional Data Affect Credit Scoring? New Evidence from a Chinese Fintech Firm” (2019) BIS Working Papers No. 834, available at https://www.bis.org/publ/work834.pdf (last accessed 5 November 2020), 19–20.
142 On the commodification objection, see M.J. Sandel, What Money Can't Buy: The Moral Limits of Markets (London 2012).
143 GDPR, art. 9.
144 D. Awrey, “Regulating Financial Innovation: A More Principles-based Approach?” (2011) 5 Brooklyn Journal of Corporate, Financial and Commercial Law 273.
145 Mittelstadt, “From Individual to Group Privacy”.
146 Bygrave, “Minding the Machine”, 257.