«Systems So Perfect That No One Will Need to Be Good»? RegTech and the "Human Factor"

by Federico Panisi and Andrea Perrone *

Keywords: Information technology; Supervisory process; Human factor; RegTech policy

RegTech - the use of information technology in the context of supervisory processes - is nothing new, but only a further step in the interactive evolution of finance and information technology. By reducing information asymmetries and their related costs, RegTech allows financial institutions to comply with regulation more efficiently and supervisory authorities to enhance their capacity for deterrence. However, RegTech also presents important perils for the financial system, namely: (1) risks related to technology vulnerability (operational and cybersecurity risks) and (2) automation biases that weaken overall personal responsibility and decision-making effectiveness and encourage financial institutions to privilege organizational self-interest over sound management. Moreover, RegTech raises imbalances in resource allocation effectiveness between financial institutions and supervisory authorities. While the former can easily invest in RegTech, the latter are bound by operative constraints and funding discontinuities. In the presence of these issues, the most efficient policy for addressing the evolution underway in supervisory processes is investing in the "human factor." By (re-)striking the balance between technology and humans, the perils of RegTech are contained for the good of the whole financial system: supervisory processes are subjected to accountable safeguards against technological breakdowns, and human judgment and personal responsibility in decision-making are preserved.


1. The use of technology in supervisory processes: the benefits of RegTech

Finance and information technology (IT) have always been closely connected. In the aftermath of the Global Financial Crisis (GFC), FinTech - the new financial industry that applies technology to improve financial activities - is moving this relationship an important step forward[1]. In recent years, as part of this effort, the word RegTech (a contraction of "Regulatory Technology") has been adopted by the industry to refer to applications «focused on developing common technological solutions to regulatory processes»[2]. Simply put, RegTech consists of the use of cutting-edge digital technologies in the context of financial monitoring, reporting, and compliance[3].

The origins of RegTech have already been analyzed by some legal scholars. They have clearly highlighted that, although RegTech developed within the traditional financial industry in response to post-GFC regulation, it is now attracting the attention of supervisory authorities engaged in managing genuinely complex and dynamic financial systems[4]. This adoption of RegTech by public authorities has recently been referred to by the term "Supervisory Technology" or SupTech[5]. Legal scholarship has highlighted that RegTech is both a great opportunity and an immediately necessary "must do" for supervisory authorities if they are to effectively exercise their powers and accomplish their mandate to act as financial watchdogs. Indeed, it has been stated that «just as finance is rapidly becoming automated, so too must financial regulation»[6] and, consequently, that «regulators must invest heavily in the development of proportionate, data-driven regulation in order to deal effectively with innovation without compromising their mandate»[7].

The benefits of RegTech for financial institutions have been clearly delineated in the literature. They mainly consist of opportunities to implement financial regulation while reducing the size of compliance staff (as well as overlapping compliance "silos," in favor of more flexibility within the business organization) and fostering an alignment between business operations and compliance functions[8]. Moreover, it has also been stressed that RegTech enhances risk management, ensuring a qualitative improvement of compliance processes, and provides financial institutions with consistent enterprise-wide datasets, while decreasing the overall amount of manual paperwork and the risks it carries. Of course, such datasets further reduce information asymmetries across business units and provide those in the organization performing data mining and analytics with accurate and reliable data that can be leveraged to correctly automate many operational decisions. In addition, the opportunity offered by RegTech to better access (and assess) data, as well as to constantly reorganize it, is essential to tailored reporting and, consequently, to the presentation of «different slices of [a financial institution] so that it conforms with different regulator preferences, including financial accounting regimes, incident-report requirements, and electronic-filing formats»[9].

RegTech also reduces information asymmetries between financial institutions and supervisory authorities and offers the latter «the potential of continuous monitoring capacity, providing close to real-time insights, […] into the functioning of the markets […] looking forward to identify problems in advance rather than simply taking enforcement action after the fact»[10]. In other words, equipped with technologies that enhance their ability to scrutinize business operations and risk management and that help them to understand market operations, abuses, and risks, supervisory authorities can exercise a more efficient, effective, and, most of all, prophylactic deterrence function[11]. In such a context, on the one hand, market integrity is better safeguarded, and, on the other, the potential for supervisory actions (and the related misconduct costs borne by financial institutions) substantially decreases[12].

2. The risks of RegTech: technology vulnerability and automation bias

The above-mentioned benefits are just one side of the RegTech coin. The other side relates to certain risks it carries. Two different areas of risks can be outlined: one related to IT itself, and the other related to the consequences of constant and intense reliance on IT in decision-making processes.

2.1 Technology vulnerability

As with any other technology, IT is a product of human creativity and, consequently, it is imperfect. Such imperfection makes IT highly vulnerable both to errors and to hacking attacks. Consequently, in the context of IT, both operational and cybersecurity risks should be carefully addressed.

More specifically, IT operational risk presents itself when, for any reason, errors are introduced into or develop within any aspect of a computer system, from its design to its testing to its operation within the organization in which it has been implemented. Because IT errors happen very frequently, it «is widely acknowledged that there is no such thing as flawless software» or, in other words, that «software always has bugs»[13]. Moreover, the probability that an IT system is affected by bugs (or other errors) increases proportionally with the complexity of the system. For this reason, some computer experts have highlighted that current IT - RegTech included, of course - is, «in many ways, far less reliable and more prone to bugs than it was in the past»[14]. Moreover, it is well known that, if IT operational risks materialize, their consequences can be «catastrophic», depending both on the seriousness of the bugs and the interconnections that link the affected code with other computer systems[15].

Likewise, IT raises cybersecurity issues because computer systems are vulnerable to hacking[16]. Certainly, efforts to resist ill-intentioned third parties are always ongoing, but the threat that hacking will seriously disrupt financial systems is constantly rising, as can be seen especially in the significant and damaging security breaches and data thefts resulting from computer hackers that are reported daily[17]. As RegTech increases both overall reliance on computers and IT interconnections within financial markets, the challenge of implementing secure safeguards against cyberattacks becomes ever more important. It has been pointed out that this challenge confronts both financial institutions and supervisory authorities. Consequently, it is likely that their alignment of interests will be leveraged to ensure that all financial players can benefit from the best firewalls available[18].

2.2 Decision-making effectiveness and automation bias

IT is far from being completely neutral from a human point of view. Indeed, designing IT systems always implies evaluative choices, such as those regarding the data and descriptive features to use when composing a dataset or the set of assumptions defining the model selection criteria of machine learning algorithms[19]. These choices inevitably embed both the cognitive biases and other "heuristic" subjectivities that unconsciously characterize the human mind and that generally «permit [it] to make efficient decisions even in situations of uncertainty»[20]. Although these biases and subjectivities are masked by the complex infrastructure of IT, they directly affect the outcomes of the overall process. Moreover, understanding how much the final outcomes are biased is far from easy, since this depends on the likelihood of being able to understand the procedures by which a computer system arrives at a particular outcome. In some cases - commonly referred to as "black boxes" - this can be extremely hard, if not impossible, even for IT experts[21]. As has been recently pointed out, an emphasis on transparency - the remedy most often claimed to be the best way to deal with situations of complexity and opacity - can help but does not solve this problem[22].
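The point about evaluative choices embedded in seemingly neutral systems can be made concrete with a deliberately simplified sketch. All transactions, features, weights, and thresholds below are hypothetical and invented for illustration; they are not drawn from any actual RegTech product. The sketch shows how two equally defensible designer choices about feature weighting lead an automated monitoring tool to flag entirely different transactions:

```python
# Hypothetical illustration: the designer's choice of features and weights
# determines which transactions a monitoring system flags, embedding
# evaluative judgments in an apparently objective tool.

transactions = [
    {"id": 1, "amount": 9_500,  "cross_border": True,  "new_client": False},
    {"id": 2, "amount": 40_000, "cross_border": False, "new_client": True},
    {"id": 3, "amount": 1_200,  "cross_border": True,  "new_client": True},
]

def risk_score(tx, weights):
    """Weighted sum over the designer-selected features."""
    return (weights["amount"] * (tx["amount"] / 10_000)
            + weights["cross_border"] * tx["cross_border"]
            + weights["new_client"] * tx["new_client"])

def flagged(weights, threshold):
    """IDs of transactions whose score meets the designer-chosen threshold."""
    return [tx["id"] for tx in transactions
            if risk_score(tx, weights) >= threshold]

# Design A emphasizes transaction size; Design B emphasizes counterparty
# profile. Both are plausible, yet they flag different transactions.
design_a = {"amount": 1.0, "cross_border": 0.5, "new_client": 0.5}
design_b = {"amount": 0.2, "cross_border": 1.0, "new_client": 1.0}

print(flagged(design_a, threshold=2.0))  # flags only transaction 2
print(flagged(design_b, threshold=2.0))  # flags only transaction 3
```

Neither design is "wrong": each reflects a prior judgment about what counts as risky, and that judgment, once buried inside the system, silently shapes every downstream supervisory outcome.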

Therefore, if not accompanied by adequate awareness and properly managed, over-reliance on IT can deeply distort human judgment: decision-makers tend to completely disregard these cognitive biases and subjectivities and to assume that computer-generated solutions are always correct. Alternative or contradictory information is not taken into consideration, and a further, secondary bias arises: the so-called «automation bias»[23]. Empirical studies have demonstrated that this bias is «most pronounced [both] when […] technology fails to flag a problem» and when computer-prompted outcomes comport «with the financial interest of the decisionmaker», supporting the opinion that automation biases both weaken personal responsibility and decision-making effectiveness and affect organizational choices in privileging self-interest over sound management[24].

Automation biases in supervisory processes can cause financial institutions to inadvertently underestimate risks as well as intentionally opt for excessively risky decisions, justifying them by invoking complex algorithms and computer-generated solutions as authoritative support. However, algorithms cannot be considered accountable per se[25] and, consequently, can obscure «the accountability of the decisions they channel [driving the risk of masking] important concerns with a veneer of transparency». Of course, this problem expands for «regulators outside the firm, who frequently lack the resources or vantage to peer inside buried decision processes and must instead rely on the resulting conclusions about risks and safeguards offered them by the parties they regulate»[26]. The result is an increased risk of both "moral hazard" and "gaming the system".

It must also be stressed that supervisory authorities are not exempt from the risk that automation biases will negatively affect their decision-making outcomes. Just as these biases affect financial firms, the potential misuse of technology may support (or justify) overly deterrent policies that compromise financial systems. Indeed, the negative effects of automation biases can burden financial institutions with unnecessary deterrent pressures, ultimately harming both market efficiency and profitability. In addition, there is a risk that adopting RegTech will induce supervisory authorities to act beyond the scope of their mandate in an arbitrary or capricious manner; consequently, a strict application of rule-of-law principles (for instance, "reason-giving") is imperative[27].

3. RegTech investments and imbalances in resource allocation effectiveness between financial institutions and supervisory authorities

In addition to introducing important risk-related concerns into supervisory processes, RegTech also raises other concerns about resource allocation effectiveness. As can be easily understood, the implementation of complex RegTech architectures inherently requires considerable financial resources. Unfortunately, there is a significant imbalance between the abilities of financial institutions and supervisory authorities to afford these costs.

Indeed, on the one hand, financial institutions are private, profit-driven organizations in which «those in control […] - the board and its executives - have strong incentives to maintain or strengthen operations» and, consequently, are quite willing to invest massive financial resources in RegTech, with the expectation that technological solutions will transform the cost burden today into higher profit tomorrow[28]. On the other hand, supervisory authorities are public institutions subject to political processes; their capacity to invest is strongly influenced both by operative constraints and funding discontinuities. Consequently, they experience many more difficulties in raising adequate funding for proper IT equipment. For this reason, they inevitably lag behind in the competition to employ RegTech. Moreover, supervisory authorities face a funding dilemma: if the available resources are insufficient, their efforts to enhance their deterrent abilities may be completely unsuccessful, but if the expenditures become excessive, the high expectations for RegTech may prove altogether illusory. Indeed, the promise of increased efficiency (in terms of expenditure savings) is immediately dampened by the cost burden on the public which must support the investments[29].

For these reasons, supervisory authorities are often unlikely to develop RegTech projects. The development of IT systems requires the commitment of a variety of extremely skilled teams. As has been highlighted in the literature, supervisory authorities currently lack human resources with this kind of expertise within their staffs; consequently, for IT systems to become a reality, such teams need to be formed ex novo. However, operative constraints and funding discontinuities constitute insuperable obstacles to doing so, at least until adequate non-financial incentives emerge to encourage engineers and IT experts to choose careers working for supervisory authorities[30].

It is conceivable that such funding limitations would not impede supervisory authorities from opting for the cheaper and easier alternative of purchasing RegTech products from third-party vendors. Unfortunately, this would create the risk of conflicts of interest with the third-party vendors themselves, who could exploit the information gained working for supervisory authorities to favor their financial institution clients[31].

In addition, operative constraints and funding discontinuities also burden supervisory authorities with resource allocation delays that can compromise the effectiveness of their start-up investments, as well as their sustainability, which is already challenged by the high ongoing costs that IT systems inevitably imply for their management, maintenance, and updating. Such delays further exacerbate the imbalance between supervisory authorities and financial institutions, which, in contrast, can quickly adapt their resource allocation decisions to changes in their needs and to the evolution of technology[32].

In this context, a further consideration must be stressed. To reduce their resource allocation effectiveness issues and to better address the above-mentioned imbalances, supervisory authorities can surely benefit from collaboration with financial institutions. Indeed, it is worth considering that supervisory authorities can foster, for instance, the adoption of those technologies - such as Distributed Ledger Technologies (DLTs) - that align their interest in enhanced oversight through supervisory processes with the increased efficiency always desired by the financial industry. However, it would be unrealistic to suggest that collaboration and cooperation with financial institutions is a definitive solution to these problems, as it is very likely that underlying technologies would be applied to only a few current supervisory processes.

To reduce these imbalances further, cooperation among the supervisory authorities themselves - if not some form of straightforward centralization - would seem necessary. This cooperation may result in a decisive improvement of the deterrent powers of existing supranational authorities. If this occurs, the greater financial soundness of these authorities could be leveraged to achieve better RegTech results than each one could achieve on its own.

4. Defining RegTech policy: investing in the “human factor” to (re-)strike the balance between technology and humans in supervisory processes

The picture of RegTech resulting from the analysis so far outlined illustrates both its benefits and perils, raising questions about which policy prescriptions might best address current supervisory processes while maximizing the benefits and minimizing the perils. As the focus on the resource allocation effectiveness issues affecting supervisory authorities highlights, such questions are even more crucial when the huge amount of funding needed for RegTech is absent. Indeed, the prevention of risks can greatly reduce the need for ex-post responses, and the saved financial resources could properly be re-allocated in order to achieve other goals.

To correctly answer the questions about policy, the current mainstream paradigm of "automated compliance/automated supervision" must not ignore the most important part of the equation; specifically, that financial processes are, first and foremost, inherently human: regardless of how much financial events can be rationally explained, foreseen, and computed, they continuously unfold in a world that is very often characterized by emotional, flawed, whimsical, and random behaviors[33]. Consequently, even in a context in which «[i]t is increasingly clear [that] success depends in large part on the extent to which technology is used to support decisions as well as automate them»[34] and overall financial reliance on IT increases, humans are needed more than ever. The brain, «with its billions of neurons and trillions of synaptic connections, [still] remains one of the most sophisticated and powerful of all analytical machines»[35], and humans are the only entities that can effectively deal with both sides of financial processes. Thus, the main priority of a policy aimed at addressing the ongoing evolution of supervisory processes must be investing in the "human factor" to (re-)strike the balance between technology and human beings[36].

Implementing this priority inevitably involves the pursuit of two goals, both decisive for the final success of RegTech. The first is the immediate need to hire (and train) IT-competent personnel who can properly employ sophisticated digital technologies. The second is making use of regulation to promote the development of ex-ante personal responsibility among previously identified, tech-savvy individuals operating within the world of RegTech, so that technology can be used mindfully while pursuing supervisory aims. The regulatory policy referred to as «attention regulation» has been shown to be quite effective in achieving this outcome[37]. At its core, attention regulation seeks to place «responsibility for thinking about control systems on particular officers, who must articulate the reasoning behind choices made in structuring programs and attest to their adequacy in public documents»[38], and grounding policy efforts in this regulatory model can be expected to yield several benefits.

First, the presence of IT personnel specifically assigned to the supervision of RegTech processes ensures the presence of accountable "analog" safeguards against the innate drawbacks of technology[39]. More specifically, both constant oversight and periodic evaluations of the technology adopted ensure that the risk of technological breakdowns caused by the above-illustrated operational failures and cyberattacks is contained. In addition, this policy turns out to be highly beneficial for decision-making processes. Indeed, empowering human judgment in IT processes rather than eliminating it reduces the risk that technology will become the decision-maker instead of fulfilling its true role of supporting human decisions. Such empowerment also decreases the risk that automation biases will negatively affect the final outcomes of supervisory processes. Investing in the human presence within supervisory processes ensures that final decisions can benefit from the cognitive and practical skills that are uniquely possessed by human beings. Therefore, although inherently computer-based, decisions prompted by RegTech applications can still incorporate the soundness and effectiveness ensured by human reasoning, leading to better results[40].

For the reasons discussed above, investing in the "human factor" in the implementation of RegTech policy should be supported especially when there is a lack of funding for RegTech and resource allocation effectiveness issues emerge, as was described above in the case of supervisory authorities. At first glance, both the difference in quality between excellent and average engineers and the previously mentioned current lack of non-financial incentives to attract engineers to work for supervisory authorities raise questions about whether supervisory authorities will be able to attract, motivate, and retain a sufficient number of highly skilled IT personnel. However, as payoff is just one criterion governing choices in real daily life, these questions should be addressed more in relative than in absolute terms. Indeed, in many instances a fair equivalence between remuneration and the services provided[41] proves sufficient to attract, motivate, and retain those who, for any reason, might prefer public service to a career in a private organization.

The regulatory model of attention regulation suggests that the simple presence of tech-savvy individuals is not sufficient to fully preserve the accountability of supervisory processes. Automation biases are most effectively avoided only if computer-based decisions are sufficiently reviewed and challenged, so that decision-makers can achieve an adequate degree of cognitive accountability for their choices[42]. For this reason, specific actions must also address how decisions develop within supervisory processes. Specifically, policy efforts must encourage the establishment of procedural safeguards to ensure the reviewability of decisions as they are made. Reviewability sessions can make RegTech-based outcomes subject, at various stages, to multilateral dialogical challenges, which allow different points of view to constructively confront each other concerning final decisions and contribute to their correction and ultimate betterment. In such a context, reporting within supervisory processes becomes even more essential to wholly understanding how computer systems prompt or even automate decisions, as well as how decisions themselves unfold and are reviewed by humans[43].

In addition to ensuring the accountability of supervisory processes, choice reviewability accomplished by means of dialogical discussion and reporting increases transparency with respect to the use of RegTech applications within supervisory processes. More transparency turns out to be very beneficial both for financial institutions and for supervisory authorities because it helps to reveal and clearly identify potential pathologies affecting decision-making and, consequently, its outcomes[44]. Increased transparency allows financial institutions to better correct and reduce inefficiencies in their processes. Similarly, supervisory authorities can take advantage of it as part of their focus on the culture of the businesses they regulate and on the behavior underlying decisions. Since both business culture and behavior generally presage certain risks, supervisory authorities, by understanding them, can more easily detect risks and prevent negative outcomes from being realized.

5. Conclusion

This paper focuses on RegTech, the 21st century version of the adoption of IT in supervisory processes both by financial institutions and supervisory authorities, and highlights the benefits it brings in monitoring, reporting, and compliance activities (mainly, a reduction in costs and information asymmetries).

However, the paper also warns of the illusory dream that RegTech can create «systems so perfect that no one will need to be good»[45]. Indeed, RegTech comes with perils; namely, those related to the vulnerability of technology (both operational failures and cyberattacks) and the risk that automation biases induced by over-reliance on IT will compromise both management soundness and the effectiveness of decision-making processes, with potentially detrimental effects for the whole financial system.

In addition, RegTech raises significant resource allocation effectiveness issues related to the evident imbalances between well-funded and investment-oriented financial institutions and supervisory authorities, whose willingness to invest in RegTech is limited by both operative constraints and funding discontinuities. As the paper underlines, these imbalances highlight important questions about how supervisory authorities will be able to keep up with the evolution underway in supervisory processes, for instance about their capacity to employ personnel with a sufficient amount of IT expertise in the years to come.

These questions lead commentators to ask which policy measures best address RegTech, emphasizing its benefits while containing its risks. The paper proposes that investing in the "human factor" is both a priority and a quite effective policy because it (re-)strikes the balance between technology and human beings, whose unique skills are becoming more needed and important in the current tech-driven financial context. Moreover, the paper outlines that this policy option aims not only at hiring and training more engineers and computer experts, but also at promoting their personal responsibility for containing the risk of technological breakdowns, and at preserving decision-making effectiveness and the final accountability of supervisory processes. To achieve this goal, multilateral dialogical discussion and reporting are essential. Consequently, these must be incentivized instead of reduced or eliminated.

Thanks to these suggested policy efforts, it can be expected that the risks of RegTech will be sufficiently contained. Although the financial system as a whole remains inherently imperfect, it will be able to benefit from these further steps in its evolving interaction with information technology.

* Federico Panisi - Ph.D. Student, Università degli Studi di Brescia. E-mail: f.panisi@unibs.it.
Andrea Perrone - Full Professor of Corporate Law and Securities Regulation, Università Cattolica del Sacro Cuore. E-mail: andrea.perrone@unicatt.it.

The paper is fully co-authored. However, for any relevant legal purposes, sections 2, 3, and 4 should be credited to Federico Panisi, while sections 1 and 5 should be credited to Andrea Perrone.


  • 1) P. Schueffel, Taming the Beast: A Scientific Definition of FinTech, in 4 Journal of Innovation Management (JIM) (2016), 4, 45; D.W. Arner, J. Barberis, R.P. Buckley, The Evolution of FinTech: A New Post-Crisis Paradigm, 47 Geo. J. Int'l L. (2016), 1271 ff.
  • 2) Institute of International Finance, REGTECH: Exploring Solutions for Regulatory Challenges, (October 2015), https://www.iif.com/system/files/regtech-exploring-solutions-for-regulatory-challenges.pdf, 2.
  • 3) D.W. Arner, J. Barberis, R.P. Buckley, FinTech, RegTech, and the Reconceptualization of Financial Regulation, 37 Nw. J. Int'l L. & Bus. (2017), 373; D.W. Arner, J. Barberis, R.P. Buckley, FinTech and RegTech in a Nutshell, and the Future in a Sandbox, (2017), https://www.cfapubs.org/doi/pdf/10.2470/rfbr.v3.n4.1, 2.
  • 4) D.W. Arner, J. Barberis, R.P. Buckley, FinTech and RegTech in a Nutshell, (n. 3), 14 ff.; L.G. Baxter, Adaptive Financial Regulation and RegTech: A Concept Article on Realistic Protection for Victims of Bank Failures, 66 Duke L.J. (2016), 573; A.G. Haldane, Managing global finance as a system, (2014), https://www.bis.org/review/r141030f.pdf, 3.
  • 5) Basel Committee on Banking Supervision, Consultative Document. Sound Practices: Implications of fintech developments for banks and bank supervisors, Bank for International Settlements, (August 2017), https://www.bis.org/bcbs/publ/d415.pdf, 34; Financial Stability Board, Financial Stability Implications from FinTech. Supervisory and Regulatory Issues that Merit Authorities' Attention, (June 2017), http://www.fsb.org/wp-content/uploads/R270617.pdf, 34.
  • 6) L.G. Baxter, (n. 4), 598.
  • 7) D.W. Arner, J. Barberis, R.P. Buckley, FinTech and RegTech in a Nutshell, (n. 3), 16.
  • 8) K.A. Bamberger, Technologies of Compliance: Risk and Regulation in a Digital Age, 88 Tex. L. Rev. (2010), 685 ff.
  • 9) K.A. Bamberger, (n. 8), 702.
  • 10) D.W. Arner, J. Barberis, R.P. Buckley, FinTech, RegTech, (n. 3), 382.
  • 11) O.H. Dombalagian, Preserving Human Agency in Automated Compliance, 11 Brook. J. Corp. Fin. & Com. L. (2016), 71 ff.
  • 12) D.W. Arner, J. Barberis, R.P. Buckley, FinTech, RegTech, (n. 3), 383.
  • 13) A. Walch, The Bitcoin Blockchain as Financial Market Infrastructure: A Consideration of Operational Risk, 18 N.Y.U. J. Legis. & Pub. Pol'y (2015), 856.
  • 14) C. Le Goues, S. Forrest, W. Weimer, The Case for Software Evolution, (2010), http://www.cs.cmu.edu/~clegoues/docs/legoues-foser10.pdf, 205.
  • 15) A. Walch, (n. 13), 857.
  • 16) D.E. Bambauer, Ghost in The Network, 162 U. Pa. L. Rev. (2014), 1011 ff.
  • 17) A. Walch, (n. 13), 859.
  • 18) L. Enriques, Financial Supervisors and RegTech: Four Roles and Four Challenges, (2018), available at https://ssrn.com/abstract=3087292, 9.
  • 19) J.D. Kelleher, B. Mac Namee, A. D'Arcy, Fundamentals of Machine Learning for Predictive Data Analytics: Algorithms, Worked Examples, and Case Studies, Cambridge MA, The MIT Press, 2015, 17.
  • 20) K.A. Bamberger, Regulation as Delegation: Private Firms, Decisionmaking, And Accountability in the Administrative State, 56 Duke L.J. (2006), 411.
  • 21) J.D. Kelleher, B. Mac Namee, A. D'Arcy, (n. 19), 522.
  • 22) J.A. Kroll, J. Huey, S. Barocas, E.W. Felten, J.R. Reidenberg, D.G. Robinson, H. Yu, Accountable Algorithms, 165 U. Pa. L. Rev. (2017), 23.
  • 23) K.A. Bamberger, (n. 8), 670; M.L. Cummings, Automation and Accountability in Decision Support System Interface Design, 32 The Journal of Technology Studies (2006), 25.
  • 24) K.A. Bamberger, (n. 8), 712.
  • 25) J.A. Kroll, J. Huey, S. Barocas, E.W. Felten, J.R. Reidenberg, D.G. Robinson, H. Yu, (n. 22).
  • 26) K.A. Bamberger, (n. 8), 727.
  • 27) C. Coglianese, D. Lehr, Regulating by Robot: Administrative Decision Making in the Machine-Learning Era, 105 Geo. L.J. (2017), 1207.
  • 28) R. Van Loo, Rise of the Digital Regulator, 66 Duke L.J. (2017), 1302.
  • 29) R. Van Loo, (n. 28), 1302.
  • 30) L. Enriques, (n. 18), 5 ff.
  • 31) L. Enriques, (n. 18), 5.
  • 32) R. Van Loo, (n. 28), 1302 ff.
  • 33) O.H. Dombalagian, (n. 11), 85.
  • 34) K.A. Bamberger, (n. 8), 737.
  • 35) T.C.W. Lin, Compliance, Technology, and Modern Finance, 11 Brook. J. Corp. Fin. & Com. L. (2016), 180.
  • 36) O.H. Dombalagian, (n. 11), 85; T.C.W. Lin, (n. 35), 182.
  • 37) K.A. Bamberger, (n. 8), 737; K.A. Bamberger, (n. 20), 447 ff.
  • 38) K.A. Bamberger, (n. 20), 449.
  • 39) T.C.W. Lin, (n. 35), 180.
  • 40) K.A. Bamberger, (n. 8), 736 ff.
  • 41) A. Perrone, The Just Price Doctrine and Contemporary Contract Law: Some Introductory Remarks, Rivista ODC (2013), available at /edizioni/2013/3/saggi/perrone-the-just-price-doctrine-and-contemporary-contract-law-some-introductory-remarks/, 14.
  • 42) K.A. Bamberger, (n. 20), 450.
  • 43) K.A. Bamberger, (n. 8), 737 f.
  • 44) K.A. Bamberger, (n. 20), 450.
  • 45) T.S. Eliot, The Complete Poems and Plays of T.S. Eliot, London, Faber and Faber, 1969, 159.