
19 - From Corporate Governance to Algorithm Governance

Artificial Intelligence as a Challenge for Corporations and Their Executives

from Part VI - Responsible Corporate Governance of AI Systems

Published online by Cambridge University Press:  28 October 2022

Edited by Silja Voeneky (Albert-Ludwigs-Universität Freiburg, Germany), Philipp Kellmeyer (Medical Center, Albert-Ludwigs-Universität Freiburg, Germany), Oliver Mueller (Albert-Ludwigs-Universität Freiburg, Germany), and Wolfram Burgard (Technische Universität Nürnberg)

Summary

This chapter explores the changes that AI brings about in corporate law and corporate governance, especially in terms of the challenges it poses for corporations. The law scholar Jan Lieder argues that while there is the potential to enhance the current system, there are also risks of destabilization. Although algorithms are already being used in the boardroom, lawmakers should not legally recognize e-persons as directors and managers. Rather, academia should evaluate the effects of AI on the corporate duties of boards and their liabilities. By critically examining three main topics (algorithms as directors, AI in the management board, and AI in the supervisory board), the author suggests the need for transparency in a company’s practices regarding AI, for awareness-raising and the enhancement of overall algorithm governance, as well as the need for boards to report on their overall AI strategy and the ethical guidelines relating to the responsibilities, competencies, and protective measures they have established. Additionally, the author argues that a reporting obligation should require the boards to deal with questions of individual rights and to explain how they relate to them.

Type: Chapter
Information: The Cambridge Handbook of Responsible Artificial Intelligence: Interdisciplinary Perspectives, pp. 331–346
Publisher: Cambridge University Press
Print publication year: 2022
Creative Commons
This content is Open Access and distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (CC BY-NC-ND 4.0) https://creativecommons.org/cclicenses/

I. Introduction

Every generation has its topic: The topic of our generation is digitalization. At present, we are all witnessing the so-called industrial revolution 4.0.Footnote 1 This revolution is characterized by the use of a whole range of new digital technologies that can be combined in a variety of ways. Keywords are self-learning algorithms, Artificial Intelligence (AI), autonomous systems, Big Data, biometrics, cloud computing, Internet of Things, mobile internet, robotics, and social media.Footnote 2

The use of digital technologies challenges the law and those applying it. The range of questions and problems is tremendously broad.Footnote 3 Widely discussed examples are self-driving cars,Footnote 4 the use of digital technologies in corporate finance, credit financing and credit protection,Footnote 5 the digital estate,Footnote 6 or online dispute resolution.Footnote 7 In fact, digital technologies challenge the entire national legal system, including public and criminal law as well as EU and international law. Some even say we may face ‘the beginning of the end for the law’.Footnote 8 Yet this is not the end, but rather the time for a digital initiative. This chapter focuses on the changes that AI brings about in corporate law and corporate governance, especially in terms of the challenges for corporations and their executives.

From a conceptual perspective, AI applications will have a major impact on corporate law in general and corporate governance in particular. In practice, AI poses a tremendous challenge for corporations and their executives. As algorithms have already entered the boardroom, lawmakers must consider whether to legally recognize e-persons as directors and managers. The applicable law must deal with the effects of AI on the corporate duties of boards and their liabilities. The interdependencies of AI, the delegation of leadership tasks, and the business judgement rule as a safe harbor for executives are of particular importance. A further issue to be addressed is how AI will change the decision-making process in corporations as a whole. This topic is closely connected with the board’s duties regarding Big Data and Data Governance as well as the qualifications and responsibilities of directors and managers.

By referring to AI, I mean information technology systems that reproduce or approximate various cognitive abilities of humans.Footnote 9 In the same breath, we need to distinguish between strong AI and weak AI. Currently, strong AI does not exist.Footnote 10 There is no system that truly imitates a human being, let alone a so-called superintelligence. Only weak AI is applied today. These are single technologies for smart human–machine interactions, such as machine learning or deep learning. Weak AI focuses on solving specific application problems based on methods from mathematics and computer science, whereby the systems are capable of self-optimization.Footnote 11

By referring to corporate governance, I mean the system by which companies are directed and controlled.Footnote 12 In continental European jurisdictions, such as Germany, a dual board structure is the prevailing system, with a management board running the day-to-day business of the firm and a supervisory board monitoring the business decisions of the management board. In Anglo-American jurisdictions, such as the United States (US) and the United Kingdom (UK), the two functions of management and supervision are combined within one unitary board – the board of directors.Footnote 13

II. Algorithms As Directors

The first question is, “Could and should algorithms act as directors?” In 2014, newspapers reported that a venture capital firm had just appointed an algorithm to its board of directors. The Hong Kong-based VC firm Deep Knowledge Ventures was said to have appointed an algorithm called Vital (an abbreviation of Validating Investment Tool for Advancing Life Sciences) to serve as a director with full voting rights and full decision-making power over corporate measures.Footnote 14 In fact, Vital only had observer and adviser status with regard to the board members, who are all natural persons.Footnote 15

Under German law, according to sections 76(3) and 100(1)(1) AktG,Footnote 16 the members of the management board and the supervisory board must be natural persons with full legal capacity. Not even corporations are allowed to serve as board members. That means that, in order to appoint algorithms as directors, the law would have to be changed.Footnote 17 The lawmaker could indeed legally recognize e-persons as directors. However, it should not do so, because there is a reason for the exclusion of legal persons and algorithms under German law: both lack personal liability and personal accountability for the management and the supervision of the company.Footnote 18

Nevertheless, the European Parliament adopted a resolution with recommendations to the Commission on Civil Law Rules on Robotics, suggesting therein

creating a specific legal status for robots in the long run, so that at least the most sophisticated autonomous robots could be established as having the status of electronic persons responsible for making good any damage they may cause, and possibly applying electronic personality to cases where robots make autonomous decisions or otherwise interact with third parties independently.Footnote 19

The most fundamental requirement for legally recognizing an e-person would be its own liability – either based on an ownership fund or on mandatory liability insurance. If corporations appoint AI entities as directors (or otherwise apply AI), they should be strictly liable for damages caused by AI applications in order to mitigate the particular challenges and potential risks of AI.Footnote 20 This is because strict liability would not only delegate the risk assessment and thus control the level of care and activity, but would also create an incentive for further developing this technology.Footnote 21 At the same time, creditors of the company should be protected by compulsory liability insurance, whereas piercing the corporate veil, that is, personal liability of the shareholders, must remain a rare exception.Footnote 22 However, at an international level, regulatory competition makes it difficult to guarantee comparable standards. Harmonization can only be expected (if ever) in supranational legal systems, such as the European Union.Footnote 23 In this context, it is noteworthy that the EU Commission’s White Paper on AI presented in 2020 does not address the question of the legal status of algorithms at all.Footnote 24

However, even if we were to establish such a liability safeguard, there is no self-interested action of an algorithm as long as there is no strong AI. True, circumstances may change in the future due to technological progress. However, there is a long and winding road to the notorious superintelligence.Footnote 25 For now, weak AI only carries out actions in the third-party interest of people or organizations, and is currently not in a position to make its own value decisions and judgemental considerations.Footnote 26 In the end, current algorithms are nothing more than digital slaves, albeit slaves with superhuman abilities. In addition, the currently applicable incentive system of corporate law and governance would have to be adapted to AI directors because, unlike human directors, duties of loyalty can hardly be applied to them; rather, they decide according to algorithmic models.Footnote 27 At present, only humans have original creative power; only they are capable of making decisions and acting in the true sense of the word.Footnote 28

III. Management Board

Given the current limitations of AI, we will continue to have to get by with human directors for the next few decades. Although algorithms do not currently appear suitable for making independent corporate decisions, AI can nonetheless support human directors in their management and monitoring tasks. AI is already used in practice to analyze and forecast the financial development of a company, but also to identify the need for optimization in an entrepreneurial value chain.Footnote 29 In addition, AI applications are used in the run-up to mergers and acquisitions (M&A) transactions,Footnote 30 in particular as part of due diligence, in order to simplify particularly labor-intensive processes when checking documents. Algorithms are also able to recognize unusual contract clauses, to summarize essential parameters of contracts, and even to create contract templates themselves.Footnote 31 Further examples of the use of AI are cybersecurityFootnote 32 and compliance management systems.Footnote 33

1. Legal Framework

With regard to the German corporate governance system, the management board is responsible for running the company.Footnote 34 Consequently, the management board also decides on the overall corporate strategy, the degree of digitalization, and the use of AI applications.Footnote 35 The supervisory board monitors the business decisions of the management board and decides on the approval of particularly important measuresFootnote 36 as well as on the appointment and removal of management board members,Footnote 37 whereas the shareholders’ meeting does not determine a company’s digitalization structures.Footnote 38

2. AI-Related Duties

In principle, the use of AI neither constitutes a violation of corporate law or the articles of association,Footnote 39 nor is it an expression of bad corporate governance. Even if the use of AI is associated with risks, it is difficult to advise companies – as the safest option – to forgo it completely.Footnote 40 Instead, the use of AI places special demands on the management board members.

a. General Responsibilities

Managers must have a fundamental understanding of the relevant AI applications, of their potential, suitability, and risks. However, the board members do not need in-depth knowledge of the detailed functioning of a certain AI application. In particular, the knowledge of an IT expert cannot be demanded, nor can a detailed examination of the material correctness of the decision.Footnote 41 Rather, they need to understand the scope and limits of an application and its possible results and outcomes in order to perform plausibility checks that prevent incorrect decisions quickly and effectively.Footnote 42 The management board has to ensure, through test runs, the functionality of the application with regard to the concrete fulfilment of tasks in the specific company environment.Footnote 43 If, according to the specific nature of the AI application, there is the possibility of an adjustment to the concrete circumstances of the company, for example with regard to the firm’s risk profile or statutory provisions, then the management board is obliged to carry out such an adjustment.Footnote 44 During the use of the AI, the board must continuously evaluate and monitor the working methods, information procurement, and information evaluation as well as the results achieved.

The management board must implement a system that eliminates, as far as possible, the risks and false results that arise from the use of AI. This system must ensure that anyone who uses AI knows the respective scope of possible results of an application, so that it can be determined whether a concrete result is still within the possible range of results. However, that can hardly be determined in the abstract; it requires a close look at the concrete AI application. Furthermore, the market standard is to be included in the analysis. If all companies in a certain industry use certain AI applications that are considered safe and effective, then the use of such an application by other companies will rarely prove to be a breach of the management board’s duty of care.
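
To make this concrete, the following minimal sketch (in Python, with assumed names and threshold values; nothing here is prescribed by law or by this chapter) illustrates how such a plausibility check could be implemented: the admissible corridor of results is defined in advance for each AI application, and any concrete result outside that corridor is escalated to human review.

```python
# Minimal sketch of a plausibility check on AI results (assumed example).
from dataclasses import dataclass

@dataclass
class PlausibilityRange:
    """Admissible result corridor defined in advance for one AI application."""
    lower: float
    upper: float

def is_plausible(result: float, corridor: PlausibilityRange) -> bool:
    """Return True if the AI result lies within the predefined corridor."""
    return corridor.lower <= result <= corridor.upper

# Hypothetical usage: a revenue forecast of 180 against a corridor of 50-150
# is flagged and escalated for human review instead of being acted on directly.
forecast_corridor = PlausibilityRange(lower=50.0, upper=150.0)
if not is_plausible(180.0, forecast_corridor):
    print("Result outside the plausible range - escalate to human review")
```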

Under these conditions, the management board is allowed to delegate decisions and tasks to an AI application.Footnote 45 This is not contradicted by the fact that algorithms lack legal capacity, because in this context the board’s own duties are decisive.Footnote 46 In any event, a blanket self-commitment to the results of an AI application is incompatible with the management responsibility and personal accountability of the board members.Footnote 47 At all times, the applied AI must be manageable and controllable in order to ensure that no loss of human control occurs and that the decision-making process remains comprehensible. The person responsible for applying AI in a certain corporate setting must always be able to operate the off-switch. In normative terms, this requirement is derived from section 91(2) AktG, which obliges the management board to take suitable measures to identify, at an early stage, developments that could jeopardize the continued existence of the company.Footnote 48 In addition, the application must be protected against external attacks, and emergency precautions must be implemented in the event of a technical malfunction.Footnote 49
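
As a purely illustrative sketch (again in Python, with hypothetical names; the chapter prescribes no particular implementation), the off-switch requirement can be pictured as a thin wrapper around the AI application that a responsible person can disable at any time, and that logs every recommendation so that the decision-making process stays comprehensible:

```python
# Hypothetical sketch of a human-operated off-switch around an AI application.
import logging
from typing import Callable, Optional

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ai-governance")

class ControllableAI:
    def __init__(self, model: Callable[[dict], str]):
        self.model = model   # the underlying AI application (assumed)
        self.enabled = True  # the off-switch operated by a responsible person

    def switch_off(self) -> None:
        """Human override: immediately stop producing AI recommendations."""
        self.enabled = False
        log.info("AI application disabled by the responsible officer")

    def recommend(self, inputs: dict) -> Optional[str]:
        if not self.enabled:
            log.info("AI disabled - decision falls back to the human process")
            return None
        recommendation = self.model(inputs)
        # Log inputs and outputs so the decision-making process stays traceable.
        log.info("inputs=%s recommendation=%s", inputs, recommendation)
        return recommendation

# Usage with a dummy model: obtain a recommendation, then switch the AI off.
ai = ControllableAI(lambda x: "approve" if x.get("risk", 1.0) < 0.5 else "reject")
ai.recommend({"risk": 0.3})
ai.switch_off()
ai.recommend({"risk": 0.3})  # returns None; the human process takes over
```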

b. Delegation of Responsibility

The board may delegate the responsibility for applying AI to subordinate employees, but it is required to carefully select, instruct, and supervise the delegate.Footnote 50 However, under the prevailing view, core tasks cannot be delegated, as board members are not allowed to evade their leadership responsibility.Footnote 51 Such non-delegable management tasks include basic measures with regard to the strategic direction, business policy, and organization of the company.Footnote 52 The decision as to whether and to what extent AI should be used in the company is also a management measure that cannot be delegated under the prevailing view.Footnote 53 Only the preparation of decisions by auxiliary persons is permissible, as long as the board makes the decision personally and on its own responsibility. In this respect, the board is responsible for the selection of AI and its application in general. It has to provide the necessary information, exclude conflicts of interest, and perform plausibility checks of the results obtained. Furthermore, the managers must conduct ongoing monitoring and ensure that the assigned tasks are properly performed.

c. Data Governance

AI relies on extensive data sets (Big Data). In this respect, the management board is responsible for the wide scope and high quality of the available data, for the suitability and training of AI applications, and for the alignment of the model predictions with the objectives of the respective company.Footnote 54 In addition, the board must observe the limits of data protection lawFootnote 55 and pursue a non-discriminatory procedure.Footnote 56 If the use of AI is not in line with these or other mandatory provisions, the management board violates the duty of legality.Footnote 57 In this case, the management board does not benefit from the liability privilege of the business judgement rule.Footnote 58

Apart from that, the management board has entrepreneurial discretion with regard to the proper organization of the company’s internal knowledge.Footnote 59 The starting point is the management board’s duty to ensure a legal, statutory, and appropriate organizational structure.Footnote 60 The specific scope and content of the obligation to organize knowledge depend largely on the type, size, and industry of the company and its resources.Footnote 61 However, if, according to these principles, there is a breach of the obligation to store, forward, and actually query information, then the company will be considered to have acted with knowledge or in negligent ignorance under German law.Footnote 62

d. Management Liability

If managers violate these obligations (and do not benefit from the liability privilege of the business judgement rule)Footnote 63, they can be held liable for damages to the company.Footnote 64 This applies in particular in the event of an inadmissible or inadequate delegation.Footnote 65 In order to mitigate their liability risk, management board members have to ensure that the whole framework of AI usage – in terms of specific applications, competencies, and responsibilities as well as the AI-related flow of information within the company – is well designed and documented in detail. Conversely, board members are not liable for individual algorithmic errors as long as (1) the algorithm works reliably, (2) the algorithm does not make unlawful decisions, (3) there are no conflicts of interest, and (4) the AI’s functioning is fundamentally overseen and properly documented.Footnote 66

Comprehensive documentation of the circumstances that prompted the management board to use a certain AI application, and of the specific circumstances of its use, reduces the risk of being sued for damages by the company. In particular, it ensures that the members of the management board can handle the burden of proof incumbent on them according to section 93(2)(2) AktG. The more thoroughly the written documents allow the decision-making process regarding the use of AI to be reconstructed, the better they will be able to do so.Footnote 67 This kind of documentation by the management board is to be distinguished from the general documentation requirements discussed at the European and national level for the development of AI models and for access authorization to this documentation, the details of which are beyond the scope of this chapter.Footnote 68
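
By way of illustration only (a sketch with invented field names and example values, not a legally prescribed format), such documentation could be kept as structured records from which the decision-making process can later be reconstructed:

```python
# Hypothetical sketch of a structured record documenting an AI-related board
# decision; all field names and values are invented for illustration.
import json
from datetime import date

decision_record = {
    "date": date(2022, 3, 1).isoformat(),
    "ai_application": "M&A due-diligence document review",
    "reasons_for_use": ["document volume", "market standard in the industry"],
    "information_basis": ["vendor audit report", "internal test runs"],
    "plausibility_check": "sample of flagged clauses reviewed by counsel",
    "responsible_board_member": "CIO",
}

# Persisting the record in written (here: machine-readable) form supports the
# board in meeting its burden of proof under section 93(2)(2) AktG.
print(json.dumps(decision_record, indent=2))
```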

e. Composition of the Management Board

In order to cope with the challenges caused by the use of AI applications, the structure and composition of management boards have already changed significantly. This manifests itself in the establishment of new management positions, such as a Chief Information Officer (CIO)Footnote 69 or a Chief Digital Officer (CDO).Footnote 70 Almost half of the 40 largest German companies have such a position at board level.Footnote 71

In addition, soft factors are becoming increasingly important in corporate management. Consider, for example, reputational damage: a company’s reputation is one of its tangible economic factors today.Footnote 72 Under the term Corporate Digital Responsibility (CDR), specific responsibilities are developing for the use of AI and other digital innovations.Footnote 73 For example, Deutsche Telekom AG has adopted nine guidelines for responsible AI in a corporate setting. SAP SE has established an advisory board for responsible AI consisting of experts from academia, politics, and industry. These developments, of course, have an important influence on the overall attribution of knowledge within the company and the corporate group. AI and Big Data make information available faster and facilitate the decision-making process at board level. Therefore, the management board must examine whether the absence of any AI application in the information-gathering and decision-making process is in the best interest of the company. However, a duty to use AI applications exists only in exceptional cases and depends on the market standard in the respective industry. The greater the amount of data to be managed and the more complex and calculation-intensive the decisions in question, the more likely it is that the management board will be obliged to use AI.Footnote 74

3. Business Judgement Rule

This point is closely connected with the application of the business judgement rule as a safe harbor for AI use. Under the general concept of the business judgement rule, which is well known in many jurisdictionsFootnote 75 and enshrined in Germany in section 93(1)(2) AktG, a director cannot be held liable for an entrepreneurial decision if there is no conflict of interest and she had good reason to assume that she was acting on the basis of adequate information and for the benefit of the company.

a. Adequate Information

The requirement of adequate information depends significantly on the ability to gather and analyze information. Taking into account all the circumstances of the specific individual case, the board has considerable leeway to judge which information is to be obtained, from an economic point of view and in the time available, and included in the decision-making process. Neither a comprehensive nor the best possible, but only an appropriate, information basis is necessary.Footnote 76 In addition, appropriateness is to be assessed from the subjective perspective of the board members (‘could reasonably assume’), so that a court reviewing the decision afterwards is effectively prevented from substituting its own understanding of appropriateness for the subjective assessment of the decision-maker.Footnote 77 In the context of litigation, a plausibility check based on justifiability is decisive.Footnote 78

In general, the type, size, purpose, and organization of the company as well as the availability of a functional AI and of the data required for its operation are relevant for answering the question of the extent to which AI must be used in the information-based preparation of decisions. The cost of the AI system and the proportionality of the information procurement must also be taken into account.Footnote 79 If there is a great amount of data to be managed and a complex and calculation-intensive decision to be made, AI and Big Data applications are of major importance, and the members of the management board will hardly be able to justify not using AI.Footnote 80 Conversely, the use of AI to obtain information is definitely not objectionable.Footnote 81

b. Benefit of the Company

Furthermore, the board must reasonably assume that it is acting in the best interest of the company when using AI. This criterion is to be assessed from an ex ante perspective, not ex post.Footnote 82 According to the mixed subjective standard, it depends largely on the concrete perception of the acting board members at the time of the entrepreneurial decision.Footnote 83 In principle, the board is free to organize the operation of the company according to its own ideas, as long as it stays within the limits of the best interest of the corporation,Footnote 84 which are informed solely by the continued existence and the long-term, sustainable profitability of the company.Footnote 85 Only when the board members act in a grossly negligent manner or take irresponsible risks do they act outside the company’s best interest.Footnote 86 Taking all these aspects into account, the criterion of acceptability proves to be a suitable benchmark.Footnote 87

In the specific decision-making process, all advantages and disadvantages of using AI applications, or of delegating the decision to use them, must be included and carefully weighed against one another for the benefit of the company. In this context, however, the mere fact that decisions made by or with the support of AI can no longer be fully understood from a purely human perspective cannot simply be regarded as unacceptable and contrary to the welfare of the company.Footnote 88 On the one hand, human decisions that require a certain originality and creativity cannot always be traced down to the last detail either. On the other hand, one of the major potentials of AI is precisely to harness particularly creative and original ideas in the area of corporate management. AI can, therefore, be used as long as its use is not associated with unacceptable risks. The business judgement rule allows the management board to consciously take at least justifiable risks in the best interest of the company.

However, the management board may also conclude that applying AI poses too much of a risk to the existence or the profitability of the firm and may therefore refrain from it without incurring a liability risk under section 93(1)(2) AktG.Footnote 89 The prerequisite for this is that the board performs a conscious act of decision-making.Footnote 90 Otherwise, acting in good faith for the benefit of the company is ruled out a priori. This decision can also consist of a conscious toleration or omission.Footnote 91 The same applies to intuitive action,Footnote 92 even if in this case the other requirements of section 93(1)(2) AktG must be subjected to a particularly thorough examination.Footnote 93 Furthermore, in addition to the action taken, there must have been an alternative,Footnote 94 even if only the omission of the action taken. Even if the decision-makers submit themselves to an actual or supposed necessity,Footnote 95 they could at least hypothetically have omitted the action. Apart from that, the decision does not need to manifest itself in a formal act of forming a will; in particular, a resolution by the collective body is not a prerequisite. Conversely, with a view to a later (judicial) dispute, it makes sense to document the decision sufficiently.Footnote 96

c. Freedom from Conflicts of Interest

The management board must make the decision for or against the use of AI free of extraneous influences and special interests.Footnote 97 The business judgement rule does not apply if the board members are not solely guided by the points mentioned above, but rather pursue other, in particular self-interested, goals. If the use of AI is not based on inappropriate interests and the board has not influenced the parameters specified for the AI in a self-interested manner, the use of AI applications can contribute to a reduction of transaction costs from an economic point of view and mitigate the principal-agent conflict, as the interest of the firm will be aligned with decisions made by AI.Footnote 98 That is, AI can make the decision-making process (more) objective.Footnote 99 However, in order to achieve an actually objective result, the quality of the data used is decisive. If the data set itself is characterized by discriminatory or incorrect information, the result will also suffer from those weaknesses (‘garbage in – garbage out’). Moreover, if the management board is in charge of developing AI applications inside the firm, it may have an interest in choosing experts and technology designs that favor its own benefit rather than the best interest of the company. This development could aggravate the principal-agent conflict within the large public firm.Footnote 100
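
As a minimal, purely illustrative sketch (with an assumed protected attribute and an assumed escalation threshold), such a data-quality concern can be screened for before a data set is fed to an AI application, for example by checking how skewed the data set is with respect to a protected attribute:

```python
# Hypothetical sketch of a simple data-quality screen in the spirit of
# "garbage in - garbage out": the share of records per protected group is
# computed so that a grossly skewed (and thus potentially discriminatory)
# decision basis is flagged before AI use.
from collections import Counter

def group_shares(records, key):
    """Share of records per value of an (assumed) protected attribute."""
    counts = Counter(r[key] for r in records)
    total = sum(counts.values())
    return {value: count / total for value, count in counts.items()}

# Illustrative data set: heavily skewed toward one group.
applicants = [{"gender": "m"}] * 90 + [{"gender": "f"}] * 10
shares = group_shares(applicants, "gender")
if max(shares.values()) > 0.8:  # assumed threshold for escalation
    print(f"Data set skewed ({shares}); review before AI use")
```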

IV. Supervisory Board

For this reason, it will also be of fundamental importance in the future to have an institutional monitoring body in the form of the supervisory board, which, as an internal corporate governance mechanism, enforces the interests of the company. With regard to the monitoring function, a distinction must be made as to whether the supervisory board makes use of AI itself while monitoring and advising the management of the company, or whether it monitors and advises with regard to the use of AI by the management board.

1. Use of AI by the Supervisory Board Itself

As the members of the management board and of the supervisory board have to comply with the same basic standards of care and responsibility under sections 116(1) and 93(1)(1) AktG, the management board’s AI-related dutiesFootnote 101 essentially apply to the supervisory board accordingly. If the supervisory board makes an entrepreneurial decision, it can also rely on the business judgement rule.Footnote 102 This is true, for example, for the granting of approval for transactions requiring consent under section 111(4)(2) AktG, such as M&A transactions.Footnote 103 Furthermore, the supervisory board may use AI-based personality and fitness checks when it appoints and dismisses management board members.Footnote 104 AI applications can help the supervisory board to structure the remuneration of the management board appropriately. They can also be useful to the supervisory board when auditing the accounting and in the compliance area, because they are able to analyze large amounts of data and uncover inconsistencies.Footnote 105

2. Monitoring of the Use of AI by the Management Board

When it comes to monitoring and advising on the use of AI by the management board, the supervisory board has to fulfil its general monitoring obligation under section 111(1) AktG. The starting point is the reporting by the management board under section 90 AktG.Footnote 106 In particular, strategic decisions on the guiding principles of AI use are part of the intended business policy or at least another fundamental matter regarding the future conduct of the company’s business according to section 90(1)(1) AktG. Furthermore, the usage of certain AI applications may qualify as a transaction that may have a material effect upon the profitability or liquidity of the company under section 90(1)(4) AktG. In this regard, the management board does not need to derive and trace the decision-making process of the AI in detail. Rather, it is sufficient for the management board to report to the supervisory board on the result found and on how it specifically used the AI, monitored its functions, and checked the plausibility of the result.Footnote 107 In addition, pursuant to section 90(3) AktG, the supervisory board may at any time require a report from the management board on the affairs of the company and on the company’s legal and business relationships with affiliated enterprises. This report may also deal with AI-related developments at management board level and in other entities of a corporate group.

Finally, the supervisory board may inspect and examine the books and records of the company according to section 111(2)(1) AktG. It is undisputed that this also includes electronic records,Footnote 108 which the supervisory board can examine using AI in the form of big data analysis.Footnote 109 Conversely, the supervisory board does not need to conduct its own inquiries based on its information rights without sufficient cause or in the event of regular and orderly business development.Footnote 110 Contrary to what the literature suggests,Footnote 111 this applies even where the supervisory body has unhindered access to the company’s internal management information system.Footnote 112 The opposing view not only disregards the principle of trusting cooperation between the management board and the supervisory board, but would also overtax the supervisory board members in terms of time.Footnote 113

With a view to the monitoring standard, the supervisory board has to assess the management board’s overall strategy as regards AI applications and especially the systemic risks that result from the usage of AI in the company. This also comprises the monitoring of the AI-based management and organizational structure of the company.Footnote 114 If it recognizes violations in the management board’s use of AI, the supervisory board has to intervene using its general means of action. This may start with giving advice to the management board on how to optimize the AI strategy. Furthermore, the supervisory board may establish an approval requirement with regard to the overall AI-based management structure. In addition, the supervisory board may draw personnel consequences and install an AI expert at management board level, such as a CIO or CDO.Footnote 115

V. Conclusion

AI is not the end of corporate governance, as some authors have predicted.Footnote 116 Rather, AI has the potential to change the overall corporate governance system significantly. As this chapter has shown, AI can improve corporate governance structures, especially when it comes to handling big data sets. At the same time, it poses challenges to the corporate management system, which must be met by carefully adapting the governance framework.Footnote 117 However, there is currently no need for strict AI regulation with a specific focus on corporations.Footnote 118 Rather, we see a creeping change from corporate governance to algorithm governance that has the potential to enhance, but also the risk of destabilizing, the current system. What we really need is the disclosure of information about a company’s practices with regard to AI application, organization, and oversight as well as the associated potentials and risks.Footnote 119 This kind of transparency would help to raise awareness and to enhance the overall algorithm governance system. For that purpose, the corporate governance report that is already mandatory in many jurisdictions, such as the US,Footnote 120 the UK,Footnote 121 and Germany,Footnote 122 should be supplemented with additional explanations on AI.Footnote 123

In this report, the management board and the supervisory board should report on their overall strategy with regard to the use, organization, and monitoring of AI applications. This specifically relates to the responsibilities, competencies, and protective measures they have established to prevent damage to the corporation. In addition, the boards should be obliged to report on their ethical guidelines for a trustworthy use of AI.Footnote 124 In this regard, they may rely on the proposals drawn up at the international level. Of particular importance in this respect are the principles of the European Commission in its communication on ‘Building Trust in Human-Centric Artificial Intelligence’,Footnote 125 as well as the ‘Principles on Artificial Intelligence’ published by the OECD.Footnote 126 These principles require users to comply with organizational precautions in order to prevent incorrect AI decisions, to provide a minimum of technical proficiency, and to ensure the preservation of final human decision-making authority. In addition, they call for the safeguarding of individual rights, such as privacy, diversity, non-discrimination, and fairness, and for an orientation of AI towards the common good, including sustainability, ecological responsibility, and the overall societal and social impact. Even if these principles are not legally binding, a reporting obligation would require the management board and the supervisory board to deal with the corresponding questions and to explain how they relate to them. It will make a difference, and may lead to improvements, if companies and their executives are aware of the importance of these principles in dealing with responsible AI.

Footnotes

1 For details, see Bundesministerium für Bildung und Forschung, ‘Zukunftsbild Industrie 4.0’ (BMBF, 30 December 2020) www.bmbf.de/bmbf/de/forschung/digitale-wirtschaft-und-gesellschaft/industrie-4-0/industrie-4-0.html; P Bräutigam and T Klindt, ‘Industrie 4.0, das Internet der Dinge und das Recht’ (2015) 16 NJW 1137 (hereafter Bräutigam and Klindt, ‘Industrie 4.0’); T Kaufmann, Geschäftsmodelle in Industrie 4.0 und dem Internet der Dinge (2015); K Schwab, Die Vierte Industrielle Revolution (2016); more reserved HJ Schlinkert, ‘Industrie 4.0 – wie das Recht Schritt hält’ (2017) 8 ZRP 222 et seq.

2 Cf. A Börding and others, ‘Neue Herausforderungen der Digitalisierung für das deutsche Zivilrecht’ (2017) 2 CR 134; J Bormann, ‘Die digitalisierte GmbH’ (2017) 46 ZGR 621, 622; B Paal, ‘Die digitalisierte GmbH’ (2017) 46 ZGR 590, 592, 599 et seq.

3 For digitalization of private law, see, e.g., K Langenbucher, ‘Digitales Finanzwesen’ (2018) 218 AcP 385 et seq.; G Teubner, ‘Digitale Rechtssubjekte?’ (2018) 218 AcP 155 et seq. (hereafter Teubner, ‘Rechtssubjekte’); cf. further M Fries, ‘PayPal Law und Legal Tech – Was macht die Digitalisierung mit dem Privatrecht?’ (2016) 39 NJW 2860 et seq (hereafter Fries, ‘Digitalisierung Privatrecht’).

4 For details, see, e.g., H Eidenmüller, ‘The Rise of Robots and the Law of Humans’ (2017) 4 ZEuP 765 et seq.; G Spindler, ‘Zukunft der Digitalisierung – Datenwirtschaft in der Unternehmenspraxis’ (2018) 1–2 DB 41, 49 et seq. (hereafter Spindler, ‘Zukunft’).

5 For details, see A Hildner and M Danzmann, ‘Blockchain-Anwendungen für die Unternehmensfinanzierung’ (2017) CF 385 et seq.; M Hüther and M Danzmann, ‘Der Einfluss des Internet of Things und der Industrie 4.0 auf Kreditfinanzierungen’ (2017) 15–16 BB 834 et seq.; R Nyffenegger and F Schär‚ ‘Token Sales: Eine Analyse Des Blockchain-Basierten Unternehmensfinanzierungsinstruments’ (2018) CF 121 et seq.; B Westermann, ‘Daten als Kreditsicherheiten – eine Analyse des Datenwirtschaftsrechts de lege lata und de lege ferenda aus Sicht des Kreditsicherungsrechts’ (2018) 26 WM 1205 et seq.

6 Cf. BGHZ 219, 243 (Bundesgerichtshof III ZR 183/17); A Kutscher, Der digitale Nachlass (2015); J Lieder and D Berneith, ‘Digitaler Nachlass: Das Facebook-Urteil des BGH’ (2018) 10 FamRZ 1486; C Budzikiewicz, ‘Digitaler Nachlass’ (2018) 218 AcP 558 et seq.; H Ludgya, ‘Digitales Update für das Erbrecht im BGB?’ (2018) 1 ZEV 1 et seq.; C Sorge, ‘Digitaler Nachlass als Knäuel von Rechtsverhältnissen’ (2018) 6 MMR 372 et seq.; see also Deutscher Bundestag, ‘Kleine Anfrage der Abgeordneten Roman Müller-Böhm et al. BT-Drucks. 19/3954’ (2018); as to this J Lieder and D Berneith, ‘Digitaler Nachlass – Sollte der Gesetzgeber tätig werden?’ (2020) 3 ZRP 87 et seq.

7 For details, see Fries, ‘Digitalisierung Privatrecht’ (Footnote n 3) 2681 et seq.; M Grupp, ‘Legal Tech – Impulse für Streitbeilegung und Rechtsdienstleistung’ (2014) 8+9 AnwBl. 660 et seq.; J Wagner, ‘Legal Tech und Legal Robots in Unternehmen und den sie beratenden Kanzleien’ (2017) 17 BB 898, 900 (hereafter Wagner, ‘Legal Tech’).

8 V Boehme-Neßler, ‘Die Macht der Algorithmen und die Ohnmacht des Rechts’ (2017) 42 NJW 3031.

9 For definitions, see, e.g., M Herberger, ‘“Künstliche Intelligenz” und Recht – Ein Orientierungsversuch’ (2018) 39 NJW 2825 et seq.; C Schael, ‘Künstliche Intelligenz in der modernen Gesellschaft: Bedeutung der “Künstlichen Intelligenz” für die Gesellschaft’ (2018) 42 DuD 547 et seq.; J Armour and H Eidenmüller, ‘Selbstfahrende Kapitalgesellschaften?’ (2019) 2–3 ZHR 169, 172 et seq. (hereafter Armour and Eidenmüller, ‘Kapitalgesellschaften’); F Graf von Westphalen, ‘Definition der Künstlichen Intelligenz in der Kommissionsmitteilung COM (2020) 64 final – Auswirkungen auf das Vertragsrecht’ (2020) 35 BB 1859 et seq. (hereafter Graf von Westphalen, ‘Definition’); P Hacker, ‘Europäische und nationale Regulierung von Künstlicher Intelligenz’ (2020) 30 NJW 2142 et seq. (hereafter Hacker, ‘Regulierung’).

10 Cf. U Noack, ‘Organisationspflichten und -strukturen kraft Digitalisierung’ (2019) 183 ZHR 105, 107 (hereafter Noack, ‘Organisationspflichten’); U Noack, ‘Der digitale Aufsichtsrat’ in B Grunewald, J Koch, and J Tielmann (eds), Festschrift für Eberhard Vetter (2019) 497, 500 (hereafter Noack, ‘Aufsichtsrat’); for a different use of this wording, see L Strohn, ‘Die Rolle des Aufsichtsrats beim Einsatz von Künstlicher Intelligenz’ (2018) 182 ZHR 371 et seq. (hereafter Strohn, ‘Rolle’).

11 See Deutscher Bundestag, ‘Antwort der Bundesregierung, Erarbeitung einer KI-Strategie der Bundesregierung, BT-Drucks. 19/5678’ (2018) 2.

12 Cf. UH Schneider and C Strenger, Die “Corporate Governance-Grundsätze” der Grundsatzkommission Corporate Governance (German Panel on Corporate Governance) (2000) 106, 107; R Marsch-Barner, ‘§ 2 Corporate Governance marginal number 2.1’ in R Marsch-Barner and F Schäfer (eds), Handbuch börsennotierte AG (4th ed. 2018); J Koch, ‘§ 76 margin number 37’ in U Hüffer and J Koch (eds) Aktiengesetz (14th ed. 2020); HJ Böcking and L Bundle, ‘§ 2 marginal number 6’ in KJ Hopt, JH Binder, and HJ Böcking (eds), Handbuch Corporate Governance von Banken und Versicherungen (2nd ed. 2020); A v Werder, ‘DCGK Präambel marginal number 10’ in T Kremer and others (eds), Deutscher Corporate Governance Kodex (8th ed. 2021).

13 For a comparative overview, see J Lieder, ‘Der Aufsichtsrat im Wandel der Zeit’ (2006) 636 et seq. (hereafter Lieder ‘Aufsichtsrat’).

14 R Wile, ‘A Venture Capital Firm Just Named an Algorithm to Its Board of Directors’ (Business Insider, 13 May 2014) www.businessinsider.com/vital-named-to-board-2014-5?r=US&IR=T.

15 See N Burridge, ‘Artificial Intelligence gets a seat in the boardroom’ (Nikkei Asia, 10 May 2017) https://asia.nikkei.com/Business/Artificial-intelligence-gets-a-seat-in-the-boardroom.

16 Aktiengesetz (AktG) = Stock Corporation Act of 6 September 1965, Federal Law Gazette I, 1089. For the English version that has been used in this paper, see Rittler, German Corporate Law (2016) as well as Norton Rose Fulbright, ‘German Stock Corporation Act (Aktiengesetz)’ (Norton Rose Fulbright, 10 May 2016) www.nortonrosefulbright.com/-/media/files/nrf/nrfweb/imported/german-stock-corporation-act.pdf.

17 Cf. H Fleischer, ‘§ 93 marginal number 129’ in M Henssler (ed), BeckOGK Aktiengesetz (15 January 2020); F Möslein, ‘Digitalisierung im Gesellschaftsrecht: Unternehmensleitung durch Algorithmen und künstliche Intelligenz?’ (2018) 5 ZIP 204, 207 et seq. (hereafter Möslein, ‘Digitalisierung’); Strohn, ‘Rolle’ (Footnote n 10) 371; R Weber, A Kiefner, and S Jobst, ‘Künstliche Intelligenz und Unternehmensführung’ (2018) 29 NZG 1131 (1136) (hereafter Weber, Kiefner, and Jobst, ‘Unternehmensführung’); see further H Fleischer, ‘Algorithmen im Aufsichtsrat’ (2018) 9 Der Aufsichtsrat 121 (hereafter Fleischer, ‘Algorithmen’); A Sattler, ‘Der Einfluss der Digitalisierung auf das Gesellschaftsrecht’ (2018) 39 BB 2243, 2248 (hereafter Sattler, ‘Einfluss’); Wagner, ‘Legal Tech’ (Footnote n 7) 1098.

18 See B Kropff, Begründung zum Regierungsentwurf zum Aktiengesetz 1965 (1965) 135: ‘Der Entwurf gestattet es nicht, juristische Personen zu wählen, weil die Überwachungspflicht die persönliche Tätigkeit einer verantwortlichen Person voraussetzt.’ [‘The draft does not permit the election of legal persons, because the duty of supervision presupposes the personal activity of a responsible person.’] Cf. further Lieder, ‘Aufsichtsrat’ (Footnote n 13) 367 et seq.

19 M Delvaux, ‘Report with Recommendations to the Commission on Civil Law Rules on Robotics (2015/2103 (INL))’ (European Parliament, 27 January 2015) www.europarl.europa.eu/doceo/document/A-8-2017-0005_EN.html; see, e.g., MF Lohmann, ‘Ein europäisches Roboterrecht – überfällig oder überflüssig?’ (2017) 6 ZRP 168; R Schaub, ‘Interaktion von Mensch und Maschine’ (2017) 7 JZ 342, 345 et seq.; J-E Schirmer, ‘Rechtsfähige Roboter?’ (2016) 13 JZ 660 et seq.; J Taeger, ‘Die Entwicklung des IT-Rechts im Jahr 2016’ (2016) 52 NJW 3764.

20 Cf. Bräutigam and Klindt, ‘Industrie 4.0’ (Footnote n 1) 1138; Teubner, ‘Rechtssubjekte’ (Footnote n 3) 184; DA Zetzsche, ‘Corporate Technologies – Zur Digitalisierung im Aktienrecht’ (2019) 1-02 AG 1 (10) (hereafter Zetzsche, ‘Technologies’).

21 Cf. G Borges, ‘Haftung für selbstfahrende Autos’ (2016) 4 CR 272, 278; H Kötz and G Wagner, Deliktsrecht (13th ed. 2016) marginal number 72 et seq.; H Zech, ‘Künstliche Intelligenz und Haftungsfragen’ (2019) 2 ZfPW 198, 214.

22 Cf. Armour and Eidenmüller, ‘Kapitalgesellschaften’ (Footnote n 9) 185 et seq.

23 Ibid, 186 et seq.

24 European Commission, ‘White Paper On Artificial Intelligence – A European approach to excellence and trust, COM(2020) 65 final’ (EUR-Lex, 19 February 2020) https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=COM:2020:65:FIN; see Graf von Westphalen, ‘Definition’ (Footnote n 9) 1859 et seq.; Hacker, ‘Regulierung’ (Footnote n 9), 2142 et seq.

25 Cf. further B Schölkopf, ‘Der Mann, der den Computern das Lernen beibringt’ Frankfurter Allgemeine Zeitung (26 February 2020): ‘Wir sind extrem weit davon entfernt, dass seine Maschine intelligenter ist als ein Mensch.’ [‘We are extremely far away from his machine being more intelligent than a human being.’]; L Enriques and DA Zetzsche, ‘Corporate Technologies and the Tech Nirvana Fallacy’ (2019) ECGI Law Working Paper N° 457/2019, 58 https://ecgi.global/sites/default/files/working_papers/documents/finalenriqueszetzsche.pdf (hereafter Enriques and Zetzsche, ‘Corporate Technologies’): ‘Only if and when humans relinquish corporate control to machines, may the problem at the core of corporate governance be solved; but by then humans will have more pressing issues to worry about than corporate governance.’

26 Cf. further P Krug, Haftung im Rahmen der Anwendung von künstlicher Intelligenz: Betrachtung unter Berücksichtigung der Besonderheiten des steuerberatenden Berufsstandes (2020) 74, 76; Möslein, ‘Digitalisierung’ (Footnote n 17) 207; Noack, ‘Aufsichtsrat’ (Footnote n 10) 506.

27 Möslein, ‘Digitalisierung’ (Footnote n 17) 206.

28 Cf. M Auer, ‘Der Algorithmus kennt keine Moral’ Frankfurter Allgemeine Zeitung (29 April 2020).

29 On this and the following, see Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1131.

30 Cf. Noack, ‘Organisationspflichten’ (Footnote n 10) 119.

31 Cf. M Grub and S Krispenz, ‘Auswirkungen der Digitalisierung auf M&A Transaktionen’ (2018) 5 BB 235, 238; Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1131.

32 For details, see Noack, ‘Organisationspflichten’ (Footnote n 10) 124 et seq.

33 Cf. Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1131; Noack, ‘Organisationspflichten’ (Footnote n 10) 132 et seq.; Zetzsche, ‘Technologies’ (Footnote n 20) 5.

34 Section 76(1) AktG.

35 Cf. Noack, ‘Organisationspflichten’ (Footnote n 10) 115 et seq.

36 Section 111(4) AktG.

37 Section 84 AktG.

38 Cf. Section 119 AktG.

39 Cf. Strohn, ‘Rolle’ (Footnote n 10) 371, 376.

40 But see Strohn, ‘Rolle’ (Footnote n 10) 376; rightly contested by Noack, ‘Aufsichtsrat’ (Footnote n 10) 502.

41 Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1133; cf. further J Wagner, ‘Legal Tech und Legal Robots in Unternehmen und den sie beratenden Kanzleien – Teil 2: Folgen für die Pflichten von Vorstandsmitgliedern bzw. Geschäftsführern und Aufsichtsräten’ (2018) 20 BB 1097, 1099 (hereafter Wagner, ‘Legal Tech 2’); Möslein, ‘Digitalisierung’ (Footnote n 17) 208 et seq.

42 Cf. further Sattler, ‘Einfluss’ (Footnote n 17) 2248.

43 M Becker and P Pordzik, ‘Digitale Unternehmensführung’ (2020) 3 ZfPW 334, 349 (hereafter Becker and Pordzik, ‘Unternehmensführung’).

44 On this and the following, see Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1132.

45 Becker and Pordzik, ‘Unternehmensführung’ (Footnote n 43) 344; Noack, ‘Organisationspflichten’ (Footnote n 10) 117; O Lücke, ‘Der Einsatz von KI in der und durch die Unternehmensführung’ (2019) 35 BB 1986, 1989, and 1992 (hereafter Lücke, ‘KI’); for a different view, see V Hoch, ‘Anwendung Künstlicher Intelligenz zur Beurteilung von Rechtsfragen im unternehmerischen Bereich’ (2019) 219 AcP 648, 672.

46 Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1132.

47 Möslein, ‘Digitalisierung’ (Footnote n 17) 208 et seq.; Wagner, ‘Legal Tech 2’ (Footnote n 41) 1098 et seq., 1101; Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1132.

48 Cf. Zetzsche, ‘Technologies’ (Footnote n 20) 7.

49 Becker and Pordzik, ‘Unternehmensführung’ (Footnote n 43) 352; D Linardatos, ‘Künstliche Intelligenz und Verantwortung’ (2019) 11 ZIP 504, 508; Lücke, ‘KI’ (Footnote n 45) 1993.

50 M Dreher, ‘Nicht delegierbare Geschäftsleiterpflichten’ in S Grundmann and others (eds), Festschrift für Klaus J. Hopt zum 70. Geburtstag (2010) 517, 536; H Fleischer, ‘§ 93 marginal number 98 et seq.’ in M Henssler (ed), BeckOGK Aktiengesetz (15 January 2020); HC Grigoleit and L Tomasic, ‘§ 93 marginal number 38’ in HC Grigoleit, Aktiengesetz (2020); Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1132.

51 Cf. M Dreher, ‘Nicht delegierbare Geschäftsleiterpflichten’ in S Grundmann and others (eds) Festschrift für Klaus J Hopt zum 70. Geburtstag (2010) 517, 527; with a specific focus on AI use, see Möslein, ‘Digitalisierung’ (Footnote n 17) 208 et seq.; Wagner, ‘Legal Tech 2’ (Footnote n 41) 1098; Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1132.

52 HJ Mertens and A Cahn, ‘§ 76 marginal number 4’ in W Zöllner and U Noack (eds), Kölner Kommentar zum AktG (3rd ed. 2010); M Weber, ‘§ 76 marginal number 8’ in W Hölters (ed), AktG (3rd ed. 2017); Weber, Kiefner and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1132.

53 M Kort, ‘§ 76 marginal number 37’ in H Hirte, PO Mülbert, and M Roth (eds), Großkommentar zum AktG (5th ed. 2015); G Spindler, ‘Haftung der Geschäftsführung für IT-Sachverhalte’ (2017) 11 CR 715, 722; G Spindler, ‘Gesellschaftsrecht und Digitalisierung’ (2018) 47 ZGR 17, 40 et seq. (hereafter Spindler, ‘Gesellschaftsrecht’); Spindler, ‘Zukunft’ (Footnote n 4) 44; Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1132.

54 For details, see Armour and Eidenmüller, ‘Kapitalgesellschaften’ (Footnote n 9) 176 et seq.; Krug, ‘Haftung’ (Footnote n 26) 78 et seq.; cf. further Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1132 et seq.

55 For details, see T Hoeren and M Niehoff, ‘KI und Datenschutz – Begründungserfordernisse automatisierter Entscheidungen’ (2018) 1 RW 47 et seq.; cf. further CS Conrad, ‘Kann die Künstliche Intelligenz den Menschen entschlüsseln? – Neue Forderungen zum Datenschutz: Eine datenschutzrechtliche Betrachtung der “Künstlichen Intelligenz”’ (2018) 42 DuD 541 et seq.; M Rost, ‘Künstliche Intelligenz: Normative und operative Anforderungen des Datenschutzes’ (2018) 42 DuD 558.

56 As to new types of discrimination risks, see JA Kroll and others, ‘Accountable Algorithms’ (2017) 165 U Pa L Rev 633, 679 et seq.; B Paal, ‘Vielfaltsicherung im Suchmaschinensektor’ (2015) 2 ZRP 34, 35; H Steege, ‘Algorithmenbasierte Diskriminierung durch Einsatz von Künstlicher Intelligenz: Rechtsvergleichende Überlegungen und relevante Einsatzgebiete’ (2019) 11 MMR 715 et seq.

57 Cf. F König, ‘Haftung für Cyberschäden: Auswirkungen des neuen Europäischen Datenschutzrechts auf die Haftung von Aktiengesellschaften und ihrer Vorstände’ (2017) 8 AG 262, 268 et seq.; Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1135.

58 See infra Section III 3.

59 G Spindler and A Seidel, ‘Die zivilrechtlichen Konsequenzen von Big Data für Wissenszurechnung und Aufklärungspflichten’ (2018) 30 NJW 2153, 2154 (hereafter Spindler and Seidel, ‘Big Data’); G Spindler and A Seidel, ‘Wissenszurechnung und Digitalisierung’ in G Spindler and others (eds), Unternehmen, Kapitalmarkt, Finanzierung. Festschrift für Reinhard Marsch-Barner (2018) 549, 552 et seq.

60 Cf. BGHZ 132, 30 (37) (Bundesgerichtshof V ZR 239/94); P Hemeling, ‘Organisationspflichten des Vorstands zwischen Rechtspflicht und Opportunität’ (2011) 175 ZHR 368, 380.

61 Spindler and Seidel, ‘Big Data’ (Footnote n 59) 2154.

62 Cf. HC Grigoleit, ‘Zivilrechtliche Grundlagen der Wissenszurechnung’ (2017) 181 ZHR 160 et seq.; M Habersack and M Foerster, ‘§ 78 marginal number 39’ in H Hirte, P O Mülbert, and M Roth (eds), Großkommentar zum AktG (5th ed. 2015); Sattler, ‘Einfluss’ (Footnote n 17) 2248; Spindler, ‘Wissenszurechnung in der GmbH, der AG und im Konzern’(2017) 181 ZHR 311 et seq.

63 See infra Section III 3.

64 Section 93(2) AktG.

65 Cf. further Möslein, ‘Digitalisierung’ (Footnote n 17) 210 et seq.

66 Cf. Möslein, ‘Digitalisierung’ (Footnote n 17) 211.

67 Cf. Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1136; in general, see W Hölters, ‘§ 93 marginal number 36’ in W Hölters (ed), AktG (3rd ed. 2017); HJ Mertens and A Cahn, ‘§ 93 marginal number 36’ in W Zöllner and U Noack (eds), Kölner Kommentar zum AktG (3rd ed. 2010); G Spindler, ‘§ 93 marginal number 58’ in W Goette and M Habersack (eds) ‘Münchener Kommentar zum AktG’ (5th ed. 2019).

68 Hacker, ‘Regulierung’ (Footnote n 9) 2143 et seq.

69 Cf. Sattler, ‘Einfluss’ (Footnote n 17) 2248.

70 Cf. M Kaspar, ‘Aufsichtsrat und Digitalisierung’ (2018) BOARD 202, 203.

71 Cf. Noack, ‘Aufsichtsrat’ (Footnote n 10) 502 et seq.

72 For details, see U Schmolke and L Klöhn, ‘Unternehmensreputation (Corporate Reputation)’ (2015) 18 NZG 689 et seq.

73 On this and the following, see Noack, ‘Organisationspflichten’ (Footnote n 10) 112 et seq.; Noack, ‘Aufsichtsrat’ (Footnote n 10) 503 et seq.; cf. further F Möslein, ‘Corporate Digital Responsibility’ in S Grundmann and others (eds), Festschrift für Klaus J Hopt zum 80. Geburtstag (2020) 805 et seq.

74 Möslein, ‘Digitalisierung’ (Footnote n 17) 209.

75 For a comparative view, see H Merkt, ‘Rechtliche Grundlagen der Business Judgment Rule im internationalen Vergleich zwischen Divergenz und Konvergenz’ (2017) 46 ZGR 129 et seq.

76 For details, see J Lieder, ‘Unternehmerische Entscheidungen des Aufsichtsrats’ (2018) 47 ZGR 523, 555 (hereafter Lieder, ‘Entscheidungen’).

77 Cf. further KJ Hopt and M Roth, ‘§ 93 marginal number 102’ in H Hirte, PO Mülbert, and M Roth (eds), Großkommentar zum AktG (5th ed. 2015); H Fleischer, ‘§ 93 marginal number 90’ in M Henssler (ed), BeckOGK, Aktiengesetz (15 January 2020); M Kock and R Dinkel, ‘Die zivilrechtliche Haftung von Vorständen für unternehmerische Entscheidungen – Die geplante Kodifizierung der Business Judgment Rule im Gesetz zur Unternehmensintegrität und Modernisierung des Anfechtungsrechts’ (2004) 10 NZG 441, 444; Lieder, ‘Entscheidungen’ (Footnote n 76) 557; for a different view, see W Goette, ‘Gesellschaftsrechtliche Grundfragen im Spiegel der Rechtsprechung’ (2008) 37 ZGR 436, 447 et seq.: purely objective approach.

78 Cf. H Fleischer in M Henssler (ed), BeckOGK, Aktiengesetz (15 January 2020) § 93 marginal number 91; J Koch in U Hüffer and J Koch (eds) Aktiengesetz (14th ed. 2020) § 93 marginal number 21; HJ Mertens and A Cahn in W Zöllner and U Noack (eds), Kölner Kommentar zum AktG (3rd ed. 2010) § 93 marginal number 34; Lieder, ‘Entscheidungen’ (Footnote n 76) 557; J Redeke, ‘Zur gerichtlichen Kontrolle der Angemessenheit der Informationsgrundlage im Rahmen der Business Judgement Rule nach § 93 Abs. 1 S. 2 AktG’ (2011) 2 ZIP 59, 60 et seq.

79 Cf. Noack, ‘Organisationspflichten’ (Footnote n 10) 122.

80 Cf. Becker and Pordzik, ‘Unternehmensführung’ (Footnote n 43) 347; Möslein, ‘Digitalisierung’ (Footnote n 17) 209 et seq., 212; Sattler, ‘Einfluss’ (Footnote n 17) 2248; Spindler, ‘Gesellschaftsrecht’ (Footnote n 53) 43; Spindler, ‘Zukunft’ (Footnote n 4) 45; Weber, Kiefner, and Jobst (Footnote n 17) 1134.

81 Wagner, ‘Legal Tech 2’ (Footnote n 41) 1100; Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1132.

82 Cf. Deutscher Bundestag, ‘Begründung zum Regierungsentwurf, BT-Drucks. 15/5092’ (2005) 11; T Bürgers, ‘§ 93 marginal number 15’ in T Bürgers and T Körber (eds), AktG (4th ed. 2017); B Dauner-Lieb, ‘§ 93 AktG marginal number 23’ in M Henssler and L Strohn (eds), Gesellschaftsrecht (5th ed. 2021); KJ Hopt and M Roth, ‘§ 93 marginal number 101’ in H Hirte, PO Mülbert, and M Roth (eds), Großkommentar zum AktG (5th ed. 2015); W Hölters, ‘§ 93 marginal number 39’ in W Hölters (ed), AktG (3rd ed. 2017); HJ Mertens and A Cahn, ‘§ 93 marginal number 23’ in W Zöllner and U Noack (eds), Kölner Kommentar zum AktG (3rd ed. 2010); Lieder, ‘Entscheidungen’ (Footnote n 76) 577.

83 Similarly H Fleischer, ‘§ 93 marginal number 92’ in M Henssler (ed), BeckOGK, Aktiengesetz (15 January 2020); KJ Hopt and M Roth, ‘§ 93 marginal number 101’ in H Hirte, PO Mülbert, and M Roth (eds), Großkommentar zum AktG (5th ed. 2015); J Koch, ‘§ 93 marginal number 21’ in U Hüffer and J Koch (eds), Aktiengesetz (14th ed. 2020); Lieder, ‘Entscheidungen’ (Footnote n 76) 577; for a different opinion (objective standard): W Hölters, ‘§ 93 marginal number 39’ in W Hölters (ed), AktG (3rd ed. 2017); HJ Mertens and A Cahn, ‘§ 93 marginal number 23’ in W Zöllner and U Noack (eds), Kölner Kommentar zum AktG (3rd ed. 2010).

84 Cf. H Fleischer, ‘§ 76 marginal number 27’ in M Henssler (ed), BeckOGK, Aktiengesetz (15 January 2020); M Kort, ‘§ 76 marginal number 60’ in H Hirte, PO Mülbert, and M Roth (eds), Großkommentar zum AktG (5th ed. 2015); G Spindler, ‘§ 76 marginal numbers 67 et seq.’ in W Goette and M Habersack (eds), Münchener Kommentar zum AktG (5th ed. 2019); P Ulmer, ‘Aktienrecht im Wandel’ (2002) 202 AcP 143, 158 et seq.

85 Cf. OLG Hamm AG 1995, 512, 514; B Dauner-Lieb, ‘§ 93 AktG marginal number 23’ in M Henssler and L Strohn (eds), Gesellschaftsrecht (5th ed. 2021); J Koch, ‘§ 76 marginal number 34’ in U Hüffer and J Koch (eds), Aktiengesetz (14th ed. 2020); M Kort, ‘§ 76 marginal number 52’ in H Hirte, PO Mülbert, and M Roth (eds), Großkommentar zum AktG (5th ed. 2015); HJ Mertens and A Cahn, ‘§ 76 marginal numbers 17, 22’ in W Zöllner and U Noack (eds), Kölner Kommentar zum AktG (3rd ed. 2010); Lieder, ‘Entscheidungen’ (Footnote n 76) 577–578.

86 Cf. BGHZ 135, 244 (253–254) (Bundesgerichtshof II ZR 175/95); H Fleischer, ‘§ 93 marginal number 99’ in M Henssler (ed), BeckOGK, Aktiengesetz (15 January 2020); J Koch, ‘§ 93 marginal number 23’ in U Hüffer and J Koch (eds), Aktiengesetz (14th ed. 2020).

87 Cf. T Drygala, ‘§ 116 marginal number 15’ in K Schmidt and M Lutter (eds), AktG (4th ed. 2020); J Koch, ‘§ 93 marginal number 23’ in U Hüffer and J Koch (eds), Aktiengesetz (14th ed. 2020); Lieder, ‘Entscheidungen’ (Footnote n 76) 578 with examples.

88 But see Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1135.

89 Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1134.

90 Deutscher Bundestag, ‘Begründung zum Regierungsentwurf, BT-Drucks. 15/5092’ (2005) 11; B Dauner-Lieb, ‘§ 93 AktG marginal number 20’ in M Henssler and L Strohn (eds), Gesellschaftsrecht (5th ed. 2021); J Koch, ‘§ 93 marginal number 16’ in U Hüffer and J Koch (eds), Aktiengesetz (14th ed. 2020); Lieder, ‘Entscheidungen’ (Footnote n 76) 532.

91 Deutscher Bundestag, ‘Begründung zum Regierungsentwurf, BT-Drucks. 15/5092’ (2005) 11; T Bürgers, ‘§ 93 marginal number 15’ in T Bürgers and T Körber (eds), AktG (4th ed. 2017); H Fleischer, ‘§ 93 marginal number 97’ in M Henssler (ed), BeckOGK, Aktiengesetz (15 January 2020); HC Ihrig, ‘Reformbedarf beim Haftungstatbestand des § 93 AktG’ (2004) 43 WM 2098, 2105.

92 KJ Hopt and M Roth, ‘§ 93 marginal number 80’ in H Hirte, PO Mülbert, and M Roth (eds), Großkommentar zum AktG (5th ed. 2015); for a different view, see G Spindler, ‘§ 93 marginal number 51’ in W Goette and M Habersack (eds), Münchener Kommentar zum AktG (5th ed. 2019).

93 Lieder, ‘Entscheidungen’ (Footnote n 76) 532; too indiscriminately negative, however, G Spindler, ‘§ 93 marginal number 51’ in W Goette and M Habersack (eds), Münchener Kommentar zum AktG (5th ed. 2019); H Hamann, ‘Reflektierte Optimierung oder bloße Intuition?’ (2012) ZGR 817, 825 et seq.

94 T Bürgers, ‘§ 93 marginal number 11’ in T Bürgers and T Körber (eds), AktG (4th ed. 2017); B Dauner-Lieb, ‘§ 93 AktG marginal number 20’ in M Henssler and L Strohn (eds), Gesellschaftsrecht (5th ed. 2021); M Graumann, ‘Der Entscheidungsbegriff in § 93 Abs 1 Satz 2 AktG’ (2011) ZGR 293, 296; for a different view, see KJ Hopt and M Roth, ‘§ 93 marginal number 80’ in H Hirte, PO Mülbert, and M Roth (eds), Großkommentar zum AktG (5th ed. 2015).

95 Cf. KJ Hopt and M Roth, ‘§ 93 marginal number 80’ in H Hirte, PO Mülbert, and M Roth (eds), Großkommentar zum AktG (5th ed. 2015).

96 J Koch, ‘§ 93 marginal numbers 16, 22’ in U Hüffer and J Koch (eds), Aktiengesetz (14th ed. 2020); G Krieger and V Sailer-Coceani, ‘§ 93 marginal number 41’ in K Schmidt and M Lutter (eds), AktG (4th ed. 2020); M Lutter, ‘Die Business Judgment Rule und ihre praktische Anwendung’ (2007) 18 ZIP 841, 847; Lieder, ‘Entscheidungen’ (Footnote n 76) 533.

97 Deutscher Bundestag, ‘Begründung zum Regierungsentwurf, BT-Drucks. 15/5092’ (2005) 11; BGHZ 135, 244 (253) (Bundesgerichtshof II ZR 175/95); B Dauner-Lieb, ‘§ 93 AktG marginal number 24’ in M Henssler and L Strohn (eds), Gesellschaftsrecht (5th ed. 2021); KJ Hopt and M Roth, ‘§ 93 marginal number 90’ in H Hirte, PO Mülbert, and M Roth (eds), Großkommentar zum AktG (5th ed. 2015); G Spindler, ‘§ 93 marginal number 69’ in W Goette and M Habersack (eds), Münchener Kommentar zum AktG (5th ed. 2019); S Harbarth, ‘Unternehmerisches Ermessen des Vorstands im Interessenkonflikt’ in B Erle and others (eds), Festschrift für Peter Hommelhoff (2012) 323, 327; criticising this, G Krieger and V Sailer-Coceani, ‘§ 93 marginal number 19’ in K Schmidt and M Lutter (eds), AktG (4th ed. 2020).

98 Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1135.

99 Cf. further Noack, ‘Organisationspflichten’ (Footnote n 10) 123.

100 Cf. Enriques and Zetzsche, ‘Corporate Technologies’ (Footnote n 25) 42.

101 See supra Section III 2.

102 See supra Section III 3.

103 Fleischer, ‘Aufsichtsrat’ (Footnote n 17) 121.

104 For details, see I Erel and others, ‘Selecting Directors Using Machine Learning’ (NBER, March 2018) www.nber.org/papers/w24435.

105 Cf. Fleischer, ‘Algorithmen’ (Footnote n 17) 121; Noack, ‘Aufsichtsrat’ (Footnote n 10) 507.

106 Cf. Noack, ‘Aufsichtsrat’ (Footnote n 10) 502.

107 Cf. Noack, ‘Aufsichtsrat’ (Footnote n 10) 502; for a different view, see Strohn, ‘Rolle’ (Footnote n 10) 374.

108 S Hambloch-Gesinn and FJ Gesinn, ‘§ 111 marginal number 46’ in W Hölters (ed), AktG (3rd ed. 2017); M Habersack, ‘§ 111 marginal number 74’ in W Goette and M Habersack (eds), Münchener Kommentar zum AktG (5th ed. 2019); HC Grigoleit and L Tomasic, ‘§ 111 marginal number 49’ in HC Grigoleit (ed), AktG (2nd ed. 2020).

109 Noack, ‘Organisationspflichten’ (Footnote n 10) 140 et seq.

110 HJ Mertens and A Cahn, ‘§ 111 marginal number 52’ in W Zöllner and U Noack (eds), Kölner Kommentar zum AktG (3rd ed. 2010); A Cahn, ‘Aufsichtsrat und Business Judgment Rule’ (2013) WM 1293, 1299 (hereafter Cahn, ‘Aufsichtsrat’); M Hoffmann-Becking, ‘Das Recht des Aufsichtsrats zur Prüfung durch Sachverständige nach § 111 Abs 2 Satz 2 AktG’ (2011) ZGR 136, 146 et seq.; M Winter, ‘Die Verantwortlichkeit des Aufsichtsrats für “Corporate Compliance”’ in P Kindler and others (eds), Festschrift für Uwe Hüffer zum 70. Geburtstag (2010) 1103, 1110 et seq.

111 KJ Hopt and M Roth, ‘§ 111 marginal number 410’ in H Hirte, PO Mülbert, and M Roth (eds), Großkommentar zum AktG (5th ed. 2015); W Zöllner, ‘Aktienrechtliche Binnenkommunikation im Unternehmen’ in U Noack and G Spindler (eds), Unternehmensrecht und Internet (2001) 69, 86.

112 HJ Mertens and A Cahn, ‘§ 111 marginal number 52’ in W Zöllner and U Noack (eds), Kölner Kommentar zum AktG (3rd ed. 2010); J Koch, ‘§ 111 marginal number 21’ in U Hüffer and J Koch (eds), Aktiengesetz (14th ed. 2020); M Lutter, G Krieger, and D Verse, Rechte und Pflichten des Aufsichtsrats (7th ed. 2020) marginal number 72; Cahn, ‘Aufsichtsrat’ (Footnote n 110) 1299; G Spindler, ‘Von der Früherkennung von Risiken zum umfassenden Risikomanagement – zum Wandel des § 91 AktG unter europäischem Einfluss’ in P Kindler and others (eds), Festschrift für Uwe Hüffer zum 70. Geburtstag (2010) 985, 997 et seq.

113 For details, see Lieder, ‘Entscheidungen’ (Footnote n 76) 557 et seq., 560, 563.

114 Cf. Wagner, ‘Legal Tech 2’ (Footnote n 41) 1105; see furthermore Strohn, ‘Rolle’ (Footnote n 10) 375.

115 See supra Section III 2(e).

116 Leaning in that direction Armour and Eidenmüller, ‘Kapitalgesellschaften’ (Footnote n 9) 169 et seq.; cf. further, in general, V Boehme-Neßler, ‘Die Macht der Algorithmen und die Ohnmacht des Rechts: Wie die Digitalisierung das Recht relativiert’ (2017) NJW 3031 et seq.

117 Cf. Enriques and Zetzsche, ‘Corporate Technologies’ (Footnote n 25) 42.

118 More extensive Möslein, ‘Digitalisierung’ (Footnote n 17) 212; Weber, Kiefner, and Jobst, ‘Unternehmensführung’ (Footnote n 17) 1136; restrictive like here Armour and Eidenmüller, ‘Kapitalgesellschaften’ (Footnote n 9) 189; Enriques and Zetzsche, ‘Corporate Technologies’ (Footnote n 25) 47 et seq.; Noack, ‘Organisationspflichten’ (Footnote n 10) 142.

119 Cf. Enriques and Zetzsche, ‘Corporate Technologies’ (Footnote n 25) 50 et seq.; Strohn, ‘Rolle’ (Footnote n 10) 377.

120 NYSE, ‘Listed Company Manual’ Section 3, 303A.12 (NYSE, 25 November 2009) https://nyse.wolterskluwer.cloud/listed-company-manual.

121 FCA, ‘Listing Rules – FCA Handbook’ LR 9.8.6.R (5) (FCA, January 2021) www.handbook.fca.org.uk/handbook/LR.pdf.

122 AktG, section 161(1)(1).

123 For inclusion in the code cf. also Noack, ‘Organisationspflichten’ (Footnote n 10) 113, 142.

124 For precautionary compliance with the guidelines by the Supervisory Board, see Möslein, ‘Digitalisierung im Aufsichtsrat: Überwachungsaufgaben bei Einsatz künstlicher Intelligenz’ (2020) Der Aufsichtsrat 2(3).

125 Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, ‘Building Trust in Human-Centric Artificial Intelligence’ (EUR-Lex, 8 April 2019) https://eur-lex.europa.eu/legal-content/GA/TXT/?uri=CELEX%3A52019DC0168.

126 OECD AI Policy Observatory, ‘OECD Principles on Artificial Intelligence’ (OECD.AI, 2020) https://oecd.ai/en/ai-principles.
