The Management and the Advice of (Un)Explainable AI.
In: European Company & Financial Law Review, Vol. 20 (2023-12-15), Issue 5/6, pp. 794-821
The article addresses an issue widely discussed in Germany and other jurisdictions: may the management of a company use AI applications in its decision-making process without violating its fiduciary duties? The lack of transparency in conventional AI applications conflicts with the fiduciary duty to check the plausibility of external expert advice (known in Germany as the ISION principles). This tension can be partly resolved by using explainable AI (XAI). In this work, we review the basic principles of machine learning and XAI and discuss them in the legal context. [ABSTRACT FROM AUTHOR]
Copyright of European Company & Financial Law Review is the property of De Gruyter and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
| Title: | The Management and the Advice of (Un)Explainable AI. |
|---|---|
| Author(s) / Contributors: | Dubovitskaya, Elena ; Buchholz, Annika |
| Journal: | European Company & Financial Law Review, Vol. 20 (2023-12-15), Issue 5/6, pp. 794-821 |
| Published: | 2023 |
| Media type: | academicJournal |
| ISSN: | 1613-2548 (print) |
| DOI: | 10.1515/ecfr-2023-0033 |
| Keywords: | |
| Miscellaneous: | |