It is hard to believe that it was only two years ago, in November 2022, that ChatGPT took the world by storm, changing how we work almost overnight. There is no question that artificial intelligence (AI) and generative AI (GenAI) are reshaping and potentially disrupting entire industries. The opportunities for companies are enormous, particularly around corporate strategy and innovation.
This article explores what board directors are (or should be) doing to ensure they provide appropriate oversight of the deployment of this transformative technology. The spark for this article came at the intersection of two simultaneous endeavors: serving as an advisor to an exciting GenAI start-up while also pursuing and preparing for a board director role. I became very curious about how boards are addressing the risks and opportunities of this new disruptive technology, and I found the results quite interesting.
For Most Boards, GenAI Has Not Made it to the Top of the Agenda
A recent Deloitte survey* finds that, to date, board-level engagement with AI has been quite limited across industries and geographies. Per this survey, “nearly half (45%) of respondents say AI hasn’t yet made it onto their board’s agenda at all”. Interestingly, only 14% of respondents say their board discusses AI at every meeting.
These findings are consistent with those of the National Association of Corporate Directors (NACD)*, which, in conjunction with the Data & Trust Alliance, found that while 95% of directors believe AI will impact their business over the next year, only 28% say it is a regular topic on the board's agenda.
The Risks are Increasing
GenAI is complex and evolving every day and, unlike other technologies, it is embedded at all levels of the organization and across all employees, regardless of their technology competence. Accordingly, the potential for misuse and unintended consequences is heightened, and companies are increasingly exposed to reputational, financial, operational, and legal risks from deploying AI systems and strategies.
These risks are drawing the attention of institutional investors, regulators, lawmakers, and other stakeholders, who expect increased disclosure and transparency around AI strategies and risks. Based on a recent study of S&P 500 Form 10-K filings, the number of companies citing AI as a risk factor increased from 49 in 2022 to 281 in 2024. This scrutiny will continue to grow and evolve.
Implications for the Board of Directors
Per the Harvard Law School Forum on Corporate Governance*, Board responsibility requires oversight of the exercise of authority delegated to management, including oversight of legal compliance, ethics, and enterprise risk management. The Board's oversight responsibilities extend to the company's use of AI, which requires the same fiduciary diligence and the same focus on internal controls and policies.
Where to Begin: Questions to Ask
To ensure that Boards are knowledgeable and fulfilling their fiduciary duties in relation to AI, the following is a brief checklist of key questions and topics Boards should use to deepen their understanding and identify gaps that must be addressed:
Do you have the expertise?
Boards must ensure they have the knowledge to ask the right questions about AI strategies. Per Deloitte's 2024 survey*, nearly 80% of respondents indicated that their boards have limited or no knowledge of, or experience with, AI. A base level of knowledge can be attained through specialized training, access to an AI/GenAI expert, or the addition of a new board member or special committee with the requisite expertise. There is no one-size-fits-all approach, but the Board must have a level of understanding sufficient to ask the right questions of management, commensurate with the relevance of AI transformation to the company it serves.
Do you understand the implications of AI/GenAI for the company and its industry?
The Board should understand the bigger picture of strategic opportunities unique to the company and the potential for industry disruption. Directors must know how AI is being used throughout the company's processes, within its products, and in third-party products the company relies on. Equally important is understanding whether management has the resources and expertise needed to execute its AI strategy, how success will be measured, and who is responsible for it.
Do you know the risks?
With the requisite level of expertise, the Board must fully understand the risks its AI strategy creates, based on the company's unique profile and industry. Because these strategies are highly dependent on data, they may not function as intended, creating adverse outcomes that include false or misleading content, discriminatory practices, violations of data rights and intellectual property, privacy issues, and cybersecurity breaches, among other risks.
Other risks may include the potential impact on customers and on other external and internal stakeholders such as suppliers and employees (including future talent needs). Given the fast pace of technological and regulatory change, Boards must also ensure an effective, ongoing process to identify and address new risks and compliance issues, as well as potential disruption to the company's operations and industry overall.
Do you understand your role in oversight and compliance?
According to an Ernst & Young report* on ways boards can support the effective use of AI, AI policy oversight should be a key board mandate, and boards must confirm that management teams incorporate responsible AI at the start of the deployment journey.
Boards must ask specific questions about the company's AI governance framework, including the cadence of its review and monitoring. Boards should establish regular, systematic updates from management on AI opportunities and risks, and management, in turn, should act on any known issues. Given ever-evolving legal, regulatory, and ethical obligations, the Board should ensure ongoing compliance.
Conclusion
A Board’s fiduciary and oversight role extends to the implications of AI. Current surveys indicate that Boards may be falling behind on this responsibility. Boards must, therefore, keep up to date on technological developments and the deployment of AI systems at the companies they serve. They must keep pace with the evolving stakeholder expectations and AI regulatory landscape.
*SOURCES
"Deloitte Global Boardroom Program AI/GenAI board governance survey," Deloitte, June 2024 (Deloitte survey)
"AI and Board Governance," prepared by the NACD in partnership with the Data & Trust Alliance, 2023 (National Association of Corporate Directors (NACD))
"AI and the Role of the Board of Directors," Harvard Law School Forum on Corporate Governance, October 2023 (Harvard Law School Forum on Corporate Governance)
"Four ways boards can support the effective use of AI," prepared by Ernst & Young, October 2024 (Ernst & Young report)