Guiding the Board: The GC’s Role in Navigating AI Strategy and Compliance

By NOGA ROSENTHAL

Given the increasingly widespread use of generative artificial intelligence in corporate environments, it’s likely that most general counsel have already stepped in and established company policies to regulate AI use. The next crucial task is ensuring board members are well-informed about AI initiatives, which is vital given the board’s fiduciary duties. To manage this effectively, general counsel should adopt a clear, step-by-step approach, guiding boards to understand AI’s complexities and balance its benefits against potential risks.

Education Is Key

General counsel should ensure board members have a basic grasp of AI, including machine learning, deep learning, and the importance of high-quality (“good”) data sets. This foundational knowledge enables the board to provide informed guidance on AI governance. Real-world demonstrations of AI tools can further reinforce this understanding, such as showcasing a chatbot trained on historical customer service answers.

Consider Adding New Expertise

Advocating for the inclusion of new board members, advisory groups or committees with AI expertise may be necessary, especially for companies focusing heavily on AI technology. Backgrounds in privacy, security, or technology can be valuable additions to the board.

Presenting Advantages And Risks

General counsel should explain how AI benefits the organization, highlighting increased productivity, cost reduction, and innovation. They should link AI initiatives to the company’s strategic goals, such as improved customer service and experience, demonstrating AI’s positive impact on those metrics. For example, the advantages of a chatbot are that it can respond to a high volume of questions at any time using consistent responses, and it can gather information from these chats to help the company improve its products and services. This presentation will likely include outlining the company’s investments in AI tools, staffing, and training.

General counsel should next present a detailed overview of the risks associated with using AI, as the board of directors is responsible for overseeing these corporate risks. Covering data privacy and security matters is crucial because AI systems often handle sensitive information. Ethical concerns and their impact on customers and employees are also significant. For instance, one risk is that customers become upset when they realize they are chatting with a bot rather than a human, which can erode customer trust and satisfaction.

Additionally, companies must comply with a regulatory landscape that is rapidly evolving, and often inconsistent across jurisdictions, which makes ongoing legal compliance a challenge. General counsel should help their boards understand the risks tied to that shifting landscape. They should also remind boards that companies must uphold ethical and social standards when using or developing AI in regions or industries where regulations have yet to catch up with the technology.

Devise Mitigation Strategies

After identifying these risks, general counsel should outline how the company plans to manage or mitigate them. General counsel must ensure that the proposed use of AI aligns with the company’s documented core values, such as integrity, diversity and inclusion, and transparency. This can include making sure the organization is transparent with its customers and lets them know they are talking to a chatbot rather than a real person.

Referring to existing company policies, such as procurement procedures for new vendors, can help mitigate risks as well. For example, most companies can point to procurement policies and procedures that require a security or privacy review of a new vendor, including an assessment of how that vendor processes company data.

Moreover, general counsel should commit to regularly updating the board on the AI strategy. This involves providing ongoing reports on compliance efforts, strategic adaptations of AI, and monitoring the effectiveness and impact of AI within the organization. Such updates will ensure that the board remains well-informed and can make decisions that align with both the company’s goals and ethical standards, thereby effectively overseeing the responsible use of AI technologies.

Examine Costs

Collaborating with management, general counsel should review the costs of AI tools or development with the board. This includes considering the impact on the workforce or hiring needs. This evaluation ensures a comprehensive understanding of the financial implications of AI initiatives.

Proactive Approach And Ongoing Coordination

General counsel should help management and boards take a proactive approach to AI tools, anticipating the impact of changing regulatory landscapes and fast-paced technology changes. The board should rely on general counsel to coordinate responses to incoming questions and concerns, identify and address new challenges, and ensure ongoing compliance with core company values.

Noga Rosenthal is a seasoned privacy compliance and data ethics professional specializing in the technology sector. She has developed and managed global privacy programs for companies such as Xaxis, Epsilon and Ampersand. Rosenthal serves as a trustee for the Practicing Law Institute and an adjunct professor at Fordham Law School.