Your robot co-counsel is already here. Can you keep it in line?
Today, AI offers a host of solutions for the modern legal department. Large language models, chatbots, and agents may assist in document review and analysis, contract lifecycle management (CLM), legal research and writing, litigation support and analytics, automated e-discovery, and other clerical support like scheduling, billing, and time tracking.
Under the American Bar Association’s Model Rule 5.3(b), a lawyer having direct supervisory authority over a nonlawyer shall make reasonable efforts to ensure that the person’s conduct is compatible with the professional obligations of the lawyer. And under Rule 5.4(c), a lawyer may not substitute a nonlawyer’s judgment for their own.
These rules were born out of the growing complexity of legal practice in the 1980s. As the business of law grew, increased reliance on paralegals and other staff raised concerns about ethics, professional accountability, and protecting the public. The same questions drive today’s conversation about artificial intelligence.
While AI is new, the rules that govern lawyers’ use of it are not. Established rules that apply to humans also apply to the use of AI. For in-house counsel, this means supervising your department’s and business units’ use — and misuse — of AI legal assistance.
Imagine this scenario: your business team circumvents legal and uses an AI legal assistant to draft and negotiate a contract. Have you breached your duties under Rules 5.3 and 5.4 and placed your license in jeopardy?
The ABA provides limited guidance through Resolution 112. This resolution addresses emerging legal and ethical issues related to the use of AI, highlighting concerns around bias, ethics, and oversight of AI systems. On AI technology and the duty of supervision, Resolution 112 states that a lawyer must know where to draw the line, but also that a lawyer must not underutilize these solutions. What that means in practice — go figure.
New! The ACC AI Center of Excellence for In-house Counsel is a brand new resource, designed specifically for in-house counsel, to help legal departments navigate AI with clarity and confidence. The AI Center of Excellence will offer:
- Curated tools and insights
- Peer learning from real-world use cases
- Ethics, risk and governance frameworks, and guidance tailored for Legal
- Leadership strategies for the AI era
Enter in-house counsel using artificial intelligence in their day-to-day work. Our collective experience is the best resource for finding a path through uncharted waters. Here are some ways ACC members are supervising their organizations’ use of legal AI, and how they ensure it is a licensed attorney’s judgment that makes the final call.
Draft AI policies
First, in-house teams are drafting AI policies for companies to adopt across the organization. Policies fulfill key supervisory duties by spelling out what colleagues in legal and other units can and cannot do with generative AI or AI assistants, and what the consequences are for misconduct.
Policies should:
- Flag what tools are covered
- Provide dos and don’ts
- Clarify disciplinary review processes and penalties for misconduct
Define key concepts
Second, lawyers should ensure that policies and guidance define key concepts like privilege, trade secret, and copyright, so that all stakeholders understand what they mean and how such information may or may not be used in relation to AI. Providing definitions furthers the duty of supervision by establishing clear, shared understandings that guide nonlawyers and AI systems to operate within legal and ethical boundaries under the lawyer’s direction.
A good definition:
- Speaks in terms all stakeholders understand
- Is comprehensive without being exhausting
- Invites all staff to learn more
Create clear workflows
Third, good AI policies provide clear workflows and chains of command, ensuring that outputs are properly reviewed and that final decisions are made by the correct principals. When it comes to legal tech, AI policy workflows should be integrated holistically into the office of the general counsel’s existing processes.
The big picture contains:
- Unambiguous steps on the path to review and approval
- Identified decision makers
- The perspective of the entire organization
Keep in mind
A few more tips to keep in mind on your AI journey:
- Spell out major questions and remove ambiguity wherever you can.
- Manage access to your legal tech. Control who can use software capable of independent legal research and writing.
- Understand your tools. Technical know-how will mean success.
- Training for all stakeholders is an essential component.
- Stay up to date — from the latest software offerings to ethics thought leaders.
A machine cannot be held accountable; it is up to you to decide who is. Build strong structures to drive your group into the future. The ACC AI Center of Excellence for In-house Counsel is your go-to resource on this venture.