ACC convened members to discuss how they and their organizations currently use AI. Their tips are summarized below.
How in-house counsel use AI
- Some draft contracts in a Word document and use AI tools to help improve the wording.
- Some use AI tools in collaborative communication platforms to summarize conversations. Tip: Make attendees aware that the discussion will be transcribed.
- Some use it to help draft or automate email responses.
- Some use AI to turn a Word document into presentation slides.
- Some use AI to gather insight from data in spreadsheets.
- Some use it to break down and summarize a complex legal analysis into more business-friendly explanations, using a closed AI system (internal to their organization).
AI ethics and training employees
- Some departments create an AI ethics playbook, involving stakeholders from Legal, Privacy, HR, Communications, and business teams.
- Consider including guardrails and guideposts for using AI – dos and don’ts.
- Consider outlining a step-by-step approach for vetting vendors and tools.
- Review your AI playbook periodically. Some review it twice annually.
- It’s worth repeating to business teams that they shouldn’t put confidential information in AI tools. Don’t assume that they understand that.
- Even at companies that ban the use of AI tools, employees may be using AI tools on their personal time or devices and entering the output into company projects.
- If you have your own LLM, educate the business teams to correct the tool when it gives a wrong answer, so the system can learn not to repeat the mistake.
- Remind the business team that your organization (“we”) is still responsible for what the tool does or generates, whether it is a customer service chatbot or another tool.
AI governance
- Is your company buying and selling AI tools? That impacts your policies.
- Start by putting together a small working group including business people, HR, Communications, Compliance, and Legal.
- Ask yourself and the group: what issues do you see with AI and our organization? What questions do you want to answer for the business teams?
- Once you roll out your official AI policy, consider asking employees to acknowledge that they have received the policy.
- Keep educating people regarding the policy.
- There have to be human checks and balances to monitor your organization’s AI tools and their use.
Vetting AI tools and vendors
- Understand your business partners’ risk tolerance.
- Understand what the vendor will be doing with your data.
- Understand what data you will be sharing in that tool.
- Understand the limitation-of-liability clauses, the kind of indemnity the vendor will grant, and the length of the contract term.
- If you purchase an AI tool, consider the difference between a free version and an enterprise version. The enterprise version costs more, but it often includes safeguards regarding the confidentiality of your data.
- Challenge the business to ensure that they’ve done their due diligence on the vendor – is it a brand-new vendor with very limited assets, or a long-established technology company?
- Look at your current trusted vendors with which you have an established relationship – you may have more leverage to negotiate with them when adding an AI feature.
- Ask yourself whether the AI vendor or tool will create any issues with respect to the laws and regulations that apply to your organization.
Facilitating the adoption of AI tools
- Understand what the objectives of the business are in using the AI tool.
- Ask what the ROI will be on the business using the AI product.
- Consider the potential need to train the business regarding the use of the new tool.
- Consider needs to adjust your policies in light of the new tool.
- Build trust with the team. Difficult topics are easier to discuss if they see you as a partner rather than the “deal police”. Make sure they feel you are on the same side, part of the same team.
- Explain concepts using analogies, and avoid legal jargon.
Recent issues and items still on the radar
- AI hallucinations are still a concern – consider how to improve the tools you use to prevent them.
- Questions about disclosing the use of AI are arising. A prominent retail brand was recently targeted by a lawsuit relating to its use of a third-party AI tool to analyze callers’ words and how the caller is feeling.
- Bias concerns are on the radar – ensure that the responses the tool provides are not discriminatory.
- Get familiar with the EU AI Act.
- Cybersecurity threats intersect with AI. Check out the AI and Cybersecurity Checklist in the Cybersecurity Toolkit for In-house Lawyers.
- Learn more with the ACC AI resource collection: Artificial Intelligence Insights | Association of Corporate Counsel (ACC).
Disclaimer: The information in any resource on this website should not be construed as legal advice or as a legal opinion on specific facts, and should not be considered as representing the views of its authors, its sponsors, and/or ACC. These resources are not intended as a definitive statement on the subject addressed. Rather, they are intended to serve as a tool providing practical guidance and references for the busy in-house practitioner and other readers.