The rise of generative AI technologies presents new opportunities for general counsel (GCs). As one GC recently observed, “The legal department is a vital connection point across a company’s fragmented data landscape. Because of that, a GC’s core role now requires a deep understanding of data across various domains.”
Delving deeper, she explained, “On the legal team, privacy experts, eDiscovery specialists, and compliance officers each have a unique perspective on the organization’s data story. But no individual sees the complete picture.”
She added, “Business stakeholders such as the chief information officer, chief security officer, IT team members, and others manage specific risks. They each fight individual fires without considering the full data landscape.”
The legal department is the central connector, shepherding everyone’s efforts to use and manage data wisely.
In this pivotal position, GCs must possess data literacy skills to understand their company’s complete data story and the associated risks and benefits.
Here are a few key lessons:
Understand data inputs and assess bias.
The data AI systems use significantly impacts their output. Generative AI systems can inadvertently perpetuate biases present in their training data or model design. To evaluate the ethical implications of using specific datasets or AI models, GCs must first:
- Understand how biases can arise from skewed or incomplete datasets.
- Be able to identify potential biases in data.
- Know how to ensure fairness in AI-based decision-making processes.
Success requires a firm grasp of fundamental data concepts, including data types, sources, quality, and reliability. With these skills, you can make informed decisions and work to ensure fairness in AI-generated outputs.
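The checklist above is conceptual, but even a simple automated screen can help a legal team spot one common source of bias: under-representation of a group in a dataset. Below is a minimal, illustrative sketch in Python; the function name, the field name, and the 15% threshold are assumptions chosen for illustration, not a standard or a substitute for a full fairness review.

```python
from collections import Counter

def representation_report(records, group_field, min_share=0.15):
    """Compute each group's share of the dataset and flag groups
    whose share falls below `min_share` (an assumed threshold)."""
    counts = Counter(r[group_field] for r in records)
    total = sum(counts.values())
    shares = {group: n / total for group, n in counts.items()}
    flagged = sorted(g for g, s in shares.items() if s < min_share)
    return shares, flagged

# Toy example: 9 records from region "A", 1 from region "B".
data = [{"region": "A"}] * 9 + [{"region": "B"}]
shares, flagged = representation_report(data, "region")
# Region "B" holds only 10% of the records, below the 15% threshold,
# so it is flagged as potentially under-represented.
```

A check like this only surfaces skew in one attribute; fairness in AI-based decision-making also depends on how the model uses that data, which is why the fundamentals listed above still matter.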
Comply with data privacy and protection laws and regulations.
Data protection laws such as the General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), and others impose strict requirements on collecting, processing, and storing personal data. They also raise ethical concerns about consent and transparency.
Even gaining a foothold in AI’s ever-changing risk landscape requires understanding how sensitive data is collected, stored, processed, and transferred in large language models (LLMs) and other AI systems.
Data-related risks continue to grow in complexity. Today, companies can customize AI tools by using their own data to train AI models for specific use cases. However, doing so can introduce new risks and amplify others, such as unauthorized access and data loss. As unknown risks and new laws and regulations arise, GCs will help companies implement best practices for data protection and for mitigating the risk of data breaches and other cyberattacks.
Guide IP, contracts, and licensing agreements.
In-house lawyers may need to identify potential intellectual property issues related to the datasets AI systems ingest and the outputs they generate, including questions of fair use and copyright infringement.
Many in-house lawyers must evaluate vendor agreements that involve data considerations. As organizations collaborate through data sharing or the use of AI systems, GCs will need to review, negotiate, and draft contracts that address data ownership, licensing rights, confidentiality obligations, and liability considerations.
In most instances, data holds value and can be a source of competitive advantage. Data literacy skills are essential to ensure the data AI uses is appropriate, compliant, and not misused. For example, legal work can get complicated when AI uses data from multiple sources, such as public and proprietary databases, which may be subject to differing laws and regulations.
Meet industry-specific AI compliance requirements.
Because GCs help shape their organizations’ legal strategy and direction, they stay up-to-date with many industry regulations and standards (e.g., HIPAA and the rules of FINRA and the SEC). As business grows increasingly AI- and data-driven, new regulations and standards governing AI systems will emerge, particularly in healthcare, finance, and consumer protection.
Lawyers need sharp data literacy skills to assess the legality, ethical implications, and potential risks of using AI systems.
Whatever shape future regulations take, in-house legal teams must understand data governance and data-related processes to strategize, make informed decisions, mitigate legal challenges, and comply with industry-specific requirements.
Legal is a crucial connector of data across the enterprise.
The legal department is a crucial connector among business parties across a complex and fragmented data landscape. As a legal leader, you work at the intersection of many different perspectives, requiring that you not only understand, analyze, and use data but also monitor, protect, and manage data efficiently. This makes it imperative to prioritize data literacy as a core competency in the professional development of your legal teams.
By prioritizing and providing data literacy training, you empower your legal team to embrace new technologies, foster collaboration, and drive innovation. Your leadership can help your organization confidently emerge as a trailblazer in the AI era.
Disclaimer: The information in any resource on this website should not be construed as legal advice or as a legal opinion on specific facts, and should not be considered as representing the views of its authors, its sponsors, and/or ACC. These resources are not intended as a definitive statement on the subject addressed. Rather, they are intended to serve as a tool providing practical guidance and references for the busy in-house practitioner and other readers. Information and opinions shared are personal and do not represent the author’s current or previous employer.