If a court case were held to decide whether society should welcome or oppose artificial intelligence (AI), the jury would most likely be hung.
There is no clear consensus on whether the technology’s advantages, such as automating written tasks and rapidly sifting through vast amounts of data, outweigh its drawbacks: biased training data, accuracy problems, and a lack of accountability.
For the legal profession, AI poses both a threat and an opportunity. A 2021 report from the UK’s Law Society predicted that the technology could lead to a significant reduction in human jobs, and a recent study by researchers at the universities of Pennsylvania, New York, and Princeton identified the legal sector as the industry most exposed to AI.
Yet AI can also play a valuable role in legal research and case preparation, even though things have sometimes gone badly wrong. New York lawyer Steven Schwartz found himself facing his own legal hearing after he used ChatGPT to research precedents for a personal injury case against an airline: six of the seven cases the AI supplied turned out to be entirely fabricated.
The incident may have made many law firms wary of such systems. Ben Allgrove, Chief Innovation Officer at the international law firm Baker McKenzie, offers a different perspective: he sees it less as a technology problem than as a question of a lawyer’s professionalism and ethics. In his view, the failure was Mr. Schwartz’s lapse in professional standards, not the tool’s unsuitability.
Several law firms are already integrating AI into their operations.
Baker McKenzie has been tracking AI developments since 2017, when it set up a team of lawyers, data scientists, and data engineers to evaluate emerging AI systems. Mr. Allgrove expects that much of the firm’s AI use will come through AI-powered versions of established legal software from providers such as LexisNexis and Microsoft’s 365 Solution for Legal.
LexisNexis, for instance, launched its AI platform in May; it can answer legal questions, generate documents, and summarize legal issues. Microsoft’s AI tool, Copilot, is due to launch as an extra-cost add-on for commercial customers.
These tools are examples of generative AI, which can create text, images, and music based on the data it was trained on. Their premium versions are currently expensive, however: licensing Microsoft’s Copilot alone would significantly increase a firm’s technology spend.
Alternatively, law firms can explore more affordable options by leveraging AI systems not explicitly designed for the legal market, such as Google’s Bard, Meta’s Llama, and OpenAI’s ChatGPT. Firms can adapt these platforms for their specific legal needs.
Baker McKenzie is testing a range of AI models to assess their performance, which is crucial given that all AI systems are prone to errors. One legal software system built on AI is RobinAI, whose AI co-pilot speeds up the drafting and querying of contracts, both for in-house legal teams at large organizations and for individual users.
RobinAI primarily uses an AI system developed by Anthropic, a company founded by a former vice president of research at OpenAI and backed by Google. Alongside such third-party models, RobinAI has trained its own AI models in the intricacies of contract law. Every contract used with the system is uploaded and labeled, becoming learning material for those models, and the firm has built up a substantial database of contracts as a result. According to Karolina Lukoszova, co-head of legal and product at RobinAI, that database is integral to the future use of AI in the legal profession.
Ms. Lukoszova expects companies to train their own smaller AI models on in-house data, an approach she says yields better and more secure results.
To check the accuracy of the AI’s output, RobinAI also employs a team of human lawyers who work alongside it.
Employment lawyer Alex Monaco, who runs his own solicitor’s practice and the tech firm Grapple, is enthusiastic about AI’s potential to open the legal profession to the public. Grapple was designed to give people what Mr. Monaco calls “an ontology of employment law”: guidance on workplace issues from bullying and harassment to redundancy, along with generated legal documents and case summaries.
Mr. Monaco believes AI could democratize the law. Roughly 95% of the inquiries he receives, he says, come from people who cannot afford a lawyer. Thanks to freely available AI tools, they can now begin building their own cases: anyone with an internet connection can use Bard or ChatGPT to help draft a legal document. The result may not match what a lawyer would produce, but it costs nothing.
Mr. Monaco stresses that AI is not replacing humans or lawyers; rather, it improves people’s understanding of their legal rights and their ability to exercise them. In a world where AI is widely deployed, that matters all the more.
Companies are increasingly using AI across employment processes, including hiring, firing, CV profiling, restructuring, and large-scale layoffs, raising concern that the technology is being deployed to the detriment of the average employee.
While AI adoption in the legal field is still in its early stages, some systems are already facing legal challenges of their own. DoNotPay, billed as the world’s first robot lawyer for tasks such as contesting parking fines and other citizen claims, has been hit by a series of lawsuits, the most recent accusing the firm of practicing law without a license.
In addition, following the Schwartz case, several senior US judges now require lawyers to disclose whether AI was used to prepare court filings. Mr. Monaco believes such rules will be hard to define and enforce, because AI is woven into everyday activities, including legal research performed through tools like Google’s Bard.