
April 29, 2025
By Rich Craven
With many AI-driven models now available to assist with admin tasks and produce written work in minutes, businesses are turning to these tools to generate their own legal documents. This includes drafting agreements, policies, and terms and conditions, with the aim of reducing reliance on external legal services and minimising costs.
However, this approach is not without risk.
Our commercial law expert, Rich Craven, explores the hidden dangers of using AI to generate legal contracts, and why seeking professional input remains crucial.
The appeal: why businesses turn to AI
AI tools promise efficiency, speed, and cost savings. For startups and small businesses, the idea of generating a contract at the click of a button, sometimes at no cost, is understandably appealing. These tools can generate basic templates, automate boilerplate clauses, and handle formatting tasks with unprecedented speed.
However, this superficial efficiency often masks a concerning truth: AI-generated contracts and legal research may lack the legal robustness, accuracy and enforceability required in real-world commercial settings.
The risks
AI systems are only as good as the data they’re trained on. Even the most advanced models can make basic and critical errors. This is a phenomenon that has become known as a “hallucination”, where seemingly plausible but incorrect, or even fabricated information, is presented as fact. These errors can result in contracts that omit key provisions, mis-state legal obligations and facts (or make them up entirely), or include unenforceable terms.
For example, in the context of an insolvency matter, Darwin Gray received a claim from someone alleging that they had rights over some property owned by a client. While they quoted extensive case law in support of their claim, it did not take our team long to discover that the purported case law did not exist. Furthermore, the legal basis on which the claim relied was improperly interpreted.
In a commercial context, such false confidence from AI tools could result in the drafting of clauses that are unenforceable and not based on real-life precedent. These aren't minor slip-ups; they're liabilities waiting to be triggered.
AI tools are also often geared towards a US audience, and therefore frequently work US legal concepts and terminology into their output. These have no place under the laws of England and Wales, may leave documents uncertain, and could ultimately render an agreement null and void. Furthermore, there is a real risk that anything generated will not comply with relevant legislation (for example, consumer protection rights).
AI has no genuine understanding of the strategic interests or unique needs of your business. A lawyer will tailor contracts to reflect your commercial background, manage specific, identified risks, and ensure the wellbeing of your business is protected.
Without significant revision, an AI-generated agreement will be a standardised document. It may meet generic legal requirements, but it can overlook critical business-specific terms, such as risk-tailored indemnities, data processing obligations, or intellectual property rights.
AI can provide a basic structure, but it may not fit in with your commercial goals or pick up on the nuances of your business.
There’s a growing risk that businesses (particularly those without in-house legal teams) may rely too heavily on AI, prioritising efficiency over expertise. Even technically correct outputs can be dangerous if misapplied. Many tools require carefully constructed prompts, and without legal knowledge, it’s easy to input the wrong information or miss crucial details entirely.
A lack of understanding on the part of both the user and the tool is a recipe for disaster.
AI tools often operate as a black box: their internal workings are not fully understood by most users. When a clause is generated, there is no way to verify its source or assess its legal accuracy. This lack of transparency makes contracts difficult to validate, a major concern when so much is at stake and the margin for error is razor-thin.
Additionally, many AI tool providers absolve themselves of any liability for errors, leaving businesses to bear the consequences alone.
Most businesses must also navigate regulatory and legislative obligations. An AI-generated agreement might omit the necessary provisions entirely, or include them in a form that does not comply with the relevant legislation.
Businesses should also be mindful of inputting sensitive commercial information into public AI systems. In addition to the potential breach of confidentiality or privacy obligations, it is unknown exactly how or where this data is used and stored, which is a significant data protection risk. Providers may also feed your sensitive business information into their AI training systems to be used again.
Another commercial issue is that insurers may exclude coverage for claims arising from AI errors. This means an invalid AI-drafted agreement could leave your business exposed with no safety net.
Final thoughts: proceed with caution
While AI may be useful in your business, you should proceed with extreme caution before using it in place of legal support. The stakes in legal drafting, particularly in a contractual context, are too high for shortcuts. Terms and conditions are not just a formality; they are the legal foundation upon which a business operates.
For expert guidance in drafting or review of your commercial agreements, contact our commercial team using the contact form or on 02920 829 100 to see how we can help.