Artificial Intelligence (“AI”) is no longer the “future of law”; it is becoming embedded in daily practice. Junior lawyers are already using it to generate first-draft pleadings, summarise discovery, review contracts, and streamline research. The novelty has worn off. The focus now is on managing the associated risks.
The biggest of those risks is hallucination. Ignoring it is not an option.
The Hallucination Problem
Unlike legal databases, AI models don’t “know” the law. They generate text that is statistically plausible, not text that has been verified as correct, and the results can be flatly wrong. The danger is that wrong answers look just as polished and convincing as right ones.
A growing concern is that AI systems are increasingly trained on content generated by other AI tools. This creates a feedback loop in which inaccuracies are recycled and amplified over time, making errors harder to detect and more likely to spread.
Real examples
- Fabricated cases: In the US case of Mata v Avianca,1 lawyers filed submissions citing six authorities produced by ChatGPT. Every one of them was fake. The lawyers were fined, and the case became a global warning to the profession.
- Closer to home: In Australia, a Victorian KC acting for a minor in a murder case recently filed submissions containing AI-generated citations to non-existent Supreme Court authorities. The errors caused a 24-hour delay in resolving the case.
- Misstating principles: An AI summary of Baumgartner v Baumgartner (1987) 164 CLR 137 described the constructive trust principle as requiring “mutual intent”, which is incorrect.
- Inventing statutes: Some AI tools have confidently cited provisions of the Fair Work Act 2009 (Cth) that simply do not exist, such as “s 112A.”
- Outdated law: AI can also rely on repealed legislation, for example citing the Property Law Act 1974 (Qld) instead of the Property Law Act 2023 (Qld).
None of these are harmless errors. Each has the potential to undermine a client’s case, erode credibility before a judge, and expose practitioners to negligence claims or professional sanctions.
Professional duties don’t shift
The presence of AI doesn’t change a solicitor’s professional obligations. Solicitors owe duties to their client and a paramount duty to the court, and neither is displaced by the use of AI. If a pleading cites a hallucinated case, it is your client, your name, and your practising certificate that are at risk.
This means:
- Check everything: Always read the full judgment before relying on an authority.
- Maintain confidentiality: Rule 9 of the Australian Solicitors’ Conduct Rules requires practitioners not to disclose information that is confidential to a client. Do not upload sensitive client documents to unsecured platforms, and ask your supervisor to confirm which platforms, if any, are approved for use at your firm.
- Exercise judgement: AI can be a drafting tool, but it is not a substitute for strategic legal advice. There may be commercial considerations outside the scope of the instructions provided, and therefore outside the scope of the AI’s output, that make the advice inappropriate for the client.
- Comply with Practice Directions: For instance, Queensland Supreme Court Practice Direction 5 of 2025 requires practitioners to identify the individual responsible for preparing submissions. This reflects the Court’s recognition of the growing use of AI in litigation and the potential harm that inaccuracies can cause.
Courts and regulators are already signalling their expectation that lawyers will use AI responsibly. Practitioners cannot later shift blame to the machines.
Why Junior Lawyers are most affected
The tasks most vulnerable to hallucinations are research, drafting, and collation. Such tasks are precisely those given to junior lawyers. Partners will expect you to spot the errors. Failing to pick up a fabricated case or a misstated principle will not only damage your credibility but also put the client and firm at risk.
Far from diminishing your role, this reality elevates it. Your value lies in your ability to filter, verify, and supervise the technology. AI may deliver a draft in seconds, but it takes a lawyer to turn that draft into something reliable and persuasive.
Hallucination Checklist
Before you rely on any AI-assisted output, run through this checklist:
- Case citations: Does the case exist, and do the judgment and its ratio decidendi say what the AI claims?
- Statutory references: Is the section real, in force, and from the correct Act?
- Quotations: Are they accurate, in context, and not paraphrased in error?
- Jurisdiction: Is the authority from the right court, state, and country?
- Dates and figures: Has the AI swapped numbers, misstated timelines, or fabricated facts to fit the picture?
The Bottom Line – Mind over Machine
AI is no longer an optional extra in law; it is becoming a baseline expectation. Clients want the efficiency, and firms need the speed. But hallucinations are the trap that will catch out the unwary.
In an environment where machines can draft almost anything, human judgement has never been more valuable. That is where junior lawyers prove their worth.
If there is one takeaway from this article, let it be this: Always verify everything produced by AI.
Eloise Turnbull
Associate at Sajen Legal
1 Mata v. Avianca, Inc., No. 22-cv-1461 (PKC), 2023 WL 4114965, at *2 (S.D.N.Y. June 22, 2023).