25 November 2025

Fake cases, real consequences: Wollongong lawyer weighs in on AI in the legal system

By Dione David

Generative AI is rewriting everything — except the law, where human judgement still matters most. Image: style-photography.

When new court rules banning the use of generative AI in affidavits quietly landed late last year, solicitor Graham Lancaster wasn’t surprised.

The director of Lancaster Law & Mediation says legal blunders have multiplied with the rise of generative AI, particularly fabricated legal citations and other errors in court documents.

The NSW Supreme Court’s new practice note, SC Gen 23, is one example of how the law is adapting, and similar rules apply across other courts.

“The law is regulating the use of AI and slowing down the adoption of it, and that will enable courts to see it in action,” he says. “But I can see a day when it’s used more readily.”

SC Gen 23 leaves little room for misinterpretation: AI can be used for simple tasks such as spellchecking, but there can be no AI-generated evidence, no substantive AI-drafted content and certainly no reliance on systems known to hallucinate.

“Generative AI can’t understand evidence the way humans do. It will create plausible, authoritative and coherent responses that are inaccurate or fictitious,” Graham says.

“And if we’re talking about open source, you are potentially breaching client confidentiality by introducing material into an open-source platform, where it may then be shared with other people.”


The few exceptions require formal leave of the court. Lawyers must identify the specific program they want to use, explain its privacy settings, declare whether it’s open or closed source and justify the benefit of its use.

For Graham, the caution is warranted. He has experimented with the technology himself.

“When I’ve tested it, AI has generated cases that I could not verify existed,” he says. “Using AI alone for the purpose of advising a client is dangerous. Using it for summarising a document is about as far as it’s capable of going — and sometimes even its summaries can be skewed.”

It is not a hypothetical concern. Courts in Australia and overseas have uncovered affidavits and submissions supported by fictitious authorities — imaginary cases presented with total confidence.

The consequences can be serious, including wasting the time of the court, the parties and their legal practitioners in searching for case law or legal principles that simply do not exist.

“If an affidavit is found to have been created using generative AI, there’s a good chance it could be excluded from evidence,” Graham says.

Worse, witness credibility can be destroyed and lawyers involved could face disciplinary proceedings.

“Because you are breaching a rule or direction of the court, there is the potential, if the breach is found to be deliberate, for a finding of contempt of court.”


Graham Lancaster says that while the law is not immune to the spread of AI technology, it’s no substitute for a solicitor’s skills. Photo: Lancaster Law & Mediation

He points to a recent matter handled by Lancaster Law & Mediation, in which opposing lawyers used an AI tool to analyse four years’ worth of emails, producing a table purporting to show who sent what.

“It produced very clear errors, which we were able to point out readily, so they wasted their time,” he says.

“I can see the appeal of such tools. Imagine a program reading every email and summarising it – it could save the courts significant time.

“In transactional matters, there’s nothing preventing AI from creating documents that are then checked by a human. But the skill of the legal practitioner should always be paramount.

“And if judges were encouraged to adopt AI, we might then see errors in reasoning and logic for the simple fact of convenience.”


For now, NSW judges are not permitted to use generative AI to help write their judgments or even to analyse evidence before delivering them.

What’s less certain is AI’s impact on higher education and what it means for the next generation of practitioners.

“I think we’re already seeing a loss of skills development at university,” Graham says.

“A law student can ask AI to read a case and summarise it. On the surface, it might seem a good use of AI, but it means they don’t read the case.

“Over time, the convenience of certain technologies will erode the core abilities lawyers rely on: analysis, negotiation, the ability to read people and professional judgement.”

In the end, Graham argues, the most important legal tool is one AI can’t emulate.

“There’ll be times where you’re about to do something and this little voice inside you says, ‘Hang on, is that right?’,” he says.

“AI doesn’t have that intuition, based on experience, yet it has access to all of the internet.

“Until it evolves further, AI is a tool that requires the experience, oversight and instincts of human legal practitioners for its benefits to be fully realised.”

For more information, visit Lancaster Law & Mediation.

