WA Lawyer Referred to Regulator Over AI-Generated Fake Citations
A Western Australian lawyer has been referred to the state’s Legal Practice Board after relying on artificial intelligence to draft documents that included references to non-existent cases in an immigration matter.
The incident is part of a growing trend: more than 20 similar cases have been reported in Australian courts since 2023, with judges cautioning against the uncritical use of AI in legal practice.
In a judgment delivered this week, Federal Court Justice Arran Gerrard ordered the anonymised lawyer to pay $8,371.30 in costs to the federal government and referred him to the Legal Practice Board of Western Australia for consideration. The submissions prepared for the immigration case contained four fictitious case citations, which were identified by the minister’s legal team.
Justice Gerrard remarked that the situation “demonstrates the inherent dangers associated with practitioners solely relying on the use of artificial intelligence in the preparation of court documents and the way in which that interacts with a practitioner’s duty to the court”.
According to an affidavit filed in the matter, the lawyer admitted to relying on Anthropic’s Claude AI “as a research tool to identify potentially relevant authorities and to improve my legal arguments and position”, before using Microsoft Copilot to check the work. He acknowledged that he had “developed an overconfidence in relying on AI tools and failed to adequately verify the generated results”.
The practitioner further conceded: “I had an incorrect assumption that content generated by AI tools would be inherently reliable, which led me to neglect independently verifying all citations through established legal databases.” He apologised unreservedly to the court and the minister’s counsel.
While Justice Gerrard noted that the court “does not adopt a luddite approach” to the use of AI and recognised its appeal in complex areas such as migration law, he stressed the risks. He warned that cases could be “undermined by rank incompetence” and that false citations “significantly waste the time and resources of opposing parties and the court” while harming the profession’s reputation.
He emphasised that lawyers must not only confirm the existence of cases but also engage with their substance, stating: “Legal principles are not simply slogans which can be affixed to submissions without context or analysis.”
Similar warnings have been issued in cases in New South Wales and Victoria. Just last week, a Victorian Supreme Court judge criticised lawyers for a murder defendant after they filed misleading submissions that included fabricated case references and inaccurate quotes from a parliamentary speech.
The issue is not confined to legal practitioners. Earlier this month, NSW Chief Justice Andrew Bell commented on a trusts case where a self-represented litigant admitted to using AI to prepare her oral submissions. Bell acknowledged her effort to represent herself but cautioned that AI-generated material “may introduce added costs and complexity” and “add to the burden of other parties and the court in responding to it”. He added that while generative AI might aid access to justice, “the present case illustrates the need for judicial vigilance in its use, especially but not only, by unrepresented litigants”.
Juliana Warner, president of the Law Council of Australia, said that while advanced AI tools could support the profession in administrative tasks, they did not replace a lawyer’s professional judgment. “Where these tools are utilised by lawyers, this must be done with extreme care,” she said. “Lawyers must always keep front of mind their professional and ethical obligations to the court and to their clients.”
Warner also noted that courts were treating fake citations as a “serious concern”, but cautioned that a blanket ban on AI would be “neither practical nor proportionate, and risks hindering innovation and access to justice”.
While no such case has yet been reported in Nigeria, the increasing reliance on generative AI tools by Nigerian lawyers makes these developments a cautionary tale. Practitioners in the country must remember that their duty to the court and their clients cannot be delegated to machines. AI may assist with drafting or research, but it is not a substitute for independent verification, sound legal reasoning, and professional judgment.