Australian Lawyer Who Relied on Fake AI Cases Spared Disbarment, Placed Under Supervision
By Dr. Anne Uruegi Agi
An Australian solicitor who submitted fictitious cases generated by artificial intelligence has been allowed to remain in legal practice, though under strict supervision.
The Victorian Legal Services Board, regulator for the state’s lawyers, confirmed that the practitioner may continue to work as an employee of a law firm but only under oversight for a two-year period.
The disciplinary action stemmed from a complaint that the solicitor had presented to the court a list of authorities produced with the aid of AI. The list contained non-existent cases and inaccurate summaries. While courts in England and Wales have encountered similar issues, regulatory sanctions in those jurisdictions are still pending, making the Australian approach a useful reference point for what may follow elsewhere.
In a judgment delivered in August 2024, the Federal Circuit and Family Court of Australia recorded that the solicitor, named as Mr Dayal, had relied on AI in compiling the list. The court heard he had offered an unreserved apology and accepted that his conduct fell below professional standards, although he denied any intention to mislead.
Mr Dayal told the court that he had misunderstood the operation of the AI tool and failed to properly check its output. Judge A. Humphreys acknowledged the pressure the solicitor had been under and accepted the behaviour was unlikely to be repeated, but nevertheless referred the matter to the regulator to underline the risks involved in the growing use of AI in litigation.
The Board’s decision prevents the solicitor from serving as principal of a law firm, from handling trust monies, and from running a practice independently. During the two years of supervision, both he and his supervisor must submit quarterly compliance reports.
In a public statement, the regulator stressed: “This action demonstrates our commitment to ensuring that practitioners who adopt AI in their work do so responsibly and consistently with their professional obligations. We strongly advise lawyers to review our published guidance on the use of artificial intelligence in practice, and to consider further professional development if they intend to make AI part of their legal toolkit.”
Editorial Note
The decision highlights a growing challenge for the profession worldwide: how to balance the efficiency of emerging technologies with the duty of accuracy owed to courts and clients. For Nigerian and other Commonwealth lawyers, this case serves as an early warning. Regulators are unlikely to excuse ignorance of how AI operates. The Australian regulator’s insistence on supervision, reporting obligations, and removal of independent practice rights signals a middle path: protecting the public while recognising that misuse may stem from poor understanding rather than deliberate dishonesty. Whether Nigerian regulators will adopt a similar stance when confronted with comparable misconduct remains to be seen, but the message is clear: AI is no substitute for due diligence.