What happens when… you use an AI-generated lawyer to represent you in court?

A man in the US made headlines recently for trying to use an ‘AI lawyer’ to argue his case in court. This isn’t the first time the use of artificial intelligence (AI) in a legal context has caused controversy. A family lawyer in Victoria was referred to the complaints and disciplinary body in July last year, and an immigration lawyer in New South Wales was referred to the Office of the NSW Legal Services Commissioner in February this year, both for filing court documents containing AI-generated citations to non-existent cases. So what are the rules about using AI in legal proceedings in Australia?

Each court in Australia has the power to issue practice directions or guidelines governing how that court operates and how litigants and lawyers may use it. In relation to AI, for example, the Supreme Court of Victoria has issued guidelines on the responsible use of AI, and the Supreme Court of Western Australia is consulting with the profession on developing practice directions for the use of AI in proceedings before the court. The Victorian guidelines require parties to disclose to each other and to the court when AI has been used.

Legal practitioners must also bear in mind their professional obligations to both their clients and the court. These include maintaining client confidentiality, avoiding any compromise to their integrity and professional independence, the duty not to deceive or mislead the court, and the duty to deliver legal services competently and diligently.

It is unclear how an Australian court would deal with someone using an AI-generated lawyer in court proceedings. Victoria, NSW and WA have adopted the Legal Profession Uniform Law, together with the Uniform Conduct Rules. The Uniform Law provides that only ‘qualified entities’ can engage in legal practice, so the company responsible for creating the technology may fall afoul of that provision. However, these standards apply only to legal practitioners, not to people representing themselves in court, so there may well be no way to formally reprimand a self-represented litigant for using an AI-generated lawyer to argue their case, other than potentially contempt of court. Courts may, however, refuse to place weight on submissions or documents that have been generated by AI. The risk of court disapproval (as occurred in the US) may be sufficient to discourage this behaviour in future.

As the Victorian and NSW examples above show, a legal practitioner who does not comply with their professional obligations risks being referred to the relevant disciplinary body for misconduct. Most courts have the power to refer legal practitioners to these bodies, but complaints can also be made by other members of the legal profession and by clients. Penalties for a finding of misconduct vary, but include fines, suspension and even ‘striking off’ (where a practitioner is removed from the roll or register and can no longer practise as a lawyer).

AI is everywhere, and the legal profession is not immune to its effects or to its potential to make some tasks quicker and easier. That said, lawyers need to be aware of the pitfalls of simply transposing AI-generated results into court documents without thoroughly checking their accuracy. Lawyers cannot defer or delegate their crucial role in the legal system to AI – ultimately, AI cannot exercise legal judgment or professional skill. Self-represented litigants should likewise educate themselves about the limitations of AI and be conscious of the need to fact-check any information it provides.

Note that the content of this blog does not apply in all jurisdictions, does not constitute legal advice, and should not be relied upon. You should seek legal advice in relation to any particular matters you may have. All opinions expressed are our own, not necessarily those of any organisations with which we are connected.
