Law Society of WA

New AI guidelines provide clarity, but disclosure catch-all causes concern 

November 21, 2025

The Supreme Court of Western Australia has released Guidelines for the use of artificial intelligence (AI) in proceedings before the court. The Guidelines apply to lawyers, non-lawyers, self-represented litigants and expert witnesses in civil and criminal proceedings. 

The publication of the Guidelines follows the Supreme Court’s consultation in March this year on a proposed practice direction to regulate AI use in legal proceedings. The Law Society provided a submission in April, urging the court to adopt flexible, principles-based guidance, rather than a proscriptive practice direction. It is positive to see the court embrace this approach, which encourages the ethical and responsible use of AI by all court users.   

A Law Society survey in April revealed more than half of respondents used AI tools in everyday legal practice. The most common applications were legal research, practice management, generating legal documents, discovery and precedents. Litigation lawyers told the Law Society they used AI to create research notes and advice, chronologies, written submissions, affidavits, minutes of orders, witness statements, pleadings and documents to initiate proceedings. 

The Guidelines include a list of risks for court users to consider, such as whether AI tools are biased, provide outdated or incomplete information, generate inaccuracies and/or fabrications, or produce results that fail to take into account the nuances between different jurisdictions. 

The court reminds all court users to be mindful of: 

  • Privacy and confidentiality obligations 
  • The capabilities and limitations of AI tools 
  • The need for human verification of all AI generated content 
  • Ensuring AI use does not mislead the court or any other parties as to the way documents have been produced, and 
  • Following directions from the court about disclosure of AI use.  

When preparing or filing documents in Supreme Court proceedings: 

  • If you use AI to assist you, you must verify the content of the document and take legal responsibility for it 
  • If you certify or file documents or rely on the contents of a document, you are responsible for that document’s accuracy and any omissions or errors in it, regardless of whether AI was used to prepare the document 
  • You must exercise caution when preparing affidavits and witness statements to ensure AI content does not replace a witness’s own words and that the evidence reflects the witness’s personal knowledge 
  • Expert reports and opinions must comply with existing practice directions. Experts producing these reports and opinions should consider their obligations to disclose AI use in these documents. 

For legal practitioners, the Guidelines confirm existing professional obligations apply to the use of AI in court proceedings in that:  

  • You must verify that documents do not contain misleading information  
  • Your documents must contain a proper basis for all legal and factual contentions placed before the court 
  • You are personally responsible for ensuring you have fulfilled your duties to the court when filing or signing a court document, regardless of how the document was prepared. 

The Guidelines’ final direction – “Accordingly, when directed by the court or where otherwise necessary or appropriate, the use of generative AI in Supreme Court proceedings (including the preparation of any materials) should be disclosed to other parties and the court” – creates a measure of uncertainty for court users in the absence of a procedural direction. What is meant by “otherwise necessary or appropriate” and “the preparation of any materials” is open to interpretation, particularly for self-represented litigants unfamiliar with the civil and criminal procedure rules. How this aspect of the Guidelines will be applied will be interesting to watch. 

Ultimately, the Guidelines encourage lawyers to be vigilant in ensuring compliance with ethical obligations and duties when using generative AI to prepare documents in court proceedings.   

As the Honourable Chief Justice Peter Quinlan recently observed, “While the challenges posed by artificial intelligence to the legal process are no doubt different to those of the past, and must be met with new solutions, the underlying concerns are the same: authenticity, intelligibility and immediacy. Our legal tradition has dealt with these concerns before and has developed rules designed to ensure that the human character of the law and the legal process is retained and preserved.”  

Generative AI may change the way we practise law, but it does not change lawyers’ ethical duties and obligations to the court, to clients, colleagues and other parties.
