Use of AI in Legal Practice

Thursday, 23 October 2025

Guidance on the Use of Generative AI in Legal Practice

The ACT Law Society recognises the growing use of generative artificial intelligence (Gen AI) tools — such as ChatGPT, Microsoft Copilot, and DeepSeek — in legal practice. While these tools can assist with drafting, summarising, and research, they also present significant risks.

Practitioners are reminded that the use of Gen AI must be consistent with their professional obligations under the Legal Profession Act 2006 and the Legal Profession (Solicitors) Conduct Rules 2015 (the Conduct Rules). 

Key Considerations 

1. Maintaining Client Confidentiality (Rule 9 of the Conduct Rules) 

Practitioners must not input confidential, sensitive, or privileged information into public Gen AI tools. These platforms may store or use submitted data in ways that breach client confidentiality. If using AI tools for drafting or analysis, ensure the platform is secure and compliant with privacy obligations. 

2. Integrity and Professional Independence (Rules 4.1.4 & 17 of the Conduct Rules) 

Gen AI cannot reason, understand context, or provide legal advice. Practitioners must exercise their own professional judgment and must not treat AI-generated output as a substitute for legal analysis tailored to a client’s specific circumstances. 

3. Being Honest and Delivering Legal Services Competently and Diligently (Rules 4.1.2 & 4.1.3 of the Conduct Rules) 

AI tools are not a replacement for legal expertise. Practitioners using AI to assist with drafting or research must be qualified to assess and verify the output. Any document submitted to a client, court, or third party must be accurate, appropriate, and professionally reviewed. 

Practitioners are reminded that Gen AI tools are prone to “hallucinations” — that is, generating content that appears plausible but is factually incorrect or entirely fictitious. This includes fabricated case law, legislation, or academic references. Practitioners must not rely on AI-generated content without independently verifying its accuracy. 

4. Charging Costs that are Fair, Reasonable and Proportionate (Rule 12.2 of the Conduct Rules) 

If Gen AI is used to support legal work, practitioners must ensure that billing reflects the actual legal work performed. Time spent verifying or correcting AI-generated content should not result in unreasonable or inflated costs to the client. 

Practical Guidance 

  • Use AI tools selectively: Limit use to low-risk tasks that are easy to verify. Avoid using AI for complex, high-risk matters without thorough review. 

  • Be aware of bias and limitations: Gen AI tools may reflect biases or produce inappropriate content, particularly in sensitive areas like criminal law or family violence. 

  • Develop internal policies: Law practices should implement clear policies on: 

      ◦ which AI tools may be used; 

      ◦ who may use them; 

      ◦ what types of information may be entered; 

      ◦ how AI-generated content is reviewed and verified. 

  • Supervise junior staff: Ensure that use of AI by junior or support staff is appropriately supervised and that outputs are reviewed by a qualified and suitably experienced practitioner. 

  • Inform clients: Where Gen AI is used in a matter, practitioners should be transparent with clients about how it was used and how it may affect costs or outcomes. 

  • Stay informed: Monitor updates from the courts, regulators, and the Society regarding the use of AI in legal practice. This includes Practice Directions, judicial commentary, and any changes to professional conduct rules. 

Related Links 

Artificial Intelligence and the Legal Profession – Law Council of Australia 

Role of AI in Legal Practice – Webinar Library

2024 Annual Blackburn Lecture