AI Use Policy
How Duan & Duan UK LLP uses artificial-intelligence tools in client work — and the boundaries we apply.
Purpose
This policy explains how Duan & Duan UK LLP uses artificial-intelligence tools in the course of its client work, and the safeguards we apply. It is written in plain language for clients, prospective clients, and counterparties, and reflects guidance issued by the Solicitors Regulation Authority and the Law Society of England and Wales on the use of generative AI in the legal profession.
Our principles
Our use of AI is governed by three commitments:
- AI does not formulate legal advice. AI tools are used as aids to legal research, drafting, translation, and administrative support. A qualified solicitor retains full responsibility for every piece of advice we give. AI output is reviewed by a human before it is relied on or shared with a client.
- Client-confidential information is protected. We do not enter client-confidential information into any public, free, or consumer-tier AI tool. Where AI is used on client work, we use only enterprise-licensed platforms that contractually guarantee our inputs will not be used to train their underlying models and that apply appropriate data-residency and access-control protections.
- Our professional obligations are unchanged. Our duties under the SRA Standards and Regulations — including our duties of competence, confidentiality, and acting in our clients' best interests — apply equally whether or not AI has been involved in producing a piece of work.
Which tools we use
The firm presently uses the following enterprise-licensed AI platforms, each chosen for its data-privacy commitments and its suitability for a specific set of tasks:
- Thomson Reuters CoCounsel — a legal-specific AI assistant used for legal research, document review, summarisation, and deposition-style analysis of large document sets. Operates under Thomson Reuters's enterprise terms.
- Claude for Teams (Anthropic) — used for longer-form drafting support, document analysis, and bilingual (English–Chinese) assistance. Anthropic's commercial terms provide that inputs are not used to train Claude models.
- Microsoft 365 Copilot — used inside our Microsoft 365 tenant for productivity tasks (email drafting, document search, summarisation). Operates within the firm's own data boundary; Microsoft's commercial terms provide that tenant content is not used to train foundation models.
- ChatGPT Enterprise (OpenAI) — used for general drafting assistance, summarisation, and structured analysis. OpenAI's enterprise terms provide SOC 2 Type 2 controls and commit that enterprise data is not used to train OpenAI's models.
We do not use consumer-tier or free-tier versions of any AI service for client work.
What AI is used for
In practice, AI is used at Duan & Duan UK LLP for:
- legal research across English, EU, and PRC sources;
- drafting assistance on correspondence, internal memoranda, and first-draft sections of longer documents;
- document summarisation and first-pass document review;
- bilingual translation drafting between English and Chinese (with qualified human review before any translation is relied on);
- administrative tasks such as scheduling, email triage, and meeting summaries.
What AI is not used for
AI is not used to:
- deliver final legal advice to a client;
- approve or sign off a document or a settlement position;
- act in place of a qualified solicitor at any point where professional judgement is required;
- process material client-confidential information through any tool whose terms do not provide enterprise-grade privacy protection.
Transparency with clients
Clients are entitled to ask at any stage whether and how AI tools have been used on their matter, and we will answer openly. Where we consider that AI has played a material role in producing the work product a client sees, we disclose that on request.
Review
This policy was prepared and last reviewed on 18 April 2026. It will be reviewed at least annually, and sooner whenever the set of tools we use materially changes or the SRA or the Law Society issues new guidance.