The Court of Appeal recently issued guidance addressing the risks associated with the use of AI in legal proceedings, setting out principles to be applied where such technology is used in a litigation context.
Court of Appeal
In the recent case Guerin v O’Doherty [2026], Costello P of the Court of Appeal addressed the fact that the defendant lay litigant had used AI to prepare her written submissions. These materials referenced authorities which did not exist ("hallucinations" generated by the AI system), which Costello P noted is "an inherent and well-known risk of using AI to write legal submissions". Such misuse of AI "leads to completely wasted time and costs", and "casts an unfair burden on the opposing party" as it requires them to "fruitlessly…attempt to locate the hallucinated authorities". Costello P also noted that such use of AI "potentially brings the administration of justice into disrepute and may result in misleading the court".
The Court of Appeal set out the following principles for the proper use of AI in legal proceedings:
- Parties are entitled to use AI to assist in carrying out research in respect of their case - provided they do so responsibly and do not, even inadvertently, mislead the court by advancing propositions or relying upon supposed authorities which in fact have no foundation or are "hallucinations".
- Where parties opt to use AI in the preparation of a case, they should expressly inform both the other parties to the case and the court of their use of AI in this regard.
- A lay litigant is just as responsible for the written or oral work in their case as lawyers are.
- Any party who uses AI as part of their research must independently verify the accuracy of their submissions and the authorities cited as supposedly establishing the propositions advanced.
- No authority should be cited by a party unless it is a genuine judgment of the court and supports, or arguably could support, the proposition advanced.
The Court of Appeal emphasised that the use of unverified sources in legal submissions leads to completely wasted time and costs, and casts an unfair burden on the opposing party in the preparation of replying submissions and books of authorities. The Court noted that it can impose "a variety of sanctions" where parties use AI in breach of these guidelines and where such use has the potential to mislead the court. However, no sanction was imposed on the defendant in this case, as Costello P did not believe that there was an intention to mislead the court. In addition, when the defendant was preparing her submissions, no guidance was available to lay litigants on the use of AI-generated material in legal proceedings.
The Guerin judgment expands on the dicta in the recent case of Von Geltz v Kelly [2026], where the Court of Appeal addressed the fact that the plaintiff's submissions were littered with propositions of law which were "unsupported by authority; reference to authorities which have nothing to do with the asserted propositions of law; wrong citations; and a few non-existent cases." The Court of Appeal in that case noted that every litigant has a responsibility to check the output of whatever tool is used, to ensure that “opponents were not sent on a wild goose chase and ultimately that the Court was not presented with rubbish”.
Labour Court and WRC
The Labour Court also recently published guidance in response to the growing use of AI by parties to litigation. The Labour Court has advocated for the exercise of caution, stating that parties should "be aware of the limits, risks and shortcomings of any AI programme/tool which they use in the preparation of their case". Such risks may include:
- Hallucinations;
- Risks to confidentiality when using a public AI tool (such as ChatGPT);
- Inaccurate or biased underlying data. Users should be aware that information obtained via AI tools may not be up to date or relevant to Ireland.
In line with the Court of Appeal's guidelines, the Labour Court emphasises that all parties, irrespective of representation, are responsible for the documentation they put before the court. Any submission that is wrong or misleading because of reliance on AI (or otherwise) may negatively impact the relevant party's case.
This guidance follows the publication of similar guidance by the Workplace Relations Commission ("WRC") in October 2025, as well as a number of recent WRC cases involving the unchecked use of AI. In Fernando Oliverira v Ryanair [2025], the Adjudication Officer noted that the Complainant's "submissions were rife with citations that were not relevant, mis-quoted and in many instances, non-existent", and that "a considerable amount of time" was wasted in seeking to establish the veracity or otherwise of legal citations.
Notably, the Labour Court and WRC guidance states that parties may wish to reference the fact that AI was used in the preparation of submissions, as this "promotes transparency and helps the WRC understand [sic] your submission". This principle of optional disclosure differs from the mandatory obligation established by the Court of Appeal in the Guerin judgment, to "expressly inform both the other parties and the court of their use of AI".
Key takeaways
Recent court guidance and case law demonstrate that the misuse of AI in litigation is currently a cause of judicial concern. While the Court of Appeal in Guerin v O'Doherty did not specify the sanctions likely to be applied by the court in instances of misuse of AI, it is clear that such conduct will, at the very least, undermine a party's credibility, increase legal costs and may contribute significantly to a negative case outcome.
There is now an express obligation on all parties before the Court of Appeal to disclose the fact that AI has been used in the preparation of legal submissions. While it remains to be seen whether other Irish courts will mandate the disclosure of AI use, it may be best practice for parties to litigation to put in place appropriate safeguards and ensure compliance with the stricter regime set out by the Court of Appeal.
For more information, please contact Aisling Duffy, Brónagh Carvill or your usual contact in Beauchamps LLP.