AI is a tool, not a substitute for expertise, MHC lawyers have said in a note on the firm’s website.
The note points to the Workplace Relations Commission's (WRC) publication of guidance on the use of artificial intelligence (AI) tools in preparing written submissions and documents for employment and equality-law cases.
The move follows a highly publicised case involving Ryanair and a former member of its cabin crew, in which an adjudication officer sharply criticised the complainant’s reliance on AI-generated legal arguments.
The case, Fernando Oliveira v Ryanair DAC, drew attention when it emerged that the complainant had used an AI drafting tool to prepare his written submissions.
Non-existent cases
According to the adjudication officer, this resulted in the inclusion of citations that were “not relevant, misquoted and in many instances, non-existent”.
At least two of the decisions cited were found to be AI "hallucinations" – citations to cases that do not exist among the WRC's reported decisions.
In response, the WRC has published detailed guidance aimed at helping parties understand the risks of using AI systems for legal drafting.
Though acknowledging that AI tools may assist in structuring submissions or producing early drafts, the WRC warns that such tools should never be treated as a substitute for legal advice or specialist knowledge of Irish employment law.
Appearance of 'polish'
According to the guidance, most general-purpose AI tools are not trained on Irish employment or equality legislation and lack familiarity with WRC procedures.
As a result, they may produce content that appears polished and confident but does not reflect legal reality. The guidance emphasises that legally inaccurate or misleading information remains the responsibility of the party submitting it – even where that information originated from an AI system.
Another significant concern is data protection and confidentiality. Many popular AI systems store or process user inputs to improve their models, meaning that sensitive personal data or commercially confidential information could be inadvertently disclosed.
The WRC cautions users to avoid inputting such material into public AI platforms.
The commission also stresses that inaccurate citations or irrelevant arguments may cause delay, wasted time, and potential prejudice to the other party – as occurred in the Oliveira case, where both Ryanair and the adjudication officer had to spend time verifying the authenticity of the cited decisions.
Optional disclosure of AI use
In a move aimed at promoting transparency, the WRC has proposed an optional disclosure statement that parties may include in their submissions:
“Parts of this submission were drafted using an AI writing tool. I have reviewed and confirmed the accuracy of all content.”
While voluntary, the WRC says this statement can assist adjudication officers in understanding how submissions were prepared and may help avoid misunderstandings about the provenance of legal arguments.
Consequences of misuse
The guidance follows comments from the High Court in Erdogan v Workplace Relations Commission, in which Mr Justice Simons affirmed that adjudication officers were entitled to ensure that hearings proceeded efficiently by requiring parties to confine themselves to relevant issues.
Submissions containing inaccuracies – whether created by humans or by AI – may be deemed inadmissible, undermine a party's credibility, and weaken its case.
The WRC notes that parties must be able to fully stand over their submissions.
The guidance also offers several practical recommendations for parties preparing submissions.
While AI tools can support litigation preparation, they are not a substitute for understanding the law and applying it accurately, the MHC lawyers say.