Many consider 2023 to have been the breakout year for artificial intelligence (“AI”).  We are told that AI is here to stay and will soon affect our lives to an ever-increasing degree.  A recent First-tier Tribunal (FTT) Tax Chamber judgment is one of the first judicial decisions in the UK to consider the risks of relying on AI in a legal case.


In the case of Harber v Commissioners for His Majesty’s Revenue and Customs, Mrs Harber disposed of a property but failed to notify HMRC of her liability to Capital Gains Tax.  HMRC therefore issued a failure to notify penalty, which Mrs Harber appealed.  The basis of her appeal was that she had a “reasonable excuse”, namely her mental health condition and/or that it was reasonable for her to be ignorant of the law.


At the hearing, Mrs Harber, who was not legally represented, provided the FTT with nine FTT decisions which appeared to support her appeal but which did not appear on the FTT website or on any other legal website.  Mrs Harber claimed that the cases had been provided to her by “a friend in a solicitor’s office”, and she said that she did not have the full judgments or the FTT reference numbers.


After reviewing the cases carefully, the FTT concluded that all the authorities which had been submitted by Mrs Harber were fabricated and had been generated by an AI system such as ChatGPT.  The reasons for the FTT’s finding were:


  • None of the cases relied upon by Mrs Harber were on the FTT’s website or on any other legal website.


  • The cases were outwardly plausible but incorrect. In every one of the fictional cases, the FTT was said to have found in the appellant’s favour, whereas in the real cases to which the facts appeared to relate, the appellant either lost the appeal or the penalty had been charged for a different failure to notify, not a failure to notify liability to Capital Gains Tax as in the present case.


  • There were stylistic and reasoning flaws in the fabricated case summaries: for example, six of the nine cases used the American spelling “favor”, and identical phrases were repeated in the summaries across all the cases.


Unsurprisingly, having relied on fabricated cases, Mrs Harber lost her appeal.


Although the FTT accepted that Mrs Harber was unaware that the AI cases were made up, and that she did not know how to locate or check the authorities on the FTT website or other legal websites, it nevertheless highlighted the harm that can be caused by seeking to rely on fictitious cases.  Doing so forces opponents to incur time and expense verifying the cases, wastes court time on unnecessary arguments and submissions, and has the potential to undermine trust and confidence in the judicial system.


This case should serve as a warning to litigants, and indeed to lawyers, of the risks of relying on information obtained from AI systems in the preparation of legal cases.


If you are involved in any civil or commercial dispute, then Blacks Solicitors can assist. Please call our Dispute Resolution team on 0113 207 0000.