A few bad robots

12 December 2023

In what appears to be a first for a tax case, a taxpayer has sought to rely on information that was determined by the court to have been produced by artificial intelligence (AI) software. In Felicity Harber v HMRC, Mrs Harber had failed to notify HMRC of a liability to capital gains tax (CGT) and was issued with a penalty of £3,265. Mrs Harber appealed the penalty on the grounds that she had a reasonable excuse: she was suffering from a mental health condition at the time, and it was reasonable for her to have been unaware of the law.

Ahead of the hearing, Mrs Harber prepared a document, with the help of a friend, that referenced nine other First-tier Tribunal (FTT) cases seemingly supporting her position. These cases appeared to describe other situations in which a taxpayer had successfully established a defence of reasonable excuse on grounds of either ignorance of the law or mental health.

There was however a gremlin in the system: all the cases appeared to be made up. 

In her preparation for the case, Mrs Harber had been assisted by a ‘friend in a solicitor’s office’ and could provide only summaries of the nine cases she cited. When HMRC and the judges were unable to locate the full texts of the cases, Mrs Harber conceded it was ‘possible’ that they had been generated by an AI system such as ChatGPT. In response to this line of questioning, Mrs Harber contended that ‘she couldn’t see that it made any difference, as there must have been other FTT cases’ that supported her position.

In a world where information is available at the touch of a button, it has also never been easier to be misinformed. As was noted in the case, ‘providing authorities which are not genuine and asking a court or tribunal to rely on them is a serious and important issue’.

What should have been a relatively straightforward decision based on the law appears to have been made much more complex, as the FTT had to determine whether Mrs Harber’s cases were genuine. The difficulties are summarised in a previous judgment in the US in which lawyers relied on fake caselaw generated by ChatGPT:

‘Many harms flow from the submission of fake opinions. The opposing party wastes time and money in exposing the deception. The Court’s time is taken from other important endeavors. The client may be deprived of arguments based on authentic judicial precedents.’ 

This is likely to be the first of many occasions on which AI makes its way into the courtroom. With information so readily accessible, it will be increasingly tempting for taxpayers, like Mrs Harber, to avoid seeking professional advice from a solicitor or accountant on the basis that it is ‘expensive’. At a minimum, though, taxpayers should not put all their faith in machines and need to back up any AI-generated material with their own research. HMRC may be pushing more taxpayers towards its digital services, but clearly a human touch is still needed.