
Impact of artificial intelligence on tax disputes

25 January 2024

In the modern world, society is always looking for improvements and automation to, supposedly, reduce costs and increase speed (indeed, the author is indebted to the artificial intelligence (AI) system that suggested the title for this article). This applies no less to the world of tax (and other forms of) litigation, where adviser and legal fees can be difficult to stomach and the prospect of a brainy robot efficiently cutting to the chase may hold some appeal. A recent tax case, however, will have heartened AI doom-mongers who argue that reliance on AI is not always the smartest solution.

Felicity Harber (TC9010) is a decision of the First-tier Tribunal (Tax Chamber) from December 2023 relating to a penalty for a failure to notify a liability to capital gains tax (CGT). The taxpayer appealed the penalty on the basis that she had a reasonable excuse because of her mental health condition and/or because it was reasonable for her to be ignorant of the law.

Mrs Harber acted as a litigant in person in taking on HMRC. This means that she did not rely on any professional legal support and instead opted to argue the case herself in the tribunal – including filing all pleadings and providing her own oral submissions before the judge. Explaining that choice, Mrs Harber said she had not taken any ‘professional advice from an accountant or solicitor about the CGT position, because this would have been “expensive”’.

AI faux pas
Instead, as part of her preparation for the hearing, Mrs Harber relied on AI (assumed to be ChatGPT) to assist in pulling together the strongest arguments against HMRC’s position. Her submissions quoted a number of judgments containing useful and relevant points to support her argument that the penalty (for failure to notify her liability to CGT following the disposal of a property) should be withdrawn.

Unfortunately, however, the appellant inadvertently relied on nine non-existent tribunal cases to support her arguments. She also failed to provide a copy of her arguments – which relied on the fictitious cases – to HMRC when she filed them with the tribunal some months before the hearing.

The tribunal, thankfully, ‘accepted that Mrs Harber had been unaware that the AI cases were not genuine and that she did not know how to check their validity by using the First-tier Tribunal website or other legal websites’.

Understandably, the tribunal clarified that such an error is not harmless as it ‘causes the tribunal and HMRC to waste time and public money, and this reduces the resources available to progress the cases of other court users who are waiting for their appeals to be determined’.

The tribunal also referred to the US case of Mata v Avianca 22-cv-1461(PKC), in which two lawyers sought to rely on fake cases generated by ChatGPT. Indeed, in rejecting Mrs Harber’s argument that the submission of incorrect judgments made no difference to the case, the tribunal agreed with Judge Castel’s view in Mata that ‘many harms flow from the submission of fake [judgments]’, including the wasting of the opposing party’s and the court’s time, the harm to the reputation of the judges and courts to whom ‘bogus’ judgments are falsely attributed and, more importantly, the fact that it ‘promotes cynicism about the legal profession and the … judicial system’.

Lucky escape
Although Mrs Harber was unsuccessful in her appeal, it is easy to imagine that the consequences would have been much more severe had she not been acting as a litigant in person – a wasted costs order under Rule 10(1)(a) of The Tribunal Procedure (First-tier Tribunal) (Tax Chamber) Rules 2009 would probably have been high up on HMRC’s wish list at the end of the hearing.

Equally, Mrs Harber should be thankful that the tribunal was alive to the possibility of this sort of issue arising. Indeed, the tribunal referred to the Solicitors Regulation Authority’s commentary on the use of AI systems, which stated:

‘All computers can make mistakes. AI language models such as ChatGPT, however, can be more prone to this. That is because they work by anticipating the text that should follow the input they are given, but do not have a concept of “reality”. The result is known as “hallucination”, where a system produces highly plausible but incorrect results.’

Although it may seem difficult to spot what would amount to a ‘hallucination’ when using such AI models for legal research, it is submitted that this ability comes with practical experience of (tax) law which, at least for a little while longer, remains of paramount importance.
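By way of illustration only, the toy sketch below mimics in miniature how such a model behaves: it samples a plausible next word given the words so far, and at no point consults any authoritative record of real cases. It is entirely hypothetical – every name, weight and output is invented, and it has nothing to do with the judgment or the SRA guidance – but it shows why the output can look convincing while being pure fiction.

import random

# Toy "language model": for each context word, a list of plausible next
# words with weights. Real AI models do this at vastly greater scale, but
# the principle is the same - they predict likely text, not true text.
# All party names, years and numbers below are invented for illustration.
NEXT_WORD = {
    "<start>": [("Smith", 3), ("Jones", 2), ("Brown", 1)],
    "Smith": [("v", 1)],
    "Jones": [("v", 1)],
    "Brown": [("v", 1)],
    "v": [("HMRC", 4), ("Commissioners", 1)],
    "HMRC": [("[2020]", 1), ("[2021]", 1), ("[2022]", 1)],
    "Commissioners": [("[2019]", 1)],
    "[2019]": [("UKFTT", 1)],
    "[2020]": [("UKFTT", 1)],
    "[2021]": [("UKFTT", 1)],
    "[2022]": [("UKFTT", 1)],
    "UKFTT": [("123 (TC)", 1), ("456 (TC)", 1), ("789 (TC)", 1)],
}

def fake_citation() -> str:
    # Walk the table from "<start>", sampling a weighted next word each
    # time until no continuation exists. Nothing here checks whether the
    # resulting citation refers to a real case - which is the point.
    word, output = "<start>", []
    while word in NEXT_WORD:
        options, weights = zip(*NEXT_WORD[word])
        word = random.choices(options, weights=weights)[0]
        output.append(word)
    return " ".join(output)

for _ in range(3):
    print(fake_citation())  # e.g. "Smith v HMRC [2021] UKFTT 456 (TC)"

Each printed line looks like a genuine tribunal citation, yet none exists – which is precisely the trap into which Mrs Harber’s research fell.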

Underlying reasonable excuse
In terms of the underlying case itself, in order to consider whether there was a reasonable excuse, the First-tier Tribunal applied the fairly clear principles set out in Christine Perrin v CRC [2018] STC 1302. Broadly, this meant it had to:

  • establish the facts the taxpayer asserts give rise to a reasonable excuse;
  • decide which of those facts are proven;
  • decide whether, viewed objectively, those proven facts do indeed amount to an objectively reasonable excuse; and
  • having decided when any reasonable excuse ceased, decide whether the taxpayer remedied the failure without unreasonable delay after that time.

In the tribunal’s view, the reasonable taxpayer in Mrs Harber’s position, who had previously obtained advice from HMRC and who knew she had made a capital gain, would have contacted HMRC, TaxAid, an accountant or a lawyer to find out what she needed to do. Further, although Mrs Harber suffered from anxiety and panic attacks, this had not stopped her taking a number of steps to sell her property – liaising with agents and conveyancing solicitors, and dealing with lodgers – so it followed that her condition should not have prevented her from contacting any of the above, who might have been able to assist her in reporting her capital gain.

To that end, it was held that the taxpayer’s ignorance of the requirement to notify her liability was not objectively reasonable.

Wider application to tax practitioners
With any luck, improvements to AI systems and a greater understanding of the risk of ‘hallucination’ will mean that cases such as this are confined to history, to be wheeled out every so often at tax conferences as a warning to those preparing for tribunal hearings. Indeed, had the taxpayer used advisers for the appeal at least, one would expect them to have spotted the errors. Even if the errors had somehow still made their way into the taxpayer’s pleadings, the adviser would have been expected to file and serve documents properly and on time, so HMRC would have spotted them well in advance of the hearing and they might never have made their way into a decision at all.

Perhaps the more relevant point for us all is understanding the inherent risk of similar mistakes creeping into other areas. In the modern world, where we increasingly work away from our colleagues and are under mounting pressure to turn work around quickly, people may be more likely to rely on AI and computer-based research when looking into niche points (rather than hard copies, for example). In such scenarios, it is not so unlikely that a well-intentioned tax practitioner could also be misled into relying on a ‘hallucination’.

Unlike in Harber, where the tribunal said ‘providing fictitious cases in reasonable excuse tax appeals is likely to have less impact on the outcome than in many other types of litigation’, the link between erroneous AI-driven research and sanction may be stronger elsewhere. For example, when submitting something to HMRC, all sorts of penalty implications might come into play, and one would not simply be able to say ‘I relied upon AI’ as some sort of viable defence.

This is important to understand for those tempted to resort to such quickly and easily available resources, even if dealing with ‘just’ an enquiry letter from HMRC. For the vast majority who have a good understanding of tax and deal with it on a day-to-day basis, the risks will be obvious; but for those who are less experienced, this case encapsulates the potential pitfalls. It is fair to say that HMRC would be unlikely to treat such a shortcut as kindly as the tribunal treated a litigant in person in this case.

Thankfully it seems (for the author’s sake at least) that, despite the wider use of AI in a number of industries, the death knell for tax advisers and lawyers is not yet ringing.

This article was first published in Taxation on 16 January 2024.

Further information

For further information on the issues raised in this blog, please contact Waqar Shah or any member of the Dispute Resolution team.

About the author

Waqar Shah is a Partner in the Dispute Resolution department, focusing on the resolution of complex tax matters. He acts for high net worth individuals and corporate clients across all sectors in respect of HMRC disputes and investigations across the full range of taxes. This typically includes VAT disputes, employment tax matters (including 'IR35'/off-payroll working), customs/excise duty issues, tax fraud investigations, and more recently, National Minimum Wage enquiries.

 

 
