HMRC integrating guidance with AI

30 April 2025

Amongst the dozens of measures announced on ‘simplification, administration and reform’ of the tax system on 28 April 2025 was the news that HMRC is set to collaborate more with third parties on its own AI tools.  

The aim is to work with third parties so that HMRC can help them use the huge amount of guidance on the GOV.UK website (around 100,000 web pages of content) in their AI products and services. That in turn could make it easier and quicker for taxpayers to find the information they are looking for, and potentially reduce the burden on HMRC.

Whilst HMRC cannot be criticised for the volume of information it has put into the public domain, some of it is simply impenetrable for the average taxpayer. The move to embrace AI early to try to help taxpayers with the tools they're using is a positive one, provided it comes with a health warning.

AI-enhanced tax research is a fast-developing area. Leading professional service firms, such as RSM UK, have been investing in creating their own AI models, as have dedicated tax content providers. These systems already allow tax professionals to search HMRC guidance and combine the results with their own knowledge bank of past research and specialist sources. Whilst these systems can, and do, aid initial research, they are no replacement for professional judgement and practical experience to separate fact from fiction, and law from guidance. 

AI will undoubtedly play an important role in streamlining the tax system in the future, but taxpayers should be wary of relying on it alone to determine their tax position, at least for some time. Many popular AI systems are built on large language models (LLMs), which have limitations when it comes to answering a complicated tax question.

The answers that an LLM produces are essentially its best statistical prediction of the answer based on the data it was trained on. It is easy to assume an AI tool is self-reasoning or knowledgeable in its own right, as it is often portrayed that way in Hollywood films. That may be the case in the future, but it is not the reality of how LLMs operate, and they will not always give someone the right answer. An LLM might, for example, give a different answer if a question is slightly rephrased. Sometimes, it can repeatedly insist that an answer is correct when it is not, with some persistent persuasion required for it to admit otherwise.

Indeed, an early example of an AI tool's limitations in supporting a taxpayer was highlighted in the case of Felicity Harber v HMRC, which we have written about previously. The taxpayer sought to rely on decisions in other court cases to support her position, but it transpired that none of the cases were real, with the taxpayer conceding it was possible an AI tool was at fault.

Many current AI services are only as good as the data they hold, and given that many individuals are already embracing such technology, HMRC should be applauded for being proactive in collaborating in this space. HMRC's approach may lead to better and more accessible models being created, potentially for direct consumption by taxpayers who may find it all too easy to trust AI-generated content without appropriate review and safeguards. If, or more likely when, this reliance results in an incorrect answer, it remains unclear who will be responsible for the 'advice' given and what protections the taxpayer will have from the AI provider.

It is therefore important that taxpayers understand how best to use AI tools, and do not assume that using them will protect them from potential penalties should they subsequently get things wrong with HMRC. AI can provide valuable support in researching an issue, but taxpayers would be wise not to rely on it to provide the answer itself; it might just make it up.