Contract grants AI firm access to critical financial data in Britain
Palantir’s latest UK agreement provides the AI and data analytics company with access to extensive data from one of the world’s most significant financial hubs.
Palantir expands influence within British government sectors
The Denver-based firm has embedded its technology across key UK institutions: the NHS in 2023, police forces in 2024, and the military in 2025. Following a common tech industry strategy of "land and expand," Palantir has secured contracts exceeding £500 million in value.
In 2026, the company’s new deal with the Financial Conduct Authority (FCA) allows it to analyze vast amounts of data collected by the regulator, granting it an unprecedented perspective on the operations of British authorities. This also provides insight into the City of London, a major global financial center.
Drivers behind public authorities’ interest in Palantir’s technology
Public sector interest in companies like Palantir is fueled by three main factors: the need to optimize human resources amid budget constraints; the accumulation of large datasets due to increased digitization of transactions and communications; and the emergence of AI technologies, supported enthusiastically by the Labour government for their potential to stimulate economic growth.
Having previously employed Peter Mandelson’s lobbying firm, Global Counsel, Palantir has become a significant influence within Whitehall. With revenues of $1.4 billion in the last quarter of the previous year alone, the company attracts top talent and impresses many with its AI-powered data analysis during demonstrations. Although campaign groups criticize Palantir’s collaborations with the US Department of Homeland Security, Immigration and Customs Enforcement (ICE), and the Israel Defense Forces, the company continues to secure contracts.
Palantir’s role at the FCA and the challenge of financial crime detection
Palantir’s specialists will be based at the FCA headquarters in east London, where the regulator is concerned about dedicating excessive resources to investigating financial crime cases that yield limited results. The FCA aims to leverage AI to improve detection of illicit activities, enabling more effective action against serious crimes such as money laundering, which underpins issues like human trafficking and drug trafficking, as well as fraud, which constitutes approximately 40% of all crimes in the UK.
The FCA’s 2025-26 workplan outlines goals to "expand the use of data and intelligence to identify and act on the riskiest firms and/or individuals" and to employ "network analytics to identify harmful networks of firms and/or individuals." However, as AI is increasingly used to detect financial wrongdoing, criminals may develop methods to evade these technologies.
"If the FCA relies on an AI-based detection model, a bad actor could take steps to influence that system when it reviews material," said Christopher Houssemayne du Boulay, a partner and barrister at the law firm Hickman & Rose specializing in serious and complex financial crime.
He explained that criminals might manipulate documents to instruct the AI to disregard incriminating information.
"You can absolutely see that being used in a financial crime context because developments in technological capabilities for good can equally well be exploited by criminals and frequently are exploited very well," he added.
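The "network analytics" the FCA's workplan describes can be illustrated in miniature. The sketch below is purely hypothetical (the firm names, director names, and linking rule are invented for illustration, and nothing here reflects the FCA's or Palantir's actual systems): it groups firms that share registered directors into connected clusters using a simple union-find structure, the kind of basic graph technique that underlies spotting "harmful networks of firms and/or individuals."

```python
# Illustrative sketch only: cluster hypothetical firms that share
# directors, using union-find with path compression.

# Hypothetical (firm, director) registration pairs.
links = [
    ("Acme Ltd", "J. Smith"),
    ("Bolt Holdings", "J. Smith"),   # shares a director with Acme Ltd
    ("Bolt Holdings", "K. Jones"),
    ("Crest Capital", "K. Jones"),   # linked to Bolt via K. Jones
    ("Delta Trading", "L. Brown"),   # no links to the others
]

parent = {}

def find(x):
    """Return the cluster root for x, creating it if unseen."""
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path compression
        x = parent[x]
    return x

def union(a, b):
    """Merge the clusters containing a and b."""
    parent[find(a)] = find(b)

# Every firm-director link merges the firm into the director's cluster,
# so firms sharing any director end up in one component.
for firm, director in links:
    union(firm, director)

# Collect firms by cluster root.
clusters = {}
for firm in {f for f, _ in links}:
    clusters.setdefault(find(firm), set()).add(firm)

for members in clusters.values():
    print(sorted(members))
```

Run on this toy data, Acme Ltd, Bolt Holdings, and Crest Capital fall into one cluster through their shared directors, while Delta Trading stands alone. Real regulatory analytics would of course operate over far richer link types (payments, addresses, beneficial ownership) and at vastly larger scale, but the underlying graph idea is the same.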
The long-anticipated use of AI in combating money laundering
The application of AI to fight money laundering has been anticipated for decades.
"People have talked about using machine learning and earlier forms of artificial intelligence to spot patterns of money laundering since the 1990s," said Professor Michael Levi, an internationally recognized money laundering expert at Cardiff University. "Now that technology is available, we have to make decisions about how to use it, what the risks are."
Professor Levi acknowledged concerns about privacy risks arising from data companies integrating diverse datasets.
"Criminals are also afraid of it [and] also some elites might be afraid, because corporate holdings through shell companies and through real companies with obscured ownership should be part of the target for these kinds of technologies."