LONDON, May 11, 2026, 20:02 BST
Britain’s National Health Service is giving outside workers, including Palantir Technologies Inc. contractors, sweeping access to patient-identifiable data as they build a high-profile health-data platform, the Financial Times reported. The move is reviving a privacy debate over one of the U.S. software company’s most sensitive public-sector contracts. The permissions extend to the National Data Integration Tenant, or NDIT, a staging layer inside the Federated Data Platform where records remain identifiable before they are pseudonymised, that is, stripped of personal details.
Palantir’s timing could hardly be worse. The stock has become a benchmark for investor appetite for AI-powered software across the public and private sectors. Just last week, Palantir posted 85% revenue growth, reaching $1.63 billion, and raised its 2026 sales outlook to a range of $7.65 billion to $7.66 billion. On Monday afternoon, shares slipped 1.1% to $136.27 on the Nasdaq.
The NHS deal matters because Palantir’s expansion hinges in part on public-sector contracts that aren’t easily swapped out, and in those deals trust counts as much as software quality. Palantir’s core product lets clients unify massive datasets and apply AI to them; in healthcare, that means records, waitlists, scheduling, and a sprawl of behind-the-scenes operational data that patients rely on but rarely interact with directly.
Some MPs and activists warned the decision may fuel unease about the company’s presence in UK public services. Labour’s Rachael Maskell called it a “dangerous development,” the Guardian reported. Martin Wrigley, a Liberal Democrat MP, argued that data privacy does not appear to be the project’s “first concern.”
According to NHS England, Palantir acts as a “processor” under data law: it handles information strictly on instructions from an NHS user, while a designated “controller” determines the purposes and means of data use. NHS officials say access depends on user role and task, and the data itself stays inside the UK. Under its contract, Palantir is barred from commercialising NHS data or using it to train AI systems.
Palantir leads the consortium that landed the contract back in November 2023, a deal pegged at £330 million across seven years and potentially spanning 240 NHS organisations. According to NHS England’s contract explainer, just the first three years are locked in, with a review of the initial term set for March 2027.
Palantir pitches the NHS platform as a solution for linking up siloed health-service data, promising quicker access to vital information for clinicians, analysts, and operations teams. The company, in a blog post last month, described the system’s federated approach: each NHS organisation operates its own segment, retaining full control over local data.
Palantir’s U.S. operations continue to drive results. First-quarter U.S. commercial revenue surged 133% to $595 million, and U.S. government revenue climbed 84% to $687 million. CEO Alex Karp called the U.S. business “erupting” in his letter to shareholders, Reuters reported.
Palantir is seeing rivals edge into its territory. OpenAI announced Monday it’s launching a deployment company, pumping in over $4 billion to place engineers directly within organizations and fast-track enterprise AI rollouts. That approach looks a lot like Palantir’s signature, boots-on-the-ground model for embedding its software with big, complicated clients.
Palantir’s enviable run in the public sector and its lofty share price are now getting squeezed from two directions: mounting privacy questions tied to its government work, and fresh doubts about its valuation on Wall Street. Jefferies’ Brent Thill, in comments picked up by TipRanks, argued Palantir’s business remains solid but warned the stock demands what he called a “heroic durability assumption.” His take: risk versus reward is still “unfavorable.”
The NHS dispute isn’t about losing contracts; it’s about control. Palantir insists it won’t use patient data beyond what its customers instruct. Critics push back, arguing that before sensitive health records are processed in systems involving private AI contractors, the public deserves stricter boundaries.