The fact that NHS England still considers Palantir an appropriate partner raises serious questions about the organisation's integrity.
The multiple contracts awarded to Palantir over recent years have brought with them allegations of favouritism by NHS executives, backdoor meetings, donations to the Conservative party, and ministerial directives being used to override patient confidentiality rules, as well as Palantir co-founder Peter Thiel's own admission that the company is "buying its way in" to the NHS.
Patients and campaigners have been raising the alarm about Palantir's creeping involvement with the NHS for years, on grounds of ethics, outsourcing, and privacy. The tech company has a long and controversial history of supporting predictive policing, deportations, state surveillance, and drone strikes in Iraq and Afghanistan.
NHS England risks further losing the trust of health workers, patients, and the public if it continues with this contract with Palantir. On 3 April 2024, more than 100 health workers, patients, and allies picketed the offices of NHS England to demand that the contract be cancelled.
Outrage from health workers, patients, and the public will only grow as further atrocities are committed by the IOF.
If NHS England is to recover its reputation and maintain public trust in health data systems, it must cancel the contract with Palantir.
Facial recognition and wider police surveillance issues
While separate from Palantir's known involvement in prisons, the growing use of facial recognition by UK police forces raises related concerns.
- Bias and misidentification: Studies have repeatedly shown that facial recognition technology is less accurate in identifying people of colour, raising the risk of biased policing and wrongful arrests.
- Privacy invasion: The police's use of live facial recognition in public spaces scans millions of people, a practice that campaigners say violates fundamental privacy rights.
- Unlawful use and lack of regulation: In August 2025, the Equality and Human Rights Commission (EHRC) criticised the Metropolitan Police's use of live facial recognition, arguing it breached human rights law. Civil liberties groups say regulation has not kept pace with the technology's deployment.
Calls for action
Civil liberties organisations are actively campaigning against Palantir's expansion and the wider use of facial recognition technology.
- Protests: Activists have organised protests targeting Palantir's headquarters and highlighting its use of AI technology.
- Legal challenges: Legal action has been initiated against police forces using the technology, challenging its lawfulness.
- Parliamentary action: Cross-party coalitions of politicians, rights groups, and racial justice organisations have called for an immediate halt to live facial recognition surveillance.
Campaigners have called for a halt to Palantir's involvement in UK prisons and the wider justice system, citing concerns about ethics, human rights, and the use of data analytics for predictive purposes. While there is no evidence that Palantir is directly deploying facial recognition technology in UK prisons, it has been lobbying the Ministry of Justice (MoJ) to apply its data analytics to the prison system. The wider expansion of government use of surveillance technology, including facial recognition, is a separate and significant source of controversy.
Palantir's known involvement in the UK justice system
- Predictive risk modelling: In late 2024, reports revealed that Palantir was in discussions with the MoJ about calculating prisoners' "reoffending risks" using data analytics. This was part of a larger lobbying effort by the tech company targeting UK government ministers.
- Wider police surveillance networks: In mid-2025, it was uncovered that Palantir was working with police forces across the East of England to build a real-time data-sharing network. The network pools personal data on vulnerable victims, children, and witnesses alongside suspects, sparking outrage from civil liberties groups.
- Government-wide ambition: The company has also attended meetings with the MoJ where other tech firms suggested invasive surveillance methods, including subcutaneous tracking of offenders.
Concerns surrounding Palantir and predictive technology
- Human rights abuses: Critics, including Amnesty International, warn that data-driven and predictive policing technologies from companies like Palantir raise severe human rights concerns. Palantir has faced backlash for its work with Immigration and Customs Enforcement (ICE) in the US, and over allegations that its targeting software has been used in strikes in Gaza, including strikes that killed aid workers.
- Ethical breaches and unaccountability: Campaigners argue that the lack of regulation and oversight over private tech companies like Palantir creates loopholes for government agencies to bypass safeguards. Critics also express alarm at Palantir's history of secrecy and perceived contempt for human rights.
- Risk of systemic error: The Post Office Horizon scandal, a miscarriage of justice caused by faulty software, is cited as a warning about the potential dangers of relying on algorithmic systems without adequate checks and balances.