People with disabilities, those living in poverty, and those with severe health conditions face bureaucratic barriers due to digital exclusion by the Department for Work and Pensions (DWP), Amnesty International said in a new report.
The report, “Too much technology, not enough empathy,” reveals how the DWP’s ongoing testing, launch and rollout of expensive artificial intelligence (AI) and digital technologies across Universal Credit (UC), Personal Independence Payment (PIP), and other social security programmes has created a social security system that is inaccessible to those already marginalized and at risk of poverty in the UK.
Many people who need social security do not have access to digital technology, the Internet, or Internet-connected devices. Affordability problems, combined with language barriers and long waiting times for telephone services, have resulted in digital exclusion from DWP systems.
“At the heart of the DWP’s mission to cut ‘costs’ is a fixation on, and over-reliance on, problematic technology. People are struggling to make a living and put food on the table because of social security cuts, yet the DWP is more concerned with experimental technologies that monitor claimants,” said Imogen Richmond-Bishop, Amnesty International’s Researcher on Technology and Economic, Social and Cultural Rights.
“The systems that deliver and manage welfare benefits are inflicting relentless dehumanization and pressure on those who are already struggling to meet basic needs within a broken system.”
The study builds on Amnesty International’s 2025 report “Social Insecurity: The Devastating Human Rights Impact of the Failure of the UK’s Social Security System,” which detailed how the UK’s social security system requires a complete overhaul to comply with human rights standards and ensure an adequate standard of living. The struggles people face in obtaining social security payments adequate to prevent poverty are intersecting and complex, and technology forms part of a wider ecosystem of social support.
The technological systems that deliver and manage welfare benefits are inflicting relentless dehumanization and stress on those who are already struggling.
Imogen Richmond-Bishop, Researcher on Technology and Economic, Social and Cultural Rights
The research drew on questionnaires, focus group interviews with social security claimants and advisers, and previous work by civil society organisations. Between October 2024 and January 2025, a total of 782 responses were collected.
A perfect storm of existing flaws and new problems
After years of austerity, the use of digital technologies combined with further cuts to the UK’s social security system has created a perfect storm, in which pre-existing flaws are exacerbated and new problems tied to these technologies emerge.
Automated systems used in the assessment and provision of social security risk significant decision-making errors due to biased or discriminatory algorithms, with serious consequences for claimants.
Because automated social security systems cannot always fully capture complex factors such as a person’s living conditions, education, health status, and income level, people can experience digital exclusion.
For one claimant interviewed by Amnesty International, gender and socioeconomic status both represented barriers to her access to services online.
“You know, there’s some form of compassion, you know, making forms and things easier. I mean, I’m illiterate. I mean, a lot of women and men of my age can’t use them. […] So they were stuffed. They sent me letters on my phone. I can’t open them. So I rang. I can’t open it. I don’t have an iPad. I know, I can’t afford an iPad,” the claimant told Amnesty International.
The impact on human rights
Digitization and extensive data collection across the social security system also affect claimants’ privacy rights, data protection, and human dignity.
Using large amounts of data to determine eligibility for state support is nothing new. However, the scale and breadth of the data used, and the speed at which it is processed, are new and can bring unintended consequences and human rights risks.
“The DWP’s experiments with technological systems endanger human rights and reduce people in need to data points. The success of a claim may depend on whether someone fits neatly into a box or meets set criteria, rather than on their actual eligibility. Technology oversimplifies people’s complex realities, especially when they cannot get support from a human being.”
Amnesty International wrote to the DWP ahead of the report’s release, providing a comprehensive summary of the research findings and methodology. The DWP declined to comment on the substance of the report.
The UK authorities must conduct an independent and impartial review of the social security system and the digital systems used by the DWP, and discontinue any that violate human rights. Legislation is needed to regulate AI and ensure it does not cause human rights violations. Digital systems must be transparent, explainable, and never mandatory.
Background:
In May 2025, Amnesty International’s report “Social Insecurity” exposed how cuts, sanctions and systemic failures in the UK’s social security system have pushed people deeper into poverty.
Amnesty International has also conducted research on the automation and digitalization of the public sector in Denmark, the Netherlands, India, and Serbia, and has supported work in France and Sweden on the human rights risks and impacts arising from algorithmic decision-making in these jurisdictions.