
01/09/2020
Digital

Let’s talk about algorithms

We’re in danger of scaring ourselves out of progress, because we aren’t having an open and honest discussion about algorithms. Let's discuss how we can use them sustainably.

Councils across the UK are quietly cutting their use of computer algorithms due to concerns about transparency and negative bias, according to the Guardian earlier this week. Coupled with the furore around A-Level results, the use of algorithms in public services is, very understandably, getting a bad name at the moment.


Algorithms don’t tend to hit the news much, but they have quietly become a fundamental part of public service provision in recent years – particularly in the criminal justice system, at the Department for Work and Pensions (including Universal Credit), and at HMRC.

The figures back this up. Our analysis of government spending at The PSC found that government procurement of technical support to apply algorithms has increased by an average of 20% a year since 2017.

Algorithms are clearly here to stay and, when used properly, can help government provide consistent and timely access to public services.

So, given how rarely algorithms attract public interest, let’s not use this opportunity to demonise them entirely. Let’s seize the chance to demand more transparency about where algorithms govern our lives and, importantly, look at what we can do to make sure they’re used ethically.

 

What can we do to create a more sustainable approach to algorithms in the public sector?

Our right to explanation

The idea that we have a right to explanation in cases where algorithms make life-altering decisions seems fundamental. It’s built into the legal system in France, for instance. To harness the benefits of algorithms whilst protecting ourselves against some inevitable negatives, our right to question them should become commonplace.

Encourage public debate

To encourage public trust in and engagement with the widespread use of algorithms in public services, there must be deliberative campaigns covering these issues. In healthcare, for example, the OneLondon programme, which joins up patient data across health and care, has used wide-scale public engagement events successfully. There's no reason we can't do the same with algorithms.

 

These are just two ways that we can try to lift the shroud of mystery and suspicion that currently pervades conversation about algorithms. Strong communication needs to sit at the heart of any plan to move forward with algorithm use across the public sector. Let's not just quietly shy away from the challenge.

Authors: Antonio Weiss and Rose Payne

