The PSC – News & Insights

01/09/2020
Digital

Let’s talk about algorithms

We’re in danger of scaring ourselves out of progress, because we aren’t having an open and honest discussion about algorithms. Let's discuss how we can use them sustainably.

Councils across the UK are quietly cutting their use of computer algorithms due to concerns about transparency and negative bias, according to the Guardian earlier this week. Coupled with the furore around A-Level results, the use of algorithms in public services is, very understandably, getting a bad name at the moment.


Algorithms don’t tend to hit the news much, but they have silently become a fundamental part of public service provision in recent years – particularly in the criminal justice system, at the Department for Work and Pensions (including Universal Credit), and at HMRC.

The figures back this up. Our analysis of government spending at The PSC found that government procurement of technical support to apply algorithms has increased by an average of 20% each year since 2017.

Algorithms are clearly here to stay, and when used properly they can help government provide consistent and timely access to public services.

So given the rarity of public interest in algorithms, let’s not use this opportunity to entirely demonise them. Let’s seize the chance to demand more transparency about where algorithms govern our lives, and importantly, look at what we can do to make sure they’re used in an ethical way.


What can we do to create a more sustainable approach to algorithms in the public sector?

Our right to explanation

The idea that we have a right to explanation in cases where algorithms are making life-altering decisions seems fundamental. It’s built into the legal system in France, for instance. To harness the benefits of algorithms, whilst protecting ourselves against some inevitable negatives, our right to question them should become commonplace.

Encourage public debate

To encourage public trust and engagement with the widespread use of algorithms in public services, there must be deliberative campaigns covering these issues. In healthcare, for instance, the OneLondon programme, which joins up patient data across health and care, has used wide-scale public engagement events successfully. There's no reason we can't do the same with algorithms.


These are just two ways that we can try to lift the shroud of mystery and suspicion that currently pervades conversation about algorithms. Strong communication needs to sit at the heart of any plan to move forward with algorithm use across the public sector. Let's not just quietly shy away from the challenge.

Authors: Antonio Weiss and Rose Payne
