
A utility first approach to using technology in policing

In the first of a new series, Catherine Levin talks to strategic leaders about key issues facing them. Starting with the Chair of the National Police Chiefs’ Council, Martin Hewitt, they explore the opportunities that technology brings to improve policing in the UK.

Words: Catherine Levin, Editor, Emergency Services Times

Two major reports into policing, from Her Majesty’s Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS) and the Police Foundation, are critical of how agile policing is in responding to and making use of technology, but Martin Hewitt is rather more optimistic. He explains, “There is no doubt that technology has transformed policing but at the same time it has transformed how crime operates, created new types of crime and made it completely borderless.”
He says that the public is wary of the way that policing uses technology, and that it is right for the police to use it in a controlled and legal way to retain public trust and confidence. “The HMI report is right in that technology has changed our operating environment, but we must be realistic that we are never going to completely keep up.”
While technology moves on apace, there may be a temptation to adopt new approaches and applications, but Martin urges caution. “We won’t ever be cutting edge in our use of technology.” He says that utility must come first: if a technology doesn’t improve the way that policing works, he’s not interested.
Martin refers to a recent report from the House of Lords Justice and Home Affairs Committee, Technology rules? The advent of new technologies in the justice system, which investigated how artificial intelligence (AI) is being used by the police. The authors express some alarm at what they found: “We were taken aback by the proliferation of artificial intelligence tools potentially being used without proper oversight, particularly by police forces across the country.”
This 95-page report is a compelling read, setting out the challenges faced by policing when it comes to making informed decisions about new technological approaches to solving problems. Martin says that there are certainly benefits for reducing the amount of time taken on processes like looking through thousands of hours of CCTV images to find ‘the man with the red trousers’, as he puts it.
Using AI to process large amounts of data to find the proverbial needle in a haystack appears on the face of it to be uncontroversial, but when it comes to using AI for prediction, the landscape changes. Martin explains, “Once you start using technology to predict outcomes, you have to think about ethics and bias and how we manage that.” It’s an area that policing is looking at, but it is early days.
The NPCC has a National Police Ethics Group led by CC Richard Lewis from Dyfed-Powys Police and within it there is specific activity on digital and data ethics. Martin explains, “At the moment we are focused on the ethical use of cameras, but we are moving on to think about data more generally and especially in how we use algorithms in policing.”
Algorithms – the rules by which technology operates – can bake in bias, and understanding what those rules are and what that bias might be is an important consideration for policing when it looks at using AI to predict outcomes. There is early work going on in the City of Amsterdam to create a register of algorithms used in public sector applications; by setting out the rules and the approach, and by being transparent, the public can see why decisions are being made.
Martin says there is some interest in this approach in UK policing and points to the work of the NPCC’s Chief Scientific Advisor, Dr Paul Taylor, whose role is to push forward innovation in this space. Later in the summer, the NPCC will look to approve its Science and Technology Strategy, and there is parallel work going on to produce an AI strategy as well.
Martin is enthusiastic about the Home Office-led Accelerated Capability Environment (ACE), which has created an Impact Lab to quickly solve digital and data problems faced by policing. He says, “This is a structured and dynamic way of bringing the problems that police face, using live examples.”
In the last Impact Lab, over 40 businesses, academics and police came together to look at how to improve investigations into rape and serious sexual offences. Martin explains that businesses get to work with real data to develop solutions to the problems, testing and iterating them before returning to make a Dragon’s Den-style pitch to the commissioning police forces.
He thinks that improving how policing uses technology like AI requires a shift in thinking about its relationships with the private sector, and that the police need to trust their commercial partners more. The Impact Lab approach is one way to build this trust, and Martin is optimistic about the benefits it will bring in the long term.
Where there are common technology problems, this seems ripe for a multi-agency approach, but Martin says that complexity may be an inhibitor here. He highlights how the response to COVID over the last two years meant that police worked much more closely with local partners in the Local Resilience Forum. “We’ve got a better understanding now of each other’s demands and capabilities. There are some common challenges and questions.”
Looking to the future, Martin uses the phrase ‘lawfully audacious’ and expands on this point. “We need to be audacious. We know that technology is there and can help us keep people safe; we’ve got to do this in a lawful, thoughtful and considered way so that we can do the protecting, but we don’t lose public trust and confidence along the way.”

www.npcc.police.uk
This was first published in the June 2022 issue of Emergency Services Times, in a new feature called Perspective.