Thank you – When I took up this role last March I hadn’t appreciated the enormous breadth and depth of this subject matter – if I had, I’d have asked for more money.
The challenges in biometric surveillance are vast and the speed at which they’re arriving is dizzying. My statutory remit covers only policing and local authorities, and that’s big enough. In trying to make some sense of the issues I’ve found it helpful to consider them from three viewpoints: the Technological – what’s possible; the Legal – what’s permissible; and the Societal – what’s acceptable. This event is very much focused on the technological – what CAN be done – but unless you deal with the other two – particularly what the public will support or even tolerate – your market will be pretty volatile.
Today I thought I’d start with a look at the changing context in which the surveillance of public space is being played out – sometimes loudly and controversially, even emotionally – and highlight some key developments that we’re witnessing - maybe even find some evidence for what I believe is needed in the future. I should say at the outset that these are my own views and opinions, I’m not responsible for the policy of the government – something for which I suspect we’re both grateful.
Let’s start with an observation -
Public space surveillance is no longer about where you put your camera – it’s about the purposes to which you’re going to put the billions of available images captured on anybody and everybody’s camera
Keep that in mind as we go through the slides and I’ll explain why I believe that has changed everything we thought we knew about surveillance
In public cameras per head, London is now ranked the 3rd most surveilled city on Earth (73.5 cameras per 1,000 people); in cameras per square mile, the figure is nearly 1,200, raising the capital to 2nd place. Add in mobile platforms like drones and wearables and it gets more speculative – and when the commercial systems watching our transport hubs, shopping centres and other public places are factored in, it’s beyond anyone’s guess, but the number will have increased since I put up the first slide.
Take just one surveillance system, our national Automatic Number Plate Recognition (ANPR) system. ANPR already claims more hits per second than Instagram.
On its current trajectory it will reach 100 million reads per day by 2023 – that’s a billion reads in under two weeks.
You may think a misread rate of just 0.1% (1 in 1,000) is pretty good for a surveillance detection system – for ANPR that equals 100,000 misreads per day.
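For those who like to check the arithmetic, the scale claims above work out in a few lines. This is just a back-of-envelope sketch using the projected 100 million reads per day and the 0.1% misread rate quoted a moment ago:

```python
# Back-of-envelope check of the ANPR scale figures quoted above.
# Assumes the projected 100 million reads per day and a 0.1% misread rate.

reads_per_day = 100_000_000   # projected daily ANPR reads
misread_rate = 0.001          # 0.1%, i.e. 1 misread in every 1,000 reads

# How long to accumulate a billion reads at that rate?
days_to_billion = 1_000_000_000 / reads_per_day

# How many misreads does a 0.1% error rate produce each day?
misreads_per_day = reads_per_day * misread_rate

print(days_to_billion)   # 10.0 days – "a billion reads in under two weeks"
print(misreads_per_day)  # 100000.0 – roughly 100,000 misreads per day
```

Ten days to a billion reads, and a hundred thousand misreads every single day: that is the scale at which even a very small error rate becomes a very large number of mistakes.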
Those numbers are truly astronomical, BUT what is equally stellar is the fact that ANPR has no legal underpinning, no dedicated oversight body, and is essential to many agencies other than the police. And it’s not just reading your number plate. I’m not sure how aware the public are of any of this.
But they should be – for a few reasons.
One reason is that, just like the tech, the surveillance relationship with the public is also changing fast – for example, policing no longer relies solely on images of the citizen, but increasingly on images from the citizen. We know that people are more willing to share their content in some situations – or for some purposes – than others. When that sharing becomes a critical dependency, this is going to matter – a lot.
Why? Again there are a number of reasons. The development of surveillance is systematic, which means a systemic approach is needed: if one part of the system surveilling public space uses what people think is untested technology, untrusted processes or unethical partners, the citizen may be less inclined to help when another part of the system needs them.
Another reason is that some citizen-generated data may be unreliable, deliberately misleading or even maliciously shared.
And relationships aren’t just one-way. The police themselves are under greater surveillance globally – I’m not just talking about the trend of police activity becoming street theatre for social media – I mean the much more sinister use of surveillance technology to identify individual officers.
The use of facial recognition capability BY the police has attracted a lot of attention and controversy – but police surveillance runs both ways too. BBC research into hacktivism and the doxing of police records shows how cyber activists in Belarus managed to get photos from officers’ personal files and then ran facial recognition comparisons against internet images of those same officers reportedly beating protestors. The hackers then identified the officers and revealed where they lived. This story also illustrates some other risks of internet scraping and piecing together OSINT: the citizen now has access to surveillance tools that only a decade ago were restricted to state agencies. Last year France even began banning the photographing of police officers for this reason. That will have an impact on things like journalism, transparency and accountability – and of course journalists too are very vulnerable to these same citizen surveillance tactics, as are victims and witnesses and so on. Unlike Scotland, we don’t class facial images as biometrics, and so most of this is unregulated.
In a world where public surveillance of faces is totally unregulated, natural visibility may become naked vulnerability: everyone swaps their COVID masks for total face coverings – including the police – and there are no public faces left to recognise.
Do we want our neighbourhood police to look like this Mexican officer [pic], whose greatest vulnerability is being recognised while at work and then killed while off duty?
The more intrusive unregulated surveillance becomes, the greater will be the need to hide the identity of those who believe they’re vulnerable – including the police and public servants - something that would have profound implications for our way of life.
This is of course an extension of the Chilling Effect – and anyone who still thinks biometric surveillance is just about data protection hasn’t been paying attention.
At the same time, there is a clear and legitimate role for the ethical and accountable use of biometric surveillance technology in tackling crime – and that is also changing fast. New capabilities are helping law enforcement agencies, and in some areas – like online child sexual exploitation – new technology is probably the only effective solution.
Let’s take a closer look then at crime and criminality - In his final report as HM Chief Inspector of Constabulary, Sir Tom Winsor says this:
“Anyone who knowingly and deliberately creates or tolerates the conditions in which crimes are committed and victims are isolated from protection and justice should be given the most potent grounds to fear the criminal law, operated and applied vigorously by the law enforcement institutions of the state.”
It’s hard to disagree with that. And the legitimate role of biometric surveillance in that “vigorous application of the criminal law” is growing – both in its scale and importance.
Tom Winsor goes on to say that policing needs “a material intensification of a partnership with the private sector that is soundly and enduringly based on trust and common interest.”
Far be it from me to embellish the HMCI’s pronouncements - but if you add The Citizen to that statement - to that partnership - you have a near perfect description of what I believe we need in biometric surveillance.
Whether it’s citizen-shared images, ANPR, AI and machine learning, facial recognition or rapid DNA profiling, strong and successful partnership with the citizen and the private sector is already critical for biometric surveillance in policing and public services.
If those partnerships are NOT “soundly and enduringly based on trust and common interest” we are all in trouble.
Which is one reason why we CANNOT partner with companies involved in the camps spread across Northern China, that are designed, built and operated by state-owned companies for perpetrating the eradication of Uyghur Muslims.
Where’s the evidence that they’re doing this? The Commons Foreign Affairs Committee is one source – and also the Uyghur Tribunal, which found last December that “estimated numbers in excess of a million people have been subjected to acts of unconscionable cruelty, depravity and inhumanity.”
The judgment found that “Many of those detained have been tortured - detained men and women have been raped - one was gang raped by policemen in front of an audience of a hundred people all forced to watch [while others] were raped by men paying to be allowed into the detention centre for the purpose.”
Now, if that isn’t “knowingly and deliberately creating conditions in which crimes are committed and victims isolated from protection and justice” then I don’t know what is.
These places, and the police officers who work there, rely heavily on biometric surveillance – state-of-the-art surveillance – designed and operated by state-run surveillance companies that sell the same biometric surveillance systems across the UK. They may be here today.
How could, for example, a police partnership in the UK with a company that designed, built and operated these places be described as “a partnership with the private sector that is soundly and enduringly based on trust and common interest?” as prescribed by Her Majesty’s Chief Inspector of Constabulary? It can’t.
Someone asked me the other day: “Can you imagine a member of the UK Chinese community approaching a police officer for help, only to find themselves looking into the lens of the same body-worn device used by officers to do these things to Uyghur Muslims in these places?”
Sadly, I can imagine it. Some people in local authorities tell me that they have to do business with these companies because the government hasn’t banned anyone in the way that the USA has done – or published a list of things that are not allowed. Well, if you need a piece of paper from the government saying why genocide is unethical, I’m sure we can find something. The UN Guiding Principles on Business and Human Rights, perhaps.
Look, I get it that ethical leadership can be difficult, especially when there’s money involved, and not everyone’s up to it – but come on. On any view, having our police or local councils partnering with organisations that have this on their CV is simply wrong.
As for the police, in this country policing uses the National Decision-Making Model which – according to the College of Policing - puts ethics at the heart of every decision. Buying surveillance systems from these companies is a decision and it isn’t ethical; nor is expecting officers and staff to wear them.
In my view “those who knowingly and deliberately tolerate conditions in which crimes are committed and victims isolated from protection and justice have no place in partnerships with the police or any other institution of the State.”
MP Alicia Kearns said very recently that 'We urgently need to get a grip on biometric surveillance technologies in this country, and I expect the police to launch an immediate review of surveillance procurement policies to rip out those complicit in slavery and human rights abuses.'
So do I and I’ll be issuing some formal advice to BOTH police and local authorities very shortly.
For the reasons I’ve set out already, the importance of partnerships in biometric surveillance cannot be overstated – they are going to be critical, not just in maintaining public trust and confidence, but also functionally.
One of the most compelling practical reasons why we need strong partnerships in this area is the emerging reliance on Aggregated Surveillance Capability
As I said at the start, public space surveillance is no longer about where you put your camera. Investigations need extracts from high street CCTV, but also image captures from dashcams, GoPros, ring doorbells, phones, shed cams, drones and car parks. Increasingly we are dealing with the product of an aggregated surveillance capability made up of almost infinite sources, nearly all of which will be privately owned and operated.
When it needed a human to do the editing and compilation, no one was going to live long enough to pull this data together. That would have been like manually size-ordering every grain of sand in the Sahara.
Thanks to technological advances in video surveillance capability, those grains of data can now not only be sorted and categorised – they can also be compared with the grains from every other known desert.
Let’s look finally at regulation and the implications of all this.
In the context we’ve just seen it makes no sense at all to regulate only the camera systems operated by police and local authorities.
Regulation should reflect the situation you’re trying to regulate - ours doesn’t at the moment
Where’s surveillance regulation going? Well, the short answer is probably to the Information Commissioner – but even if it does go there, it still won’t reflect the situation we’re trying to regulate.
ANPR alone is now so vast – and so vital – that it needs its own regulatory and governance framework.
And what to do about Live Facial Recognition is the surveillance question that simply won’t go away, in pretty much every jurisdiction I’ve worked with. That alone should tell us something. And if we think the police using it is controversial, wait until schools begin to install facial recognition capabilities.
Whatever regulatory framework we end up with, I believe that we will need legitimate and ethical partnerships with the private sector, partnerships soundly and enduringly based on trust and common interest - which is why I think events like this one are so important. So thank you for inviting me.
And on that note I’ll leave you, if I may, with the following question to take away into the rest of this event:
How far are your surveillance partnerships “soundly based on trust and common interest”, reflecting your own professional values and those of your customers and staff?