Law enforcement tech outfitter Axon has announced that it will include automated license plate recognition in its next generation of dash cams. But its independent ethics board has simultaneously released a report warning of the dire consequences should this technology be deployed irresponsibly.
Axon makes body and dash cams for law enforcement, the platform on which that footage is stored (Evidence.com) and some of the weapons officers use (Taser, the name by which the company was originally known). Fleet 3 is its new model of dash cam; by recognizing plate numbers, it will be able to, for example, run requested plates without an officer having to type them in while driving.
The idea of including some kind of image recognition in these products has naturally occurred to Axon, and indeed there are many situations in law enforcement where such a thing would be useful; automated license plate recognition, or ALPR, is no exception. But the ethical issues involved in this and other forms of image analysis (identifying warrant targets based on body cam footage, for instance) are many and serious.
In an effort to earnestly engage with these issues, and also to not appear evil and arbitrary (as otherwise it might), Axon last year set up an independent advisory board that would be told of Axon’s plans and ideas and weigh in on them in official reports. Today the board issued its second such report, on the use of ALPR.
Although I’ll summarize a few of its main findings below, the report actually makes for very interesting reading. The team begins by admitting there is very little information on how police actually use ALPR data, which makes it difficult to say whether it’s a net positive or negative, or whether this or that benefit or risk is currently in play.
That said, the very fact that ALPR use is largely undocumented is itself evidence that authorities have neglected to understand and limit the potential uses of this technology.
“The unregulated use of ALPRs has exposed millions of people to surveillance by law enforcement, and the danger to our basic civil rights is only increasing as the technology is becoming more common,” said Barry Friedman, NYU law professor and member of the ethics board, in a press release. “It is incumbent on companies like Axon to ensure that ALPRs serve the communities who are subject to ALPR usage. This includes guardrails to ensure their use does not compromise civil liberties or worsen existing racial and socioeconomic disparities in the criminal justice system.”
You can see that the ethics board does not pull its punches. It makes a number of recommendations to Axon, and it should come as no surprise that transparency is at the head of them.
Law enforcement agencies should not acquire or use ALPRs without going through an open, transparent, democratic process, with adequate opportunity for genuinely representative public analysis, input, and objection.
Agencies should not deploy ALPRs without a clear use policy. That policy should be made public and should, at a minimum, address the concerns raised in this report.
Vendors, including Axon, should design ALPRs to facilitate transparency about their use, including by incorporating easy ways for agencies to share aggregate and de-identified data. Each agency then should share this data with the community it serves.
And let’s improve security too, please.
Interestingly, the board also makes a suggestion on behalf of conscientious objectors to the current draconian scheme of immigration enforcement: “Vendors, including Axon, must provide the option to turn off immigration-related alerts from the National Crime Information Center so that jurisdictions that choose not to participate in federal immigration enforcement can do so.”
There’s an aspect of states’ rights and plenty of other things wrapped up in that, but it’s a serious consideration these days. A system like this shouldn’t be a cat’s paw for the feds.
Axon, for its part, isn’t making any particularly specific promises, partly because the board’s recommendations reach beyond what it is capable of promising. But it did agree that the data collected by its systems will never be sold for commercial purposes. “We believe the data is owned by public safety agencies and the communities they serve, and should not be resold,” said Axon founder and CEO Rick Smith in the same press release.
I asked for Axon’s perspective on the numerous other suggestions made in the report. A company representative said that Axon appreciates the board’s “thoughtful guidance” and agrees with “their overall approach.” More specifically, the statement continued:
In the interest of transparency, both with our law enforcement customers and the communities they serve, we have announced this initiative approximately a year ahead of initial deployments of Axon Fleet 3. This time period will give us the opportunity to define best practices and a model framework for implementation through conversations with leading public safety and civil liberties groups and the Ethics Board. Prior to releasing the product, we will issue a specific and detailed outline of how we are implementing relevant safeguards including items such as data retention and ownership, and creating an ethical framework to help prevent misuse of the technology.
It’s good that this technology is being deployed amidst a discussion of these issues, but the ethics board’s recommendations aren’t binding, and Axon (let alone its advisory ethics board) can’t dictate public policy.
This technology is coming, and if the communities most affected by it and systems like it want to protect themselves, or if others want to ensure they are protected, the issues in the report should be carefully considered and raised as a matter of policy with local governments. That’s where the recommended changes can really start to take root.