Banning Police Technology: Stop the Madness!

In an attempt to “reimagine” policing, but in what feels more like an attempt to neuter police in the United States, various tactics are being deployed: forcing Chiefs to resign, pushing local legislation to defund departments, installing review boards composed of non-practitioners, eliminating qualified immunity, and dismantling police technologies.  Contrary to their convenient narrative, these people and groups do NOT want to keep vulnerable communities safe.  Plain and simple.  If they did, they would roundtable and discuss, but no… they don’t pull in the experts to learn and understand; they just demand.

Focusing on their attempt to dismantle police technologies: I remember first encountering these issues in 2012, when it was a game of whack-a-mole, trying to combat legislation pushed by privacy advocates to thwart the use of license plate reader (LPR) technology.  We would put forth all of the reasons that LPR tech keeps communities and officers safe, share the countless success stories, and, with luck, break through to a level-headed legislator or two.  But there is a new game in town, and it is not singularly focused on any one technology.  The technologies have been grouped together, placed under attack, and are now referred to as Algorithmic Policing Technologies.

The narrative being pushed is that these Algorithmic Policing Technologies are biased against communities of color and that anyone who uses them to aggressively solve crimes doesn’t care about those communities.  I contend that this notion is completely backwards: it is the advocates who are most biased, engaging in textbook “psychological projection.”  We are all well aware of the fear-mongering tactics of these advocates, who convince uninquisitive people that they are minutes away from a criminal conspiracy indictment, thus motivating them to donate to their privacy group like Charlie Sheen in a casino strip club, to protect themselves from governmental evildoers.  But c’mon… is any person earning $15,000 a year and leaning on social programs to supplement their life going to set up a recurring $100 monthly donation to an advocacy group?  No.  These advocates are tugging on the heartstrings of the wealthy: the ever-so-woke “it feels good to be a social justice warrior, but I have never actually done anything” crowd.  I am confident no one from this crowd has ever talked to a mother about how all she wants is to stop hearing gunshots every night and to stop hiding her children in the bathtub to avoid stray bullets.

The three types of algorithmic policing technologies under attack are:

  • Location-focused, which uses historical police data to try to predict high crime areas;
  • Person-focused, which uses historical police data to try to predict persons who may be likely to commit crimes; and
  • Algorithmic surveillance policing technologies, which usually have no predictive element but process police data in other ways (e.g., matching faces or plates, or monitoring keywords).

To demonstrate how out of touch legislators and politicians are, the Portland City Council just passed the most stringent facial recognition ban in the country, covering both government and commercial use.  They did this while the city is literally burning to the ground.  The Mayor was driven from his home by anarchists and, almost simultaneously and without apparent irony, lauded the decision.  Portland had already disbanded its Gun Violence Reduction Team, which immediately resulted in dramatic increases in crime.

Note this quote from a recent article:

“In Portland, the police chief said the City Council’s decision to disband the Gun Violence Reduction Team is likely a factor in the increased violence as well, though the mayor suggested Thursday the city needs to fund a unit at PPB that focuses on gun crime.”

Wait… What???  So, while the council members may care only about themselves, they have a duty and responsibility to protect their citizens.  Let’s see who gets re-elected.  And if those involved are re-elected, the residents deserve what they get; that is what democracy is.  Isn’t there a big-city mayor out there touting, “How do you like me now?” while his constituents have massive buyer’s remorse?

Predictive policing is simply unbiased.  LAPD dropped its predictive policing model in April 2020: due to outside pressure, according to some; for budgetary reasons, according to LAPD Chief Michel Moore.  For the purposes of this article, I define “information” as something that did in fact happen (e.g., a person was robbed at this time, date, and location) and “intelligence” as something reported but not necessarily confirmed (e.g., an anonymous call stating that Johnny is a gang member with the Gangham Style Dancers of 12th St.).  Predictive policing providers use information (data), not intelligence, and specifically do not use any race or gender data in their algorithms.

In an article in The Atlantic, a quote summed it up for me: “Because the data that fed (vendor) algorithms were based on reported crime, [redacted] and other critics argued that the program would lead to aggressive policing in communities of color, where reported crime tends to be high.”  Ah, yeah.  And as soon as the crime stops or displaces, the algorithm will direct the police somewhere else.

Although I don’t like the word “aggressive,” I would have offered the person quoted and the other critics the following parable, known all too well to the police: “I am a police officer, not your parent. I am here today because something happened in your life that you cannot deal with on your own, and I am tasked with solving today, in minutes, something you haven’t been able to solve in your entire life. The way I respond is based entirely on your level of cooperation. Let’s show each other some respect and we can solve this problem together.” That’s not aggressive policing. That is strength and compassion rolled into one. That is why policing is the toughest job in the world.
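To make that concrete, here is a minimal sketch of what a location-focused model actually consumes.  This is a hypothetical illustration of my own, not any vendor’s actual algorithm: every name, constant, and function below is invented for the example.  The point is the input: confirmed incident records (“information”), each reduced to a time, a date, and a location.  No race, gender, or other personal attribute appears anywhere in the data or the math.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical location-focused ("hotspot") sketch. Input records are
# confirmed incidents only: a timestamp, a latitude, and a longitude.
incidents = [
    (datetime(2020, 9, 1, 23, 15), 45.523, -122.676),
    (datetime(2020, 9, 2, 1, 40), 45.524, -122.675),
    (datetime(2020, 9, 14, 22, 5), 45.524, -122.677),
    (datetime(2020, 8, 3, 2, 30), 45.512, -122.658),
]

CELL = 0.005          # grid cell size in degrees (~500 m); an assumed value
HALF_LIFE_DAYS = 30   # older incidents count for less; also an assumed value

def cell_of(lat, lon):
    """Snap a coordinate onto a coarse grid cell."""
    return (round(lat / CELL), round(lon / CELL))

def hotspot_scores(incidents, now):
    """Score each grid cell by its recency-weighted incident count."""
    scores = defaultdict(float)
    for ts, lat, lon in incidents:
        age_days = (now - ts) / timedelta(days=1)
        weight = 0.5 ** (age_days / HALF_LIFE_DAYS)  # exponential decay
        scores[cell_of(lat, lon)] += weight
    return scores

now = datetime(2020, 9, 15)
ranked = sorted(hotspot_scores(incidents, now).items(),
                key=lambda kv: kv[1], reverse=True)
for cell, score in ranked:
    print(f"cell {cell}: score {score:.2f}")
```

Note the decay weight: a cell with no new incidents falls down the ranking on its own, which is exactly the mechanism behind “as soon as the crime stops or displaces, the algorithm will direct the police somewhere else.”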

I choose to avoid “person-focused technologies” that try to predict persons who may be likely to commit crimes, or who are likely to be victims.  I know it is possible, but I am unaware of any solutions in practice. In fact, I was asked to draft a scoring model in one of my previous roles, and we determined it may be “too much” for the industry.  Think “Brooklyn North Triangle”: today’s victim is tomorrow’s witness is the following day’s perpetrator.  I concede that, although accurate and reliable, this is not ready for prime time.  An all-too-common and unfortunate reality. Hark, a balanced position!

Finally, algorithmic “surveillance” policing technologies such as facial recognition, license plate readers, closed-circuit cameras, and social media monitoring all operate, on their face, as unbiased.  Nowhere in any of those solutions does race, gender, or any other defining attribute of a person exist.  An LPR camera reads a plate and determines whether an action takes place; a camera images a face and facial recognition software offers suggestions (hundreds, maybe thousands of candidates) pointing toward a subject; a person shoots someone with a handgun in the field of view of a CCTV camera and a pull from the historical archive reveals a suspect; on Twitter and other social platforms, keyword searches and alerts flag a possible threat lobbed by an easily identified or anonymous SoMe profile.  Each of these technologies supports a “human in the loop” in its capabilities and should mandate one in its usage policies.  As I noted in a previous article: the right person, the right car, the right door, faster. A human must make the final call.  If there is any racial bias, it is on the part of a person, and that is rare and isolated.  Also of note, especially with facial recognition: more usage of these technologies means less burden on victims and the community. Anyone who has ever asked a robbery victim to look at photos knows what I am talking about.
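For illustration, here is a minimal sketch of what a “human in the loop” gate can look like in software.  This is a hypothetical design of my own, not any vendor’s product, and all names below are invented.  The algorithm only ever produces a suggestion; nothing becomes actionable until a named human confirms it.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical "human in the loop" gate: the system may surface a
# candidate (a plate hit, a face match, a keyword alert), but no
# action is authorized until an accountable human reviews it.

@dataclass
class Candidate:
    source: str        # e.g. "LPR", "facial_recognition", "CCTV", "social_media"
    detail: str        # what the system matched on
    confidence: float  # the algorithm's own score, 0.0 to 1.0

@dataclass
class Review:
    reviewer: str      # the accountable human making the final call
    confirmed: bool    # did the person verify the match?
    notes: str

def authorize_action(candidate: Candidate, review: Optional[Review]) -> bool:
    """Authorize an action only when a human has confirmed the match.
    The algorithm's confidence alone is never sufficient."""
    if review is None:
        return False  # no human review yet: the suggestion stays a suggestion
    return review.confirmed

# A high-confidence hit still cannot trigger action by itself.
hit = Candidate("LPR", "plate ABC1234 on stolen-vehicle hotlist", 0.97)
print(authorize_action(hit, None))   # False: software alone cannot act
print(authorize_action(hit, Review("Officer Smith", True,
                                   "visually confirmed plate and vehicle make")))  # True
```

The design choice is the point: the confidence score is carried along for the reviewer’s benefit, but it never appears in the authorization logic.  A human makes the final call.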

So, I ask again: does it really seem like these advocates, who are pushing so hard against technology that helps reduce crime every day in communities of color, are in it to support those communities?  They are not even listening to those communities: a recent survey reported that 81% of black Americans want police to spend the same amount of time, or more, in their communities.  Let’s call out the hypocrisy and demand a stop to the false narrative and to the attempts to ban technologies that will make people safer in these vulnerable communities.