The Pros and Cons of Predictive Policing

If you had a magic computer that could predict when crimes would take place, would you use it? Would you share it with the police so they could prevent people from becoming victims? Well, believe it or not, the police have such a system and have been using it to great effect. The technology and processes behind it are called “Predictive Policing,” a very controversial trend spreading across the USA. I discuss the good and the bad of it in this episode.

Click here to support my Starbucks habit and financially support this podcast.

Subscribe to this podcast via your favorite podcast platform!

About the host:

Over the past decade, Jim Stroud has built an expertise in sourcing and recruiting strategy, public speaking, lead generation, video production, podcasting, online research, competitive intelligence, online community management and training. He has consulted for such companies as Microsoft, Google, MCI, Siemens, Bernard Hodes Group and a host of startup companies. During his tenure with Randstad Sourceright, he alleviated the recruitment headaches of their clients worldwide as their Global Head of Sourcing and Recruiting Strategy.  He now serves ClickIQ as its VP, Product Evangelist.


Hi, I’m Jim Stroud and this is my podcast.

One of my favorite TV shows of all time is – “Person of Interest.” Do you know that show? In it, a man invents a sentient artificial intelligence system then uses it to protect people before crimes are committed against them. This is how the show begins…

What would you say are the odds of a machine like that existing today? Hah! I think the odds are… Very good. I’ll explain after this…

{sponsor message}

On the TV show “Person of Interest,” a team of people operating outside of the law protect American citizens with the help of a sentient artificial intelligence system. Although I cannot say with 100% surety that such a machine exists, I think the basic building blocks of such a device are already in play. And those blocks make up the trend of “Predictive Policing,” which is technology and processes that stop likely crimes before they happen. Let me quote a few articles to prove my point.

The New York Times says this, quote…

Mr. Brown, whose criminal record includes drug and assault charges, is at the center of an experiment taking place in dozens of police departments across the country, one in which the authorities have turned to complex computer algorithms to try to pinpoint the people most likely to be involved in future violent crimes — as either predator or prey. The goal is to do all they can to prevent the crime from happening.

The strategy, known as predictive policing, combines elements of traditional policing, like increased attention to crime “hot spots” and close monitoring of recent parolees. But it often also uses other data, including information about friendships, social media activity and drug use, to identify “hot people” and aid the authorities in forecasting crime.  

Business Insider has this to say, quote…

Predictive policework is rooted in complex mathematical models, but the basic premise is actually quite simple. A foundational paper on modeling crime compares crime to earthquakes to explain the rationale.

Just as earthquakes tend to lead to more earthquakes nearby and in the near future, gang retaliations, serial offenders, and repeated burglaries on a single location tend to create clusters of criminal offences that, with the right algorithms, police can forecast.

Previous predictive policing methods primarily focused on finding locations where crimes were likely to occur. A report from the nonprofit RAND Corporation, however, suggests that predictive policing can help foresee not only the location, but also the times of crimes, as well as individuals likely to commit future offenses. It can even predict those likely to be victims of crimes.
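The earthquake analogy in that quote refers to what statisticians call a self-exciting point process: each event temporarily raises the chance of another one nearby, just as an earthquake spawns aftershocks. Here is a minimal sketch of that idea in Python; the function, parameter names, and values are my own illustration, not any vendor’s actual model.

```python
import math

def crime_intensity(t, past_events, mu=0.5, alpha=0.3, omega=1.0):
    """Estimated rate of offences at time t for one location (illustrative).

    mu    : background rate (crimes that occur on their own)
    alpha : how strongly each past crime raises short-term risk
    omega : how quickly that elevated risk decays (like aftershocks)
    """
    # Each past event at time ti contributes an exponentially decaying "boost".
    boost = sum(
        alpha * omega * math.exp(-omega * (t - ti))
        for ti in past_events
        if ti < t
    )
    return mu + boost

# A burglary at t=0 raises the predicted risk shortly afterwards...
print(crime_intensity(0.1, [0.0]))   # well above the background rate
# ...but the effect fades, leaving only the background rate.
print(crime_intensity(10.0, [0.0]))
```

In a real system, the parameters would be fitted to historical crime data, and the locations with the highest predicted intensity would be flagged for extra patrols.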

And here’s one more quote; this time from Digital Trends. Quote…

Cortica, an Israeli company with deep roots in security and AI research, recently formed a partnership in India with Best Group to analyze the terabytes of data streaming from CCTV cameras in public areas. One of the goals is to improve safety in public places, such as city streets, bus stops, and train stations.

It’s already common for law enforcement in cities like London and New York to employ facial recognition and license plate matching as part of their video camera surveillance. But Cortica’s AI promises to take it much further by looking for “behavioral anomalies” that signal someone is about to commit a violent crime.

The software is based on the type of military and government security screening systems that try to identify terrorists by monitoring people in real-time, looking for so-called micro-expressions — minuscule twitches or mannerisms that can belie a person’s nefarious intentions. Such telltale signs are so small they can elude an experienced detective but not the unblinking eye of AI.

The intent of all this surveillance is to protect the public and make everyone feel safe. And I applaud any and all in the security services who do what they do so I can sleep soundly at night and live another day. But what is the cost of all this security? I don’t mean cost in dollars; I am referring to the psychological penalties and the loss of liberty. Let me give you something to think about. For one, constant surveillance makes people alter their behavior, and not just their bad behavior.

This quote is from CJFE, Canadian Journalists for Free Expression. Quote…

So, how does mass surveillance affect the way we act? A 2016 study showed that people alter their behavior when they are reminded that the government is watching their activities. To test the effects of surveillance, participants in the study were first shown a fictional news headline about the United States targeting the Islamic State in an airstrike. They were then asked how they felt about the event while being regularly reminded that their responses were being monitored. As a result, most people in the study began to suppress opinions about the fictional event that they felt to be controversial or that they believed might lead the government to scrutinize them.

Interestingly, the study also showed that participants who support the idea of mass surveillance were the most likely to suppress their own non-conformist opinions. End quote…

I very much like the idea of Predictive Policing. I especially like it when I hear of companies like PredPol, which uses big data and machine learning to predict where crime will take place. One success the company highlights is a 22 percent drop in residential burglaries in Tacoma, Washington, thanks in part to their technology.

And yet, I all too often hear a dissenting voice against such technologies because they are purported to be faulty and biased. Listen to this quote from The Register. Quote…  

American police and the judiciary are increasingly relying on software to catch, prosecute and sentence criminal suspects, but the code is untested, unavailable to suspects’ defense teams, and in some cases provably biased.

In a presentation at the DEF CON hacking conference in Las Vegas, delegates were given the example of the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) system, which is used by trial judges to decide sentencing times and parole guidelines.

“The company behind COMPAS acknowledges gender is a factor in its decision-making process and that, as men are more likely to be recidivists [repeat offenders], so they are less likely to be recommended for probation,” explained Jerome Greco, digital forensics staff attorney for the Legal Aid Society.

“Women [are] thus more likely to get probation, and there are higher sentences for men. We don’t know how the data is swaying it or how significant gender is. The company is hiding behind trade secrets legislation to stop the code being checked.”

These so-called advanced systems are often trained on biased data sets, he said. Facial recognition software, for example, is often trained on data sets filled with predominantly white men, making it less effective at correctly matching up people of color, according to research by academics.

So, where do I stand on all this? I confess to being a bit conflicted. Surveillance is necessary to keep us safe, but it can go too far, such as with the social credit score happening in China. Maybe I would feel better about government surveillance if there were an impartial auditor charged with monitoring these algorithms for fairness, an appeals process should someone be falsely accused of a crime by some machine, and a human in the loop. In other words, include a human being in all of this. For example, say some software on a CCTV camera identifies someone as a terrorist. Before the FBI is called in, a human being has to look at the footage and confirm that the person is who the machine thinks they are. I don’t know. Maybe they are already doing that. I hope so.

If you love what you heard, hate what you heard or, don’t know what you just heard, I want to know about it. You can reach me at my website – In addition to finding source material and related information for this podcast episode, you’ll find other goodies that I hope will make you smile. Oh, before I go, please financially support this podcast with a little somethin’-somethin’ in my virtual tip jar. (There’s a link in the podcast description.) Your generosity encourages me to keep this podcast train chugging down the track. Whoot-whoot, whoot-whoot, whoot-whoot…

Music related to this episode:

► Music Credit: Dj Quads Track Name: “A Coffee To Go” Music By: Dj Quads @ Original upload HERE –… • Music promoted by NCM:

Music provided by Juice for Island Boy Productions TRACK: Juice – “Lean” DOWNLOAD/STREAM:…

♫ Music by Dj Quads ♫ @aka-dj-quads ● Download Link:

Soundcloud –

Dj Quads – Lost In Memories
Download Link:

Follow me on Social Media: