
Why computerized predictive policing is still racially biased 

Mar 8, 2021 | Criminal Defense

Predictive policing is a relatively new approach that developed as computer technology advanced. It is intended to harness the power of data to determine where crime is likely to occur. A computer can sort through this data quickly and recommend where officers should go, even before any crime has happened.

For instance, if there is a DUI arrest every weekend in a specific neighborhood, the computer could send officers to that neighborhood on weekend evenings, when those arrests typically occur.
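At its core, that weekend-DUI example amounts to counting past arrests by place and time, then patrolling the busiest slots. Here is a minimal Python sketch of that idea; the arrest log, neighborhood names, and counts are all invented for illustration:

```python
from collections import Counter

# Hypothetical arrest log of (neighborhood, day) pairs -- invented data.
arrests = [
    ("Riverside", "Saturday"),
    ("Riverside", "Saturday"),
    ("Riverside", "Sunday"),
    ("Hilltop", "Tuesday"),
]

def recommend_patrols(arrest_log, top_n=1):
    """Rank (neighborhood, day) slots by how many past arrests they saw."""
    counts = Counter(arrest_log)
    return [slot for slot, _ in counts.most_common(top_n)]

print(recommend_patrols(arrests))  # the slot with the most recorded arrests
```

Real systems are far more sophisticated, but the basic logic is the same: past arrest records drive future patrol recommendations.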

The unbiased goal is hard to reach

The goal of predictive policing is, in part, to remove bias. A computer tells police where crime is likely, so no one can say that a biased officer profiled people in a certain area and assumed they were criminals. It could help reduce things like racial profiling. 

Unfortunately, it doesn’t work that way in practice. The problem is that every algorithm needs initial data to work with, and the computer can’t gather that data in a vacuum. It comes from the officers who go out and make arrests.

The result is that biased officers can feed data reflecting biased policing into the computer. The system itself may not be biased, but it will still make recommendations based on the information at hand. If people are racially profiled and arrested in a particular neighborhood, the data will reflect that, and the computer will predict more crime in that area.
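The feedback loop described above can be sketched in a few lines of Python. In this toy model (every number is invented), two neighborhoods have the same underlying crime rate, but patrols start out skewed toward one of them. Because arrests are only recorded where officers are present, the predictor keeps sending patrols back to the same place, and the initial bias never corrects itself:

```python
# Toy model of the feedback loop -- all numbers invented for illustration.
# Neighborhoods "A" and "B" have the SAME underlying crime rate,
# but the initial patrol allocation is skewed toward A.

TRUE_CRIME_RATE = 10  # identical incidents per week in both neighborhoods

def observed_arrests(patrol_share):
    """Arrests recorded scale with patrol presence, not with actual crime."""
    return {hood: TRUE_CRIME_RATE * share for hood, share in patrol_share.items()}

def reallocate(arrests):
    """The predictor sends patrols wherever past arrests were highest."""
    total = sum(arrests.values())
    return {hood: n / total for hood, n in arrests.items()}

patrols = {"A": 0.7, "B": 0.3}  # biased starting point
for week in range(5):
    patrols = reallocate(observed_arrests(patrols))

# Despite equal crime, the skew persists: patrols stay roughly 70/30.
print(patrols)
```

Even though both neighborhoods generate identical crime in this model, the allocation never drifts back toward an even split; the bias baked into the starting data is simply locked in and repeated.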

Have you been profiled?

As you can see, there is no perfect system for predictive policing at this time. If you have been racially profiled and arrested, you need to know your rights. An experienced criminal defense attorney can protect your rights, help you understand the charges you’re facing and guide you through your options.