Facial Recognition: AI May Help The Police Catch Criminals More Efficiently

Facial recognition technology will change the way law enforcement agencies catch criminals. But it will be anything but a smooth ride.

AI-enabled facial recognition technology may soon be coming near you, in the form of police body cameras.

Axon is the biggest seller of police body camera products in the United States.

The company convened a special corporate board last Thursday, one devoted entirely to the ethics and expansion of technologies such as artificial intelligence.

It is a major new step for companies seeking to offer controversial (helpful, according to some) facial recognition technologies to law enforcement agencies across the nation.

Most know Axon as the maker of the Taser electroshock weapons that made plenty of headlines a few years ago.

The company also makes wearable body cameras.

Most, if not all, major police departments in America now use these.

Axon has also voiced interest in pursuing facial recognition technology for its body-worn cameras.

The technology could enable law enforcement officers to scan and then recognize faces.

But whose faces?

Potentially those of everyone officers see while on patrol.

A large and rapidly growing number of tech start-ups and surveillance firms are racing to integrate AI capabilities, including facial recognition, directly into video in real time.

The company's first board meeting will likely presage a showdown among technology firms over this rapidly developing technology.

A short time after the company announced its board, a group of 42 civil rights, privacy, and technology organizations, including the American Civil Liberties Union and the NAACP, got together and sent the board a letter.

The letter voiced serious concerns about Axon's product development and the direction it had taken.

The group's letter also urged an outright ban on AI-enabled facial recognition technologies, which, it argued, are unethical to deploy.

The group also spelled out the technology's implications for privacy.

Moreover, the technology carries technical imperfections and biases that could prove life-threatening.

Recent research on the subject has found that most facial recognition systems are markedly less accurate when assessing people with darker skin.

This flaw could lead an officer relying on the technology to misidentify an innocent person as a dangerous fugitive.

Rick Smith, Axon's founder and chief executive, said the company had not yet started building facial recognition systems.

However, he said the technology was under active consideration.

Smith acknowledged that the technology comes with risks of misuse and bias.

But he said its potential benefits are too promising for a company like Axon to ignore.

Smith also said he did not think forgoing the technology altogether would be the optimal solution.

He said that in today's world, where law enforcement officers must catch dangerous people quickly, AI-enabled facial recognition could eliminate the element of chance in identifying the right person rather than arresting someone innocent.

In other words, we should not leave catching dangerous people up to random chance.

He also said people should not expect police officers to remember, in detail, everyone they are supposed to look for.

Smith added that he considers it both naive and counterproductive to say law enforcement agencies should have no access to these new AI-enabled technologies.

Law enforcement agencies are going to have access to these technologies, he said, and they genuinely need them.

Smith also said the country cannot afford a police service that tries to police the 2020s with 1990s technology.

As mentioned before, Axon held its first board meeting Thursday morning.

The meeting took place at the company’s Arizona headquarters.

Axon invited eight experts from fields such as AI, criminal justice, and civil liberties, all of them company-selected.


Board members have no official veto power, and most are paid volunteers.

Axon also released a press statement, which says the board will advise the company on various topics.

Most of these topics concern the future capabilities of the company's AI research team, which is working on projects to improve police efficacy and efficiency.

AI-enabled facial recognition has long held appeal for government surveillance and law enforcement agencies.

Moreover, recent advances in AI, along with declining hardware and camera costs, have spurred developers to propose ever broader uses where the technology could come in handy.

A Georgetown Law research group estimated back in 2016 that around 117 million US adults, roughly half of the country's adult population, had a record in the huge facial recognition databases used by federal, state, and local law enforcement agencies.

Some consider facial attributes a quick and reliable way to identify anyone from afar or on video.

In fact, experts say face data is in some cases easier to acquire than other biometric identifiers.

What are those other identifiers?

Things such as fingerprints, which demand proximity or physical contact.

The Department of Homeland Security already has systems in place that scan the faces of international travelers, installed at several of the country's busiest airports.

The department also plans to expand these systems to cover every traveler flying overseas.

Facial recognition technology has a lot of critics.

These critics argue that facial recognition systems still have not proven they can uniquely identify a person.

The interesting thing about faces is that they change.

They age over time, and they change with a person's circumstances.

Plus, unlike fingerprints, faces are not always unique.

Take identical twins: researchers have shown that twins can fool facial recognition systems like the one Apple's iPhone X uses to unlock the phone.

According to the dissenting groups' letter, real-time facial recognition would chill the country's constitutional freedoms of speech and association.

The letter also says the technology could prove especially problematic at events like political protests.

Moreover, it could prime officers to perceive certain people as more dangerous than they actually are, and other faults could lead officers to use more force than a situation requires.

The letter also argues that no policy or safeguard could sufficiently mitigate these risks for real-time AI-enabled facial recognition ever to become marketable.

Axon has moved quickly and aggressively to corner the market on police technologies.

The company offers a free 12-month trial of its body camera equipment and online storage to police departments throughout the nation.

Axon said back in February that, by its estimates, more than half of the major city law enforcement agencies in the United States had bought its software or body cameras.

These police departments include those in:

  • Washington
  • Chicago
  • Los Angeles

Axon rebranded itself with a name change a year ago; the company was previously known as Taser International.


The company now advertises itself as the biggest custodian of public-safety data in the United States.

Axon also claims to hold more than 20 petabytes (about 20 million gigabytes) of police body-camera video, photos, and other criminal investigation documents.

The company uploads all of this data to its proprietary cloud-storage service, which it calls Evidence.com.

Many see police video as one of the major growth markets for AI-development firms.

It can bring business to companies building AI-enabled police equipment for both after-crime review and real-time surveillance.

Another company, BriefCam, lets city police officials and investigators narrow hours of video down to just a few seconds using content filters.

Users can filter footage, for example, to show only men with briefcases or only red trucks.
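The filtering idea itself is simple to sketch. The detection records and attribute names below are hypothetical illustrations of the general approach, not BriefCam's actual data model or API; a real system would generate such records automatically with object detectors:

```python
# Hypothetical detections extracted from hours of footage. Each record notes
# when something appeared and what attributes the detector assigned to it.
detections = [
    {"timestamp": 12.4, "kind": "person", "attributes": {"carrying": "briefcase"}},
    {"timestamp": 47.9, "kind": "vehicle", "attributes": {"type": "truck", "color": "red"}},
    {"timestamp": 90.2, "kind": "vehicle", "attributes": {"type": "sedan", "color": "blue"}},
]

def filter_detections(detections, kind=None, **attrs):
    """Keep only detections matching the requested kind and attribute values."""
    results = []
    for det in detections:
        if kind is not None and det["kind"] != kind:
            continue  # wrong category of object entirely
        if all(det["attributes"].get(k) == v for k, v in attrs.items()):
            results.append(det)
    return results

# e.g. jump straight to the red trucks in the footage
red_trucks = filter_detections(detections, kind="vehicle", type="truck", color="red")
print([d["timestamp"] for d in red_trucks])
```

An investigator would then review only the matching moments instead of the full recording.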

Axon has established long-term contracts with police forces nationwide, and that momentum could push its AI-enabled technologies into real-world deployment rapidly.

Police departments may not even need to sign big new deals with tech firms: facial recognition features could be pushed to officers' Axon body cameras much the way ordinary software updates reach internet users.

Facial recognition is among the most hotly debated and competitive subsets of artificial intelligence in the consumer tech market today.

Big technology companies such as Google, Facebook, and Apple have devoted entire teams to expanding AI-enabled features for photo tagging, security, and search.

Most facial recognition systems in place today depend on algorithms, deep-learning algorithms more specifically.

These algorithms analyze facial photos and scan for similarities across a massive data set of like images.
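In broad strokes, such a system maps each face photo to a numeric embedding vector and then compares vectors for similarity. The toy sketch below shows only the comparison step, assuming made-up four-dimensional embeddings and a hypothetical gallery of identities; a real system would produce the embeddings with a trained neural network:

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity: 1.0 means the embeddings point the same way.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe, gallery, threshold=0.8):
    """Return the gallery identity most similar to the probe embedding,
    or None if no score clears the threshold."""
    best_name, best_score = None, threshold
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name, best_score

# Toy 4-dimensional "embeddings" standing in for a real model's output.
gallery = {
    "suspect_a": np.array([0.9, 0.1, 0.2, 0.4]),
    "suspect_b": np.array([0.1, 0.8, 0.7, 0.1]),
}
probe = np.array([0.88, 0.12, 0.25, 0.38])  # a new face from the video feed
print(best_match(probe, gallery))
```

Note that the output is a similarity score, not a yes-or-no answer, which is exactly why these systems report probable matches rather than certain identifications.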

Like any other controversial technology, facial recognition has a significant base of support.

Supporters say body cameras with upgraded systems could assist officers by alerting them to a passing criminal suspect.

The technology could also help officers spot a missing child in a large crowd.

As alluded to before, facial recognition still cannot deliver near-perfect results.

Rather than definitively identifying a suspect, current systems only suggest the probability that a person is a match.

Put another way, the technology has an accuracy rate, and that rate varies wildly: with the quality of the photo or video, with the subject's skin color, and with many other factors.

Perhaps these are some of the reasons privacy advocates worry.

They say facial recognition systems could instill a false sense of confidence in police officers, leading them to misidentify innocent people as wanted criminals or suspects.

The results could be fatal.


According to Laura Moy, deputy director of Georgetown Law's Center on Privacy and Technology, no one can deny the possibility of error in the technology.

In real-time situations where an officer is armed, she said, the risks of potential misidentification tend to exceed any possible benefits.

Moy added that there is a real and tangible concern the technology could exacerbate the risks already involved in police use of force.

The other problem is that today's AI-enabled facial recognition systems have consistently shown troubling implicit biases.

Why is that?

There is no single reason, but it is often due to a lack of diversity in the images the systems' algorithms were trained on.

Researchers at the Massachusetts Institute of Technology's Media Lab reported a couple of months ago that three leading facial recognition systems, from Face++, IBM, and Microsoft, consistently performed better at identifying the gender of people with lighter skin tones.

More specifically, the systems averaged around 99 percent accuracy for lighter-skinned men but only about 70 percent for darker-skinned women.
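The kind of per-group accuracy gap the MIT researchers measured can be illustrated with a small helper. The groups, labels, and counts below are invented to mirror the reported 99 percent versus 70 percent figures; they are not the study's actual data:

```python
def accuracy_by_group(records):
    """Compute classification accuracy separately for each demographic group.

    `records` is a list of (group, predicted, actual) tuples.
    """
    totals, correct = {}, {}
    for group, predicted, actual in records:
        totals[group] = totals.get(group, 0) + 1
        if predicted == actual:
            correct[group] = correct.get(group, 0) + 1
    # Fraction of correct predictions within each group.
    return {g: correct.get(g, 0) / totals[g] for g in totals}

# Toy data: one group is classified far more reliably than the other.
records = (
    [("lighter-skinned men", "male", "male")] * 99
    + [("lighter-skinned men", "female", "male")] * 1
    + [("darker-skinned women", "female", "female")] * 70
    + [("darker-skinned women", "male", "female")] * 30
)
print(accuracy_by_group(records))
```

Reporting a single overall accuracy number would hide exactly this disparity, which is why auditors break results down by group.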

It is also worth remembering why body cameras gained so much popularity in the first place.

They shot to fame in recent years because people saw them as a check on possible police misconduct.

Now critics say the cameras contribute to pervasive surveillance, and some believe they have worsened problems in heavily policed neighborhoods.

As for the rules of use, police get to decide those as well.

Just last month, Sacramento police officers muted their body cameras after fatally shooting Stephon Clark, an unarmed black man, in the backyard of his grandmother's house.

Critics of Axon's newly formed ethics board question how effective a volunteer board that meets only twice a year can be in steering the important decisions of a private company.

Smith said he sees significant parallels between Tasers and facial recognition.

Tasers met plenty of initial resistance too, but they proliferated rapidly and have become one of the most commonly used less-lethal weapons in law enforcement.

He said the company expects some missteps along the way; looking back at the Taser's early days, he noted, a company introducing technology that changes many things always runs into problems.

In other words, he does not expect the adoption of facial recognition to be smooth.

He also said that if companies get this wrong, the consequences will be bad not just for society but for the companies themselves: firms building products for law enforcement will pay a huge price for their mistakes.

According to Smith, these companies should not want to create an Orwellian state just to make some quick money.


Zohair A. Zohair is currently a content crafter at Security Gladiators and has been involved in the technology industry for more than a decade. He is an engineer by training and, naturally, likes to help people solve their tech related problems. When he is not writing, he can usually be found practicing his free-kicks in the ground beside his house.