
How Does Facial Recognition Work?

6 October 2020

Face recognition is everywhere in tech right now, powering everything from payments to criminal identification systems. According to Global Market Insights, the face recognition market will reach $12 billion by 2026. How has the technology become so popular? Because it adds massive value for both people and businesses: unlocking doors with your face, diagnosing diseases, preventing crimes, and even finding lost pets. What will face recognition’s next move be?

Whether you’re ready to implement the technology in your business or just curious about what’s coming down the pike, let’s take a closer look.

What is Facial Recognition and How Does the Technology Work?

Face recognition is a technology that detects and identifies human faces. How does facial recognition work? Face recognition sees no faces; it sees facial data. Facial data means unique facial features such as the distance between the eyes, the forehead-to-chin distance, nose width, and cheekbone shape. There’s no single answer to how facial recognition software works, because every vendor builds on its own proprietary algorithms. Still, we can divide the process into two main stages:

Detection

How does face detection work? Software such as a face recognition SDK or another facial recognition system detects single or multiple faces in images and videos. The output is a set of facial coordinates (eyes, nose, lips, etc.), and these coordinates are unique to each face.
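As a rough illustration of the detection stage, here is a minimal sketch using OpenCV’s bundled Haar cascade face detector. This is just one of many possible detectors, and the image path "photo.jpg" is a placeholder for the example:

```python
import cv2

# Load OpenCV's pre-trained frontal-face Haar cascade (ships with the library).
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

# "photo.jpg" is a placeholder path for this example.
image = cv2.imread("photo.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Each detection is a bounding box: the rough "facial coordinates" of one face.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    print(f"Face found at x={x}, y={y}, width={w}, height={h}")
```

Production systems typically use deep-learning detectors instead, but the output is conceptually the same: the locations of faces in the frame.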

Recognition

How does face recognition work? Recognition identifies a human face for identification or verification purposes. At this stage, your facial coordinates are compared against a database. If sufficient similarity is detected, a match is made.
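In practice, “comparing facial data to a database” usually means comparing numeric embeddings produced by a model. Below is a minimal, hypothetical sketch of that comparison step only; the vectors, the names, and the 0.6 threshold are invented for illustration and not taken from any specific system:

```python
import numpy as np

def euclidean_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Distance between two face embeddings; smaller means more similar."""
    return float(np.linalg.norm(a - b))

# Toy "database" of known embeddings (real systems use 128- to 512-dimensional vectors).
database = {
    "alice": np.array([0.11, 0.52, 0.33]),
    "bob":   np.array([0.91, 0.08, 0.41]),
}

probe = np.array([0.10, 0.50, 0.35])   # embedding of the face being checked
THRESHOLD = 0.6                        # illustrative cut-off, tuned per system

best_name, best_dist = min(
    ((name, euclidean_distance(vec, probe)) for name, vec in database.items()),
    key=lambda pair: pair[1],
)

if best_dist < THRESHOLD:
    print(f"Match: {best_name} (distance {best_dist:.3f})")
else:
    print("No match in the database")
```

The threshold is the key design choice: tighten it and you reject more impostors but also more genuine users; loosen it and the opposite happens.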

How Accurate and Safe Is Face Recognition?

In a perfect environment, face recognition can identify faces with pinpoint accuracy. According to NIST (the National Institute of Standards and Technology), the accuracy score hits 99.97%. By a perfect environment, we mean near-perfect lighting, accurate positioning, and facial landmarks that are clear and not occluded. In real-life environments, facial recognition algorithms perform worse. Occluded faces, poses, expressions, and ageing remain face recognition’s worst nightmare, ageing especially. Today, age-invariant face recognition (AIFR) is used in most surveillance apps, yet even state-of-the-art systems have difficulty identifying the same face many years apart. Scientists from Michigan State University found that the technology can recognize faces up to six years later; beyond that, accuracy drops. The research underlines how important it is to update reference photos roughly every five years to prevent wrong matches.

Face recognition safety and security

Source: Unsplash

Another question is, how safe is your data with the technology? Thanks to solid mathematical algorithms and your unique facial data, you have little to worry about: regular passwords can be easily hacked, but not your faceprint. That’s why the technology is taking over the banking and eCommerce industries, streamlining online payments and improving customer experience. What about the databases where all this information is stored? To date, the security vulnerabilities of these databases do not seem to go away. Is there a chance to use facial recognition and not get your data stolen? The question remains open. For now, facial recognition systems are trying to strike a balance between safety and privacy.

Bias in Face Recognition

Facial recognition opponents claim that the tech has a problem with bias. Is it really so? To better understand the nature of the problem, we’ve looked at some hands-on research. In a 2019 study, NIST evaluated 189 algorithms from 99 developers. The main objective was to understand how the algorithms performed on two tasks. First, the algorithm had to match two different photos of the same person from the database (one-to-one search). In real-life environments, this approach is used for verification purposes such as face unlock or smile-to-pay. Second, the algorithm had to determine whether the person in a photo has a match in the database (one-to-many search). This approach is used when there’s a need to identify a specific person.

The NIST team came up with two error classes (a small sketch below makes them concrete):

  • false positive: the photos show different people, but the software matched them as the same person
  • false negative: the photos show the same person, but the software failed to make the match
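Here is a small, purely illustrative sketch of how these two error classes are counted for a given similarity threshold. The scores and labels below are made up for the example and do not come from the NIST study:

```python
# Each pair: (similarity score from the matcher, whether the two photos
# really show the same person). Scores and labels are invented for illustration.
pairs = [
    (0.92, True), (0.75, True), (0.58, True),    # genuine pairs
    (0.65, False), (0.40, False), (0.15, False), # impostor pairs
]

def error_rates(threshold: float):
    """Count false positives and false negatives at a given threshold."""
    false_pos = sum(1 for score, same in pairs if score >= threshold and not same)
    false_neg = sum(1 for score, same in pairs if score < threshold and same)
    return false_pos, false_neg

# Raising the threshold trades false positives for false negatives.
for t in (0.5, 0.7, 0.9):
    fp, fn = error_rates(t)
    print(f"threshold={t}: false positives={fp}, false negatives={fn}")
```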

Now let’s get it straight. If the problem occurs in a one-to-one search, it’s a minor inconvenience: you won’t unlock your phone right away or pay for the groceries on the first try. But if it’s a one-to-many search, that’s where the trouble begins. A wrong match can mean a person faces close scrutiny, or worse. Why might this happen? The NIST team pointed to the algorithms’ immaturity in handling demographic factors. During the study, the algorithms scanned 18.27 million images of 8.49 million people. The photos carried data on race or birth country, age, and sex. Let’s take a look at some findings of the study.

One-to-one match in face recognition

Source: NIST

Analyzing the research, one can say that there are methods to tackle bias in facial recognition software. NIST stated that there might be a connection between the error rate and the data used to train the algorithms. So, to get better results, more demographic factors should be considered and more facial data provided during training.
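One simple way to “consider more demographic factors” is to re-balance the training data so that no single group dominates. The sketch below is a hypothetical illustration of that idea; the metadata table and the "group" column are assumptions, not part of any real pipeline described here:

```python
import pandas as pd

# Hypothetical metadata: one row per training image, with a demographic
# "group" label. Both the table and the column name are illustrative.
meta = pd.DataFrame({
    "image_path": [f"img_{i}.jpg" for i in range(8)],
    "group": ["A", "A", "A", "A", "A", "B", "B", "C"],
})

# Downsample every group to the size of the smallest one so that no
# single demographic dominates the training set.
min_size = meta["group"].value_counts().min()
balanced = meta.groupby("group").sample(n=min_size, random_state=0)

print(balanced.sort_values("group"))
```

Real bias-mitigation work goes further (collecting more data for underrepresented groups rather than discarding data), but the principle is the same: the training distribution matters.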

IBM is also on the lookout for the right way to combat bias in the technology. The company created two datasets to study bias and help cope with it, since it’s detrimental to society. The first one trains software to recognize eye color, hair color, and facial hair. The second one is a mix of genders, ages, and, most importantly, different ethnicities. This, hopefully, will help systems better recognize diverse faces and avoid mistakes.

Face Recognition Use Cases: Overview

As the technology takes over the world, a growing number of companies are showing interest in facial recognition. Here are the key use cases for 2020:

  • Replacing fingerprint scanning to combat the spread of COVID-19
  • Tracking down quarantine violators
  • Ensuring preventative measures in retail stores (face-mask detection, fever detection)
  • Providing a safe return to work (face-mask detection, distance tracking, non-contact temperature detection)
  • Social distancing detection, crowd detection
  • Spotting virus spreaders at airports and train stations
  • Preventing theft in retail stores
  • Finding missing persons
  • Advancing customer service (spotting and welcoming VIPs at hotels, restaurants, venues, etc)
  • Identifying diseases and detecting emotions in mental therapy
  • Tracking school/college attendance
  • Streamlining transactions
  • Providing/banning access to certain areas

Face recognition use cases

Source: Shutterstock

Takeaway

Today, people are split on face recognition. While some are thrilled about the tech’s advancements, others are concerned about their privacy. But whether we like it or not, it’s sweeping the world. Tech companies are advancing their facial recognition software to deliver better results in far-from-perfect environments, and that’s just the start. 2020 has become a true catalyst for change. In the years to come, the innovation will be so commonplace we won’t even notice it.

Develop Custom Face Recognition Applications With InData Labs

Inspired to turn your idea into a face recognition-related project? Let’s talk at info@indatalabs.com.
