Your Digital Body: Bias and Biometrics in Tech

What could possibly be more uniquely you than your physical body? Fingerprints, iris patterns, voice… these (and more) are what biometrics is all about. Biometrics are generally used in two ways: verification, confirming that a person is who they claim to be, and identification, finding out who a person is by searching a database.
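The difference between those two modes can be sketched in a few lines of toy code. This is a minimal illustration only: the feature vectors, the similarity function, and the threshold are all made up for the example, and real matchers use specialized algorithms (minutiae comparison, deep embeddings, and so on).

```python
# Toy sketch of the two modes of biometrics: 1:1 verification vs.
# 1:N identification. All values here are hypothetical.

def similarity(a, b):
    # Fraction of matching features — a stand-in for a real biometric matcher.
    return sum(1 for x, y in zip(a, b) if x == y) / len(a)

THRESHOLD = 0.9  # hypothetical decision threshold

def verify(claimed_id, scan, database):
    # Verification (1:1): compare the scan against the single template on file.
    template = database.get(claimed_id)
    return template is not None and similarity(scan, template) >= THRESHOLD

def identify(scan, database):
    # Identification (1:N): search every template; return the best match
    # only if it clears the threshold.
    best_id, best_score = None, 0.0
    for person_id, template in database.items():
        score = similarity(scan, template)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= THRESHOLD else None

db = {"alice": [1, 0, 1, 1, 0, 1, 0, 1, 1, 1],
      "bob":   [0, 1, 0, 0, 1, 0, 1, 0, 0, 0]}
scan = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]  # 9 of 10 features match "alice"

print(verify("alice", scan, db))  # True
print(identify(scan, db))         # alice
```

Note that the scan does not match the stored template exactly; biometric matching is inherently fuzzy, which is one reason the threshold matters so much, and why bias in the data behind it matters even more.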

Believe it or not, the use of biometrics as a means of identification has been documented as far back as nearly 4,000 years ago, when Babylonians used fingerprints to sign contracts. Since then, fingerprints and other forms of unique physiological data have been used to identify individuals for a variety of reasons, such as identifying criminals or authorizing access to a resource like a document or a physical location. With modern computers (including smartphones and tablets), it is easier than ever to automatically capture a complex image and compare it in detail to an image on record. Sounds pretty much perfect… or does it?

Bias in Biometrics

True story: a friend of mine tried to use one of the automated passport scanners to get through customs. It kept telling him that it couldn’t find his face for the photo, even though he could clearly see himself on the screen. The officer minding the queue walked over and held a piece of paper over my friend’s head to add just a bit of shade, and suddenly everything worked. My friend is white and bald, and the glare from the overhead lighting on his head tricked the software into thinking his face wasn’t there.

There are lots of stories out there about biometric-based services that are unable to handle dark skin tones, light skin tones, congenital disabilities, transgender faces, and so on. Much of this early bias resulted from the use of very limited datasets that contained far more of one type of image than any other. For example, a database might have thousands of fingerprints to use for testing purposes, but all from hands with a single skin tone. Or a database might have a million faces, but mostly of middle-aged white males. Anyone who fell outside the parameters the developer tested for might be recognized as “not white,” but uniquely identifying them would fail.

The good news is that, at least when it comes to biased datasets causing havoc, there are efforts to improve the space. This has been a particularly important area of research for Artificial Intelligence and Machine Learning (AI/ML), and organizations like the Organisation for Economic Co-operation and Development (more commonly known as the OECD; think high-powered, treaty-based, international organization) have published guidance on how these types of systems should be designed. Microsoft, a company that does quite a bit with AI, has some fairly extensive guidelines and governance as well. So there is hope, and there are established guidelines out there. These guidelines, if followed by everyone, would greatly reduce the computer bias in this space.

Other Weaknesses of Biometrics

If everything works as intended, biometrics are a great way to uniquely identify yourself to a computer. As long as no one does something gruesome to steal a body part, it’s all you, right? Well, sort of. The problem is that yes, your fingerprint (or face print, or iris pattern, or whatever) is on your physical body, but as soon as you scan it to be used for identification and access, it becomes an electronic asset. And as we all know, electronic assets can be hacked and stolen. Some call this the ‘fatal flaw’ of biometrics.
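The ‘fatal flaw’ can be made concrete with a toy account model. This is a hypothetical sketch, not how any real system stores credentials (a real system would use a dedicated password hash like bcrypt or Argon2, and would protect templates far more carefully); it just shows the asymmetry: a leaked password can be rotated, a leaked biometric template cannot.

```python
# Toy illustration of revocable vs. non-revocable secrets.
# The account model here is entirely hypothetical.

import hashlib
import secrets

class Account:
    def __init__(self, password, fingerprint_template):
        self.rotate_password(password)
        # The template is derived from the user's body — there is no
        # "pick a new fingerprint" operation.
        self.fingerprint_template = fingerprint_template

    def rotate_password(self, new_password):
        # Passwords are revocable: pick a new one, re-salt, re-hash,
        # and the old leaked value is useless.
        self.salt = secrets.token_hex(16)
        self.password_hash = hashlib.sha256(
            (self.salt + new_password).encode()).hexdigest()

acct = Account("hunter2", fingerprint_template=b"\x01\x02\x03\x04")
leaked_hash = acct.password_hash
leaked_template = acct.fingerprint_template

# After a breach, the user resets their password...
acct.rotate_password("correct horse battery staple")

print(acct.password_hash != leaked_hash)             # the leaked hash is now useless
print(acct.fingerprint_template == leaked_template)  # the template is still valid
```

That last line is the whole problem: the stolen template matches the user’s body forever.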

Back in 2019, a data breach at a company called Suprema exposed records affecting 1 million people, including fingerprint data, facial recognition data, face photos of users, unencrypted usernames and passwords, and more. When those kinds of records are stolen, you can’t simply change the information. You can change a password or PIN code, but you can’t (practically speaking) change your fingerprints. In those cases, all you can do is participate in identity theft prevention programs, which will at least keep new accounts involving things like credit checks from being opened without lots of hoops to jump through. All credit agencies have these kinds of programs (here’s one from Experian, and they’ll communicate things like fraud alerts to the other big credit agencies like TransUnion), as do some government agencies.

What you can do in 5 minutes

  • Make sure that any device using biometrics has an alternative way to get in, like a passcode for when facial recognition doesn’t work.
  • Use that passcode every once in a while so you don’t forget it!

What you can do in 15 minutes

  • Sign up for a credit monitoring service that will keep an eye out for when and where your information might be exposed in a data breach, so you’ll know when to take further action to prevent accounts from being opened in your name.

What you can do in 30 minutes

  • Interested in really learning more about the challenges of biometrics? Read this fascinating pre-print study submitted to the IEEE Transactions on Technology and Society to learn more!

Wrap Up

Biometrics are pretty cool, and if your accounts are using them as part of your login process, that means you’re using multi-factor authentication (MFA). You get a gold star! But, alas, biometrics are not perfect, and while your physical attributes are yours, once they have been turned into bits and bytes on a computer, they can be stolen and used.

If you have a choice between biometric MFA or no MFA, go ahead with the biometrics. If you have a choice between some other factor, like an authenticator app, and biometrics, go with the authenticator app. No technology is perfect, so the goal is to make it harder for hackers to get to your accounts rather than impossible.

Good luck! It’s a crazy world out there.

Posted by heather in Data Security