San Francisco has just become the first U.S. city to ban facial recognition technology, citing the discrimination and curtailing of civil liberties that attend this type of artificial intelligence when it is used by municipal agencies. Other cities are following suit, but despite this progress, the technology's use is growing.
If you frequent airports, sports stadiums, malls or grocery stores, facial recognition technology may soon be a big part of your life — whether you like it or not.
Rather than check individual tickets, some airports are now using A.I. to scan faces as people pass the gates; if you’re paid up and your identity checks out, you’re allowed to board your flight. Convenient, right?
However, when the private sector uses our biometric data to tailor its marketing tactics, we enter dangerous territory when it comes to protecting our digital identities.
Malls have been caught using facial recognition cameras to guess your age, gender and even mood to advertise accordingly, luring you to certain stores or kiosks where you’re likely to spend money.
Even grocery stores can identify you in the aisles by your age and gender, displaying products on screens based on your marketing demographic.
What is biometric data?
Biometric data — fingerprints, retinal scans, gait recognition (the way you walk), voice recognition, DNA, facial scans — is unique to each person and is used to confirm your identity quickly.
For individuals, the main benefits of using biometric data such as facial recognition are speed and convenience. You can avoid rummaging in your pockets for your concert or game tickets at a stadium. You can skip the lines, and just walk past scanning tech that can do the work instantly.
For corporations, the benefits have more to do with the ability to sway purchasing behaviour. And for governments, the appeal lies in monitoring and controlling populations by combining biometric and other surveillance data with artificial intelligence.
Privacy concerns amid surveillance
The convenience of these technologies comes at a steep cost, especially regarding privacy. The most extreme example is China, where the government is known for abusing biometric data collection: it publicly shames people who jaywalk; it captures facial scans and recognizes citizens' gaits to prevent those with a low social score from flying or from purchasing real estate; and it can track anyone's location at any time, often unfairly targeting ethnic or religious minorities.
Closer to home, Toronto has been piloting the Sidewalk Labs project, a data-driven smart city initiative that facilitates things like snow removal and traffic planning, and can curb crime by way of sophisticated security cameras. But because Sidewalk Labs has refused to commit to de-identifying the data it collects, privacy expert Dr. Ann Cavoukian and others have denounced the project as little more than a data mine that could cause harm if that data is leaked or abused.
Glaring flaws in biometrics
Beyond surveillance, biometric identification has a major flaw: you can replace a compromised credit card, but if there’s a breach of your biometric data, you can’t change your face. Not easily, anyway!
There's a high risk of false positives, too. In London last year, the Metropolitan Police misidentified and fingerprinted a 14-year-old black boy, and figures reveal this kind of mistake is no anomaly: the force's facial recognition software wrongly identified members of the public as criminals 96% of the time.
In its current iteration, facial scanning can also be racist and sexist; these technologies are prone to error when it comes to recognizing women and people of colour.
Yet another issue: the technology can be used to advertise to you without your permission in malls and grocery stores, even in taxis. And with facial recognition operating in the public sphere, individuals can never be sure when or how their sensitive data is being used, or whether and where it's being stored.
The cost of convenience
While there are obvious pros to facial recognition — such as increasing border security and facilitating police efforts to track down dangerous criminals — as a society we need to ask how much of our personal data we’re willing to sacrifice in the name of safety and convenience. If it’s becoming too much, we need to call on legislators to stand up for citizens’ privacy before we become even more accepting of surveillance tech and all the risks that go along with it.
Awareness training is the first step towards protecting your digital identity. Reach out to the Beauceron team to get informed on how our learning content can support your organization, firstname.lastname@example.org.