
5 reasons FaceApp should give you worry lines

By now you’ve probably seen plenty of old versions of your friends’ faces pop up on social media. FaceApp is so popular that in just a few days the app has managed to collect the faces of more than 150 million people from around the world!  

Why are people so eager to snap and upload that selfie? Well, it’s fun and interesting to see these scarily accurate “future” faces. And Canadians still presume that laws and technological checks and balances are in place to protect their data.  

Really, though, it’s the Wild West on the internet, and once you surrender your information, you can't get it back.  

Here are our top 5 reasons to skip FaceApp: 

1) Your face is a biometric 

What are biometrics? They're physical characteristics that are unique to you and can be used to identify you, such as your fingerprints, your retina or iris patterns, your gait (the way you walk), or your voice.

Increasingly, we use our faces to unlock our phones and access services, and we're tracked by our faces through airport and other surveillance systems – so the potential for loss or abuse of our biometric data is huge.

2) The app is based in Russia 

The data is stored on Russian servers and is subject to Russian laws. This means Russian state intelligence agencies could gather this info. Remember when they tried to access Tinder users’ data? With FaceApp, users have already agreed to that data collection simply by creating an account and uploading a photo. 

3) They’re capturing your web browsing history 

Everything you search for and every website you visit is visible to FaceApp until you uninstall the app. Yikes!

4) Not to mention your location 

The app also collects location data that can be used to pinpoint your whereabouts and target you with hyper-specific ads... and to gain insight into demographic trends for who knows what purpose.

5) Your data could be stored indefinitely and used for reasons you can’t predict 

For example, this amazingly diverse data set could be used to train mass surveillance systems. It sounds far-fetched, but photos uploaded to Flickr and social media sites have already been scraped and used to train A.I. systems without people's consent.

Why should FaceApp be any different? 

If you’ve already downloaded the app, you should uninstall it ASAP, and make sure you always read the terms of service before hopping on board with the newest trend. Remember that when you don’t pay for an app or service, you’re not the customer – you're the product.

The onus is on the individual to protect sensitive information.  

To learn more about protecting your identity at home or at work, contact the Beauceron Security Team @ info@beauceronsecurity.com or 1-877-516-9245. 

Toronto cops using facial recognition to nab lawbreakers

Facial recognition may have been banned for municipal agencies in San Francisco this month, but in the bustling Canadian city of Toronto, police are using the technology to generate leads in investigations as more and more crimes are caught on video.

Officers have conducted more than 2,500 facial recognition searches since the half-million-dollar system was purchased in March of last year.  

How does it work?

Toronto police use artificial intelligence to compare any photo or video evidence they gather against a mugshot database. That evidence can include anything from government surveillance footage to security-camera video captured by public or private enterprises.
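
At its core, this kind of matching usually boils down to converting each face image into a numeric "embedding" and scoring a probe image against every enrolled record. The sketch below is a minimal, generic illustration of that comparison step in Python, not the actual system described above; the 128-dimension embeddings, the 0.8 threshold, and the record names are all assumptions made for demonstration.

```python
# A minimal, generic sketch of face matching against a database of embeddings.
# NOT the system described above: the 128-dim embeddings, the 0.8 threshold,
# and the record names are hypothetical, for illustration only.
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_candidates(probe, database, threshold=0.8):
    """Score the probe embedding against every enrolled embedding and return
    the candidates that meet the threshold, best match first."""
    scores = {rid: cosine_similarity(probe, emb) for rid, emb in database.items()}
    matches = [(rid, score) for rid, score in scores.items() if score >= threshold]
    return sorted(matches, key=lambda pair: pair[1], reverse=True)

# Hypothetical example: three enrolled records and one noisy probe embedding.
rng = np.random.default_rng(seed=0)
mugshots = {f"record_{i}": rng.normal(size=128) for i in range(3)}
probe = mugshots["record_1"] + rng.normal(scale=0.1, size=128)
print(rank_candidates(probe, mugshots))  # "record_1" should score highest
```

In a real deployment the embeddings would come from a trained face-recognition model, and the threshold would be tuned to balance false matches against missed matches, which is exactly why the accuracy figures discussed below matter.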

This is a controlled use of facial recognition that has obvious benefits for the public. In fact, Toronto police report that matches against their mugshot database were accurate about 80% of the time, and in roughly 60% of cases those matches were reliable enough to further an investigation. Though it’s far from perfect, that’s much more accurate than the systems used in other jurisdictions that rely on this method of identification.

Extreme uses of facial recognition A.I.

Not all police forces are using facial recognition in appropriate ways. The U.K. police are notorious for their Minority Report-esque way of fighting crime before it happens — using software that has a shocking 98% error rate, and that wrongly targets women and people of ethnic minorities.  

Beyond the Western world, China has some of the most extreme uses of facial recognition on the planet. An entire city — particularly the Muslim minority population living there — is monitored 24/7 with facial recognition as well as gait recognition (the unique way people walk).

That city’s database was later breached and all of this biometric data leaked out, which illustrates one of the worst aspects of these A.I.-driven recognition tools: once the data is out there, anyone can copy and abuse it. You can’t change inherent aspects of yourself the way you can easily change a password.

Part of the investigative tool set

Toronto police are describing the way they use facial recognition as “part of the investigative tool set” but not as conclusive evidence on its own — and this is a good approach. It’s not foolproof and involves more error than DNA testing, for example, but because it’s regulated and part of a holistic way of identifying culprits, the risk is limited. 

Public consent needed

Canadian politicians are finally waking up to the importance of controlling any technology that has the potential to infringe on citizens’ privacy rights. NDP MP Charlie Angus is sounding the alarm on Parliament Hill about the perils of tech that tracks people, saying that as a country we need to discuss guidelines for the legitimate uses of surveillance.

Ask the right questions

Privacy is all about control and consent. We need to ask the important questions about facial recognition: What level of civilian oversight is provided? How long will the photos and videos be stored? And once an investigation is closed, when does the data go away? Being informed and asking the right questions can prevent dangerous uses and abuses of emerging technologies.

To learn more about protecting your identity at home or at work, contact the Beauceron Security Team @ info@beauceronsecurity.com or 1-877-516-9245. 

Protecting your digital identity in the era of mass surveillance – before it’s too late

San Francisco has just become the first U.S. city to ban the use of facial recognition technology by municipal agencies, a move meant to prevent the discrimination and curtailing of civil liberties that come with this type of artificial intelligence. Other cities are following suit, but despite this progress, the tech’s use is growing.

If you frequent airports, sports stadiums, malls or grocery stores, facial recognition technology may soon be a big part of your life — whether you like it or not.   

Rather than check individual tickets, some airports are now using A.I. to scan faces as people pass the gates; if you’re paid up and your identity checks out, you’re allowed to board your flight.  Convenient, right?  

However, when the private sector uses our biometric data to tailor its marketing tactics, we enter dangerous territory when it comes to protecting our digital identities.

Malls have been caught using facial recognition cameras to guess your age, gender and even mood to advertise accordingly, luring you to certain stores or kiosks where you’re likely to spend money.   

Even grocery stores can identify you in the aisles by your age and gender, displaying products on screens based on your marketing demographic. 

What is biometric data?

Biometric data — fingerprints, retinal scans, your gait (the way you walk), your voice, DNA, facial scans — is unique to you and is used to quickly confirm your identity.

For individuals, the main benefits of using biometric data such as facial recognition are speed and convenience. You can avoid rummaging in your pockets for your concert or game tickets at a stadium. You can skip the lines, and just walk past scanning tech that can do the work instantly.   

For corporations, the benefits have more to do with the ability to sway purchasing behaviour. And governments get to monitor and control populations by combining biometric and other surveillance data with artificial intelligence.

Privacy concerns amid surveillance

The convenience of these technologies comes at a steep cost, especially regarding privacy. The most extreme example is China, where the government is known for abusing biometric data collection: they publicly shame people who jaywalk; they can capture facial scans and recognize citizens’ gaits to prevent those with a low social score from flying or from purchasing real estate; they can track anyone’s location at any time, often unfairly targeting ethnic or religious minorities.   

Closer to home, Toronto has been piloting the Sidewalk Labs project, a data-driven smart city initiative that facilitates things like snow removal and traffic planning, and can curb crime by way of sophisticated security cameras. But because Sidewalk Labs has refused to commit to de-identifying the data it collects, privacy expert Dr. Ann Cavoukian and others have denounced the project as little more than a data mine that could cause harm if that data is leaked or abused.

Glaring flaws in biometrics

Beyond surveillance, biometric identification has a major flaw: you can replace a compromised credit card, but if there’s a breach of your biometric data, you can’t change your face. Not easily, anyway!   

There’s a high risk of false positives, too. In London last year, the Metropolitan Police misidentified and fingerprinted a 14-year-old black boy, and figures reveal this kind of mistake is no anomaly; in fact, facial recognition software wrongly identified members of the public as criminals 96% of the time.

In its current iteration, facial scanning can also be racist and sexist; these technologies are prone to error when it comes to recognizing women and people of colour.  

Yet another issue: It can be used to advertise to you without your permission in malls and grocery stores, even in taxis. And with all facial recognition in the public sphere, the individual can never be sure when or how their sensitive data is being used, or whether or where it’s being stored.

The cost of convenience

While there are obvious pros to facial recognition — such as increasing border security and facilitating police efforts to track down dangerous criminals — as a society we need to ask how much of our personal data we’re willing to sacrifice in the name of safety and convenience. If it’s becoming too much, we need to call on legislators to stand up for citizens’ privacy before we become even more accepting of surveillance tech and all the risks that go along with it.   

Awareness training is the first step towards protecting your digital identity. Reach out to the Beauceron team to learn how our learning content can support your organization: info@beauceronsecurity.com.