Private Images: Should We Really Be Concerned about Our Privacy in a World with Facial Recognition?

By Susan Morrow, 8 August 2022

The human face is something we are programmed to recognize; the face has an evolutionary history that reflects how our species has changed over time.

But now, this essence of humanity, our face, has entered a new era, one of commoditization and control by systems built around its morphology: in other words, facial recognition.

But is facial recognition a digital dystopia or part of a natural digital evolution that we should all accept?

The Current State of Play in Facial Recognition

Our face is biometric data. As such, it requires special attention to privacy – a view backed up by legislation including the GDPR, HIPAA, and many similar privacy laws. GDPR Article 9 places biometric data in the special categories of personal data, and Article 4(14) specifically calls out facial images as biometric data, defining it as data “relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person”.

Take a look at our GDPR fines list to see what can happen when companies do not follow European regulations.

Still, privacy considerations aside, facial recognition technology (FRT) is booming. Figures vary, but all point to growth in the FRT market. Mordor Intelligence, for example, predicts a market value for facial recognition technology and services of $10.19 billion by 2025, growing at a CAGR of 12.5%.
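As a rough illustration of what a 12.5% CAGR actually implies, the short sketch below works backwards from the $10.19 billion projection using the standard compound-growth formula. The five-year horizon and the derived starting value are assumptions for demonstration only, not figures from the Mordor Intelligence report.

```python
# Illustration of compound annual growth, using the projected end value
# from the source and an ASSUMED five-year forecast horizon.
cagr = 0.125        # 12.5% compound annual growth rate
end_value = 10.19   # USD billions, projected market size for 2025
years = 5           # hypothetical horizon for demonstration

# Compound growth: start_value * (1 + cagr) ** years == end_value
start_value = end_value / (1 + cagr) ** years
print(f"Implied starting market size: ${start_value:.2f}B")  # ~ $5.65B
```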

Examples of FRT application across industry sectors show its potential wide-scope use:

Law enforcement

An obvious use of FRT is in law enforcement. Images from dystopian novels spring to mind, but facial recognition in policing is here and now: it is already used for law enforcement in many parts of the world, including Europe, the UK, and China.

The EU Commission is funding a project looking at the exchange of facial images across EU borders. Europol is looking to become a global hub for biometric data sharing.

A report from the Georgetown Law Center on Privacy & Technology into unregulated police face recognition in America found that around half of American adults are in some law enforcement face recognition system. The report’s concern is that the databases holding this biometric data are largely unregulated. Its recommendations include legislation that permits searches only under a “reasonable suspicion” standard, and that anyone found innocent should have their facial data removed from the database.

Border control

The use of facial recognition at borders is convenient and efficient, and it is widely adopted in Europe, for example. In Australia, the government aimed to use biometric data to automate 90% of passenger border checks by 2020. Hong Kong and Singapore airports are further examples of FRT being used to expedite passenger processing.

Schools and universities

China recently implemented facial recognition in schools. The technology was deployed for general safety reasons but was also used to track students and even analyze their attentiveness in class. After public pushback over the opaque nature of such FRT applications, China is now pulling the technology out of general school use.

Verification and authorization

As digital identity becomes ubiquitous, biometric data is increasingly tied to a person’s digital persona. This data has the potential to become sizeable critical infrastructure in its own right, forming the backbone of our digital identities. For example, face biometric data is being used for age verification, and biometric information is also used for transaction authorization and authentication.
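To make the verification use case concrete, here is a minimal sketch of how face-based verification typically works: a stored (enrolled) face embedding is compared against an embedding from a live capture, and access is granted when the two are similar enough. The vectors and the threshold below are hypothetical stand-ins; real systems derive embeddings from a trained face-recognition model and tune the threshold against false-accept and false-reject rates.

```python
import numpy as np

# Hypothetical similarity threshold; real systems tune this against
# target false-accept / false-reject rates.
MATCH_THRESHOLD = 0.9

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(enrolled: np.ndarray, live: np.ndarray) -> bool:
    """Return True if the live capture matches the enrolled template."""
    return cosine_similarity(enrolled, live) >= MATCH_THRESHOLD

# Stand-in vectors; in practice these would come from a face-embedding
# model applied to the enrolled photo and the live camera capture.
enrolled_embedding = np.array([0.12, 0.87, 0.33, 0.45])
live_embedding = np.array([0.10, 0.84, 0.35, 0.47])

print("verified" if verify(enrolled_embedding, live_embedding) else "rejected")
```

The privacy point is that the enrolled template itself is sensitive biometric data: whoever stores it holds a reusable identifier for that person.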

Social media

Social media is perhaps one of the most pervasive uses of facial recognition; many of us use it without even thinking about it. Facebook’s tag suggestions setting, for example, used facial recognition to match images of a person across the platform so that you could tag friends. Privacy pressure has since resulted in Facebook offering greater control over this setting. However, our faces remain a central theme across most social platforms.

Retailers

Several companies are offering FRT for the retail sector. The use of the technology in this context is to detect and deter shoplifters and prevent certain types of return fraud.

While there are obvious and positive uses of facial recognition technology, we must always ask: what about the privacy implications? Facial biometric data is, after all, our face to the world, and it arguably identifies us in ways that other personal data does not.

Facial Recognition: The Risks and Trade-offs

The risk list for facial recognition privacy is long; some of the contenders for biometric privacy risk of the year are:

Data breach

Biometric data, like any data, is at risk from cybercrime and privacy violations; the two are often intrinsically linked. In 2019, Suprema, a biometric security company that supplies government and financial sector clients, suffered a breach exposing almost 28 million records, many containing biometric (face and fingerprint) data.

Misuse by authorities

Perhaps one of the most concerning areas of privacy violation associated with facial biometrics is at the state level. The tracking of ethnic minorities in China is an obvious example of how FRT can be misused. Human Rights Watch (HRW) has described China’s use of FRT as “Algorithms of Suppression”, reporting that millions of ethnic minorities in China are being held in camps and that their movements are tracked using FRT.

In other parts of the world, citizens have shown concern over the implementation of facial recognition by authorities. In the USA, several states and cities have now banned or restricted the use of FRT, and in California, students have protested against the use of facial recognition for security purposes on university campuses.

Safety (e.g. stalking and misuse)

Facial recognition apps such as Clearview AI, and before it the Russian FindFace app, have been called out as potential tools for stalkers. Clearview AI can be used to take a photo of a person in the street and quickly return personal data, including their address. Interestingly, the company behind FindFace closed down the consumer app and turned its talents to a Russian surveillance system that constantly scans citizens’ faces.

Ownership of facial data

Ownership of facial images can also come into question. Stolen images can be used to create a synthetic ID, set up fake social media accounts, and even feed a deepfake scam. I have been a victim of a fake social media account scam: images of my face were taken from another account and used to set up a new account pretending to be me. The point was to extort money from me to take down the fake account – it didn’t work. While this scam did not explicitly use FRT, the widespread availability of cheap FRT-based apps could propagate crimes of this nature.

Deepfakes can also exploit stolen face biometrics, potentially powering scams and other cybercrime.

Overuse

Just because you can doesn’t mean you should. There is a general overuse of technology for technology’s sake in the world. In the case of facial recognition, this is perhaps exemplified by the ‘data overuse’ of Google’s Nest Hub Max, which uses the company’s Face Match technology to ‘improve product experience’. We need to ask whether we are prepared to share this data for a (hopefully) improved user experience.

We come back, time and again, to the usability vs. privacy trade-off that we have been dealing with in the world of security for many years.

Face-Off for Facial Recognition and Privacy

We are, it seems, at a juncture in the history of technology. We can’t get enough of the latest gadgets, and facial recognition is needed to power many AI-based devices and apps. But like many good things, it comes at a cost. Are we prepared to trade privacy for technology? Are we happy to take the inherent risks of sharing our face, the deepest root of our humanity? I am torn. I am a gadget fan, but privacy and anonymity (in certain circumstances) are essential to me. Is privacy important to you?
