Interview: Lotte Houwing on Facial Recognition

In this post we take a closer look at the use of facial recognition and its influence on our society. For this article we got the help of Lotte Houwing, who works for the digital civil rights movement Bits of Freedom and shared some of her insights with us.

Facial recognition technologies have been around for some years now, but it is not always clear where and to what extent they are being used. Do you know how many facial recognition cameras are in use? Or who uses them, and for what purpose? Most likely, you don't. This is problematic, as facial recognition technologies are already very much ingrained in our daily lives! But before diving into this ethical discussion, let's cover some of the basic workings of this technology:


 

Facial recognition as biometric data

Biometric data is "(…) personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person." ("Art. 4 GDPR – Definitions" n.d.) These biometric identifiers can be divided into two categories, physical identifiers and behavioural identifiers, both of which capture data that is unique to a person.

Physical identifiers are used for facial recognition through photo and video material and for physiological recognition. Other physical identifiers include fingerprints, voice, handwritten signatures and DNA. New technologies use these physical identifiers to an increasing extent; a familiar personal use is unlocking a smartphone with facial recognition or a fingerprint.

But facial recognition in particular is also used as a security tool in supermarkets, subways, airports and arenas, by governments, private companies or private individuals.

Remarkably, it is not very complicated to set up a camera to identify people: facial recognition can be installed as software on a digital camera, which identifies a face as a set of points. When a face has been detected, the software saves the data and tries to match it against a connected database. In that way it is able to link the face to information it already contains about this person (Waarlo and Verhagen 2020).
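The matching step described above, linking a detected face to entries in a database, can be sketched in a few lines of Python. This is an illustrative toy, not how any real product is implemented: the database `KNOWN_FACES`, the `identify` function and the `threshold` value are all made up, and real systems derive much longer numerical "embeddings" from camera images rather than using hand-written vectors.

```python
import numpy as np

# Hypothetical database: names mapped to face "embeddings" (vectors of
# measurements derived from a face). Real systems compute these from images.
KNOWN_FACES = {
    "alice": np.array([0.1, 0.8, 0.3, 0.5]),
    "bob":   np.array([0.9, 0.2, 0.7, 0.1]),
}

def identify(face_embedding, threshold=0.6):
    """Return the name of the closest known face, or None when even the
    closest entry is farther away than the threshold (no match)."""
    best_name, best_distance = None, float("inf")
    for name, known in KNOWN_FACES.items():
        # Euclidean distance between the detected face and a stored face
        distance = np.linalg.norm(face_embedding - known)
        if distance < best_distance:
            best_name, best_distance = name, distance
    return best_name if best_distance <= threshold else None
```

A vector close to a stored entry, say `identify(np.array([0.12, 0.79, 0.31, 0.48]))`, returns `"alice"`, while a vector far from every stored entry returns `None`. The privacy question in the rest of this article is precisely about who controls that database and what information gets linked to a match.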

Hear the thoughts of Hoan Ton-That (founder and CEO of the app Clearview AI)

As mentioned before, a popular use is as a security measure: making sure a person is not in a place they are not allowed to be. An often-voiced critique, however, is that this type of use is a way of surveilling people. Should it be normalized that we are recognized while walking down the street? And are we sure this data is deleted when it turns out to be unusable for the safety measures it was collected for?

To shed some light on the technology and the issues surrounding it, we enlisted the help and expertise of Lotte Houwing by asking her some questions. Lotte works for Bits of Freedom, a Dutch NGO and digital civil rights movement that stands up for the digital rights of individuals within the European Union. There she works as a policy advisor and researcher, focusing mainly on state surveillance and the power relationship between the State and the citizen that comes with it. She examines the investigative powers of the police and the secret services and is committed to protecting citizens from misuse of this power.

 

“What, according to you, are the most dangerous aspects of FR?”

“If you look at how our society is structured, there are a number of principles that underlie it: we live in a constitutional state, meaning that as a citizen you have quite a lot of freedoms. In principle you are allowed to do many things, except for what is prohibited by law.


“So, the answer to the question ‘Can I do this?’ is often ‘Yes, unless…’. For the government it is the other way around: within a constitutional state the government may only do certain things when it has been granted the power to do so by criminal law. Criminal law grants these powers in special circumstances, for example when there is a crime and a reasonable suspicion.”

“However, when it comes to the introduction of mass surveillance means in the public space, this logic is reversed. In these cases, the government, for instance (but these could also be private companies), can keep an eye on citizens while there is no suspicion and, at the moment, not really any regulation either! It makes citizens more likely to be treated as suspects. Because people feel they are being watched, they will adjust their behaviour, and thus will be less free.”

The idea behind criminal law is that you may be investigated when there is reasonable suspicion that you have committed a criminal offence; the idea behind mass surveillance is that there is surveillance without there having to be an offence at all.

When we are spotted by a facial recognition technology, we are profiled and decisions are made about us without our knowledge. This could mean, for instance, that a company that monitors you as you walk into its store can start showing you personalized ads based on information about you it has already stored. If, for example, it has stored what you bought on former visits, it might know when your period is and offer you a discount to make you buy more chocolate. In such a case you become ‘a walking barcode’, as Lotte calls it, just waiting to be scanned so your physical appearance can be linked to stored information.

Another issue that Lotte points out is that facial recognition does not work equally well on every person. It has been shown that people of Asian and African descent are not recognized as reliably as people with white skin. On that basis the technology makes false matches and leads to discriminatory practices (Fernandez n.d.). But the inequality doesn’t end there: by deploying facial recognition only in certain areas to detect crime, it is also distributed disproportionately across society, and thereby intensifies existing inequalities to a high degree.


 

Regulation around the use of Facial Recognition

In Europe there are no rules for the use of facial recognition in particular, but much of it is covered by the GDPR, which states that biometric data is too sensitive to be used and that its processing shall be prohibited (‘Art. 9 GDPR – Processing of Special Categories of Personal Data’ n.d.), Lotte explains. “And, because it is a legal instrument, what follows is: ‘unless the exceptions in paragraph 2 apply’, and paragraph 2 then lists ten exceptions as grounds on which it could be used.” From this it becomes clear that the reasons for using such sensitive data must be made explicit, and that the data cannot be used if the arguments for it are missing.

(‘Sensitive Personal Data - Special Category under the GDPR’ 2020)


Member States are also allowed to introduce further conditions, including limitations, with regard to the processing of personal data. What can be seen in different cities around the world is that some simply ‘don’t accept’ these technologies in the public space. A great development in Lotte’s eyes: “There are currently a number of cities in Europe that say, ‘we are just not going to use facial recognition’, which is kind of like a ‘city ban’. Which I think is really cool!” On a national level, however, this is different: “The extent to which there is national legislation differs quite a bit, but as far as I know, no Member State has yet strictly banned biometric surveillance in the public space.” Lotte emphasizes that, on top of that, the rules are still being bent by governments and police forces. As an example, she points to a Dutch ‘experiment’ by the police, the Amsterdam Arena, and the municipality of Amsterdam: the sensor cameras they use monitor the area around the arena in order to turn away troublemakers and hooligans (Waarlo and Verhagen 2020).

“What we see is that a lot is being done under the guise of "a pilot" or "testing ground". Something that implies a ‘temporary experiment’. For example, the police are very good at using new technology where they actually do not have a law that explicitly allows them to do so. Without a legal basis, but then they just say: ‘This is an experiment, pilot, or testing ground’. Of course, they don't have a legal basis yet because they don't know if they are going to implement it for real, so it is ‘temporary’. If you put it like that it all sounds very reasonable, but when it comes to surveillance what is said to be temporary still turns into something lasting.”


 

Positive uses of FR

What we can learn from the foregoing is that there are some problematic aspects tied to the use of facial recognition and to compliance with the GDPR. Nonetheless, the technology is still being implemented and further developed. We got curious and asked Lotte whether there is also something to be said in favor of these technologies.

“As this technology is being developed further do you also see a positive perspective for its development?”

“Well, to be fair, and this is a pretty quick answer, I don’t think so. I am very much against its use as a mass surveillance tool in the public space. Outside of that, however, there are uses I am more neutral about. For example, I heard about the use of this technology to help elderly people who suffer from dementia recognize their family and friends.”

An example of FR to help Alzheimer's patients

“But in such a case the conditions of consent and purely local storage of data need to be fulfilled. Another example: if people prefer to unlock their phone with their face rather than with their PIN code, then I think, ‘If that makes you happy, whatever floats your boat.’ I do not think I should make an issue out of that. But I don’t see it as something positive either, because I don’t really see what it contributes, I still see risks in it, and it also encourages the normalization of this type of technology: it becomes more common for us that technology is getting closer to our skin.”

A recurring theme in the discussion of personal data is that of lasting effects: one can share their data but is unfortunately not always 100% sure where it ends up. Thanks to the GDPR, in Europe we can now request insight into our own data, and have it removed if we want to. But what hasn’t been quite figured out yet are the margins of the regulation that allow it to be bent to the benefit of whoever wants to gather this sensitive information. In that way, uses it was not meant for can sometimes still be justified.



Future developments

Luckily there are many great initiatives (like BoF, and others mentioned at the bottom of this post) that are trying to develop the regulation around invasive technologies and help protect citizens from their misuse. Bits of Freedom is currently working on a dossier addressing biometric surveillance technology in public space. On top of that, the Reclaim Your Face coalition launches a European Citizens’ Initiative (ECI) today to ban biometric mass surveillance, in which Lotte also takes part.

“We want to broaden the discussion that is going on about ‘facial recognition’ to a discussion about biometric surveillance. We think that the problems that exist with facial recognition also exist with, for example, voice recognition. It is actually not about which part of your body is branded as a walking barcode, but that your body (any part of it) is branded as a walking barcode to begin with! That’s why we think it’s important to broaden that discussion by making it less technology specific (…) we should not be talking about facial recognition, or about biometrics in general and its use, but about its use as a surveillance tool in the public space, because that has the greatest effect on individual rights and freedoms, the greatest impact in the context of discrimination and the greatest effect on our society. That is our focus, and we think it is very important that it is simply banned in the end, actually.”

“Because that's what it comes down to?”

“Yes, a regulation that does not end in a ban actually legitimizes its application.”

What do you think?  

Is facial recognition something we should want? Does it make us safe, and is it good for preventing crime, or is it a breach of privacy that can be branded as surveillance by the state? How do you feel about sharing your biometric data with institutions outside of your personal circle? Or would you wear something that would prevent a camera with sensors from detecting you? Share your thoughts below!


- Robin Jane Metzelaar, 17/2/21.


Sources
