Clearview AI (hereafter Clearview) is a company dedicated to facial recognition. Clearview created a database of over 20 billion images indexed from the Internet and social media and developed a facial recognition algorithm to analyze and match faces. Clearview compares sample images provided by its clients (mostly law enforcement bodies, governments, and banks) with its database and provides its clients with an indexed list of images with similar characteristics. The results include the URL where each image appears online. Clearview claims 99 percent accuracy across all demographic groups.

How did Clearview obtain images?

According to Clearview, its database is built from images posted on digital platforms and websites. If you are on LinkedIn or have been tagged on Instagram or Facebook, chances are you are in Clearview's database. The images are obtained through a technique called data scraping.
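Clearview's crawler is proprietary, but the basic scraping step can be sketched: fetch a page's HTML and extract every image reference together with the page it was found on, which is exactly the (image, source URL) pairing Clearview later returns to clients. The snippet below is a minimal, self-contained illustration using only Python's standard library; the page URL and HTML are invented for the example, and a real crawler would fetch pages over HTTP and follow links.

```python
from html.parser import HTMLParser

class ImageScraper(HTMLParser):
    """Collects (image src, source page URL) pairs from raw HTML --
    the elementary building block of an image-scraping pipeline."""
    def __init__(self, page_url):
        super().__init__()
        self.page_url = page_url
        self.found = []  # list of (image_src, source_page_url)

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.found.append((src, self.page_url))

# In a real crawler the HTML would come from an HTTP request; an inline
# snippet keeps this sketch self-contained. Names are illustrative only.
html = ('<html><body>'
        '<img src="/photos/alice.jpg">'
        '<img src="/photos/bob.png">'
        '</body></html>')
scraper = ImageScraper("https://example.com/profile")
scraper.feed(html)
print(scraper.found)
```

Scaled up across billions of pages, this is how each face in the database ends up linked back to the URL where it was published.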

How does it work?

Clearview developed an algorithm that obtains a set of facial vectors from the sample image provided by its customers and compares it to the facial vectors extracted from the images in Clearview’s database. The algorithm is designed to account for age progression, variations in posture and position, changes in facial hair, and special demographic group parameters.
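Clearview's matching algorithm is not public, but the general technique it describes is well established: each face is reduced to an embedding vector, and candidate matches are ranked by a similarity metric such as cosine similarity. The sketch below illustrates that ranking step with toy four-dimensional vectors and invented URLs; real systems use embeddings with hundreds of dimensions produced by a neural network, which is where the invariance to age, pose, and facial hair is learned.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two facial-vector embeddings (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy database of pre-computed facial vectors keyed by source URL
# (all values and URLs are illustrative, not Clearview's).
database = {
    "https://example.com/img/1.jpg": [0.9, 0.1, 0.3, 0.2],
    "https://example.com/img/2.jpg": [0.1, 0.8, 0.1, 0.9],
}

# Facial vector extracted from the client's sample image.
probe = [0.85, 0.15, 0.25, 0.2]

# Rank every database entry by similarity to the probe, best match first.
matches = sorted(database.items(),
                 key=lambda kv: cosine_similarity(probe, kv[1]),
                 reverse=True)
for url, vec in matches:
    print(url, round(cosine_similarity(probe, vec), 3))
```

The output is an ordered list of image URLs with similarity scores, which mirrors the indexed result list Clearview returns to its clients.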

Why were they sanctioned by the ICO?

Below, we explain why the ICO found Clearview to be in breach of the UK GDPR, resulting in a €9 million fine.

The processing carried out by Clearview is unlawful. The analysis and comparison of facial vectors for the purpose of determining a person's identity is processing of biometric data under the GDPR/UK GDPR (defined in Art. 4 Para. 14). Biometric data processed to uniquely identify a person is a special category of personal data that can only be processed under certain exceptions (Art. 9 Para. 2 UK GDPR), such as the explicit consent of the data subject. The ICO found that Clearview could not rely on any of these exceptions, in violation of the rules governing this matter.

Individuals are unaware of the processing of their personal data. Clearview argues that it only uses images that have been published on the internet and can therefore be used for any purpose. The ICO rejected this argument, recalling the transparency obligations that apply when personal data is not obtained directly from the data subject (Art. 14 UK GDPR) and the special requirements for processing biometric data indicated above. Moreover, Clearview does not explain what happens to images of a person uploaded by third parties, or to images that the data subject initially set as public and later made private.

Images do not appear to be deleted. Clearview has no data retention policy and did not indicate when it deletes images from its database. In the ICO's view, Clearview therefore cannot guarantee that personal data is not retained for longer than necessary (Art. 5 Para. 1 Lit. e UK GDPR). The concern is compounded by the database's growth: in December 2021 the French data protection authority (CNIL) indicated that Clearview had collected 10 billion images, Clearview now claims more than 20 billion, and according to the Washington Post the company has told investors it aims to collect 100 billion facial photos within a year.

Hindrance of data subjects' rights. These rights are restricted in the first place by the lack of information outlined above. In addition, exercising them is conditioned on data subjects providing a photograph of themselves. In the ICO's view this discourages their exercise, since the very people who want Clearview to stop processing their personal data must first supply it with another image of their face.

Lack of an Impact Assessment. Clearview's processing requires a Data Protection Impact Assessment for several reasons, most notably because it involves large-scale processing of special categories of data. In addition, it appears that Clearview even processes the personal data of children.

Clearview had already been fined for similar violations by the Italian and Greek data protection authorities; in total, the company has been fined over €49 million by European data protection authorities.

If you need assistance with the application of new technologies or the large-scale processing of special categories of personal data (e.g., genetic data, biometric data, health data, etc.), we are at your disposal and can be reached at