The Spanish Data Protection Authority (AEPD) recently imposed a €500,000 fine on Fútbol Club Barcelona for failing to properly conduct a Data Protection Impact Assessment (DPIA) when implementing biometric systems used during the club’s membership census process.

Although the decision is complex, it ultimately turns on Article 35 GDPR: the AEPD concluded that the club failed to carry out an adequate DPIA before deploying high-risk biometric processing.

This decision offers important lessons for organisations implementing identity verification systems or biometric technologies.

The Biometric Systems Used by FC Barcelona

To update its members' information, the club implemented two biometric systems: a facial recognition system and a voice recognition system.

Members could complete the census process remotely through an online platform. To verify their identity, users either provided a facial scan using their device camera or recorded a voice sample, and the resulting biometric data was processed through a recognition system, along the lines of the sketch below.
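The decision does not describe the club's technical implementation, but a minimal sketch of how such a remote flow might be wired helps place consent and biometric capture in context. Everything in the sketch (the endpoint path, the types, and the helper functions) is a hypothetical illustration, not FC Barcelona's actual system:

```typescript
// Hypothetical sketch of a remote census verification flow.
// The endpoint, types, and helpers are illustrative assumptions,
// not FC Barcelona's actual system.

type BiometricMethod = "face" | "voice";

interface VerificationRequest {
  memberId: string;
  method: BiometricMethod;
  sample: Blob;          // camera frame or audio recording
  consentGiven: boolean; // explicit consent captured in the UI
}

async function submitVerification(req: VerificationRequest): Promise<boolean> {
  // Consent addresses the lawful basis for special-category data
  // (Art. 9(2)(a) GDPR); it does NOT replace the Article 35 duty
  // to conduct a DPIA before deploying the system.
  if (!req.consentGiven) {
    throw new Error("Explicit consent is required before capturing biometrics");
  }

  const form = new FormData();
  form.append("memberId", req.memberId);
  form.append("method", req.method);
  form.append("sample", req.sample);

  // Matching happens server-side; ideally the raw sample is discarded
  // as soon as a biometric template has been derived from it.
  const response = await fetch("/api/census/verify", { method: "POST", body: form });
  const { verified } = await response.json();
  return verified;
}
```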

More than 100,000 members ultimately completed the process through the online system, including tens of thousands of users whose voices were recorded for biometric verification.

While the club asked users to provide consent for the processing of their biometric data, the AEPD made clear that obtaining consent does not eliminate the obligation to conduct a DPIA where the processing is likely to result in a high risk.

Under Article 35 GDPR, organisations must carry out a DPIA where processing is likely to result in a high risk to the rights and freedoms of natural persons. In this case, several high-risk criteria can clearly be identified (see the screening sketch after this list), including:

  • the use of biometric data for identification
  • large-scale processing involving over 100,000 individuals
  • the use of new technologies
  • processing that included minors
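As a rough illustration of how these triggers interact: under the Article 29 Working Party guidelines later endorsed by the EDPB (WP248), processing that meets two or more of the listed criteria will in most cases require a DPIA. The helper below is a hypothetical screening sketch in that spirit; the interface and its field names are the author's own, and no script can substitute for a proper legal assessment.

```typescript
// Hypothetical DPIA screening helper, loosely modelled on the WP29/EDPB
// criteria (WP248): meeting two or more indicators generally means a
// DPIA is required. Field names are illustrative assumptions.

interface ProcessingProfile {
  usesBiometricIdentification: boolean; // special-category data (Art. 9)
  isLargeScale: boolean;                // e.g. 100,000+ data subjects
  usesNewTechnology: boolean;
  involvesMinors: boolean;              // vulnerable data subjects
}

function dpiaLikelyRequired(profile: ProcessingProfile): boolean {
  const indicatorsMet = [
    profile.usesBiometricIdentification,
    profile.isLargeScale,
    profile.usesNewTechnology,
    profile.involvesMinors,
  ].filter(Boolean).length;
  return indicatorsMet >= 2;
}

// The census described in the decision ticks all four boxes:
console.log(dpiaLikelyRequired({
  usesBiometricIdentification: true,
  isLargeScale: true,
  usesNewTechnology: true,
  involvesMinors: true,
})); // -> true
```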

The Authority’s Key Finding: The DPIA Lacked Essential Elements

Although the club did produce documentation intended to assess the risks of the system, the AEPD concluded that those documents could not be considered a valid DPIA.

The authority expressly stated that the documentation provided “lacked essential elements required to qualify” as a DPIA under Article 35 GDPR.

In particular, the decision shows the importance of ensuring that a DPIA:

  1. adequately assesses the risks associated with biometric processing; and
  2. includes a meaningful necessity and proportionality assessment.

A DPIA should not simply describe the technology used. It must also examine whether the processing is genuinely necessary to achieve the intended objective and whether less intrusive alternatives could reasonably have been used.

Practical Lessons for Organisations

This decision highlights several practical lessons for organisations implementing biometric or other high-risk technologies.

First, organisations should assume that biometric systems used for identification will very often require a DPIA.

Second, the DPIA must be substantive and detailed. A document that merely describes the system or lists security measures will not satisfy the requirements of Article 35 GDPR.

Third, organisations should ensure that the DPIA includes at least the following elements (see the sketch after this list):

  • a clear description of the processing
  • a genuine assessment of necessity and proportionality
  • a detailed evaluation of risks to individuals
  • mitigation measures and an assessment of residual risk
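These four elements track the minimum content prescribed by Article 35(7) GDPR. One way to keep a DPIA from degenerating into a mere system description is to capture it in a structured record in which an empty necessity assessment or risk list is immediately visible. The interface below is a purely illustrative sketch; all field names are the author's assumptions, not a regulator-endorsed template:

```typescript
// Illustrative structure for the minimum DPIA content under
// Article 35(7) GDPR. The shape and field names are the author's
// own sketch, not a regulator-endorsed template.

interface Risk {
  description: string;
  likelihood: "low" | "medium" | "high";
  severity: "low" | "medium" | "high";
}

interface Dpia {
  // (a) systematic description of the processing and its purposes
  processingDescription: string;
  purposes: string[];
  // (b) necessity and proportionality: why this processing is needed,
  // and why less intrusive alternatives were rejected
  necessityAssessment: string;
  alternativesConsidered: string[];
  // (c) risks to the rights and freedoms of data subjects
  risks: Risk[];
  // (d) measures envisaged to address the risks, and the residual risk
  mitigations: string[];
  residualRiskAcceptable: boolean;
}
```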

Finally, organisations should remember that obtaining consent does not remove the obligation to carry out a DPIA when high-risk processing is involved.

For organisations planning to deploy biometric authentication, identity verification tools, or other high-risk technologies, early involvement of the Data Protection Officer (DPO) is critical.

A properly conducted DPIA can help organisations identify risks early, design more proportionate systems, and reduce the risk of regulatory enforcement later.