Spanish football club Atlético Osasuna introduced a facial recognition system for stadium access, sparking a GDPR complaint. The case highlights the challenges of biometric data processing and calls its legality under the GDPR into question. The issue goes beyond simple convenience, raising concerns about proportionality, necessity, and fundamental privacy rights. Similar concerns arise when businesses upgrade traditional CCTV systems with facial recognition or other biometric technology, shifting them from standard surveillance tools to high-risk biometric processing. As the Osasuna case shows, such implementations must meet strict legal requirements to comply with the GDPR. This blog post examines the case as an instructive example for organizations considering biometric surveillance, highlighting the key GDPR principles that must be addressed to avoid legal and financial risk.

The Case: What Happened?

In April 2022, Osasuna introduced a facial recognition system for stadium access as an alternative entry method, intended to enhance security and speed up access for club members. Fans who wanted to use the system had to pre-register by submitting a selfie and a scanned ID and by giving explicit consent. Once registered, they could enter the stadium using facial recognition instead of a traditional ticket. The system was optional, however, and fans could still choose to enter using regular ticket-based methods.
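The opt-in flow described above can be sketched in a few lines of Python. This is a hypothetical illustration of the registration logic only, not the club's actual system; the names `Enrollment` and `entry_method` are invented for the example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Enrollment:
    """Hypothetical record of an opt-in facial-recognition registration."""
    member_id: str
    selfie_submitted: bool = False
    id_scan_submitted: bool = False
    explicit_consent: bool = False  # explicit consent under Art. 9(2)(a) GDPR

    def is_complete(self) -> bool:
        # All three elements were required before biometric entry was enabled.
        return (self.selfie_submitted
                and self.id_scan_submitted
                and self.explicit_consent)

def entry_method(enrollment: Optional[Enrollment]) -> str:
    """Biometric entry only for fully enrolled fans; everyone else keeps the ticket path."""
    if enrollment is not None and enrollment.is_complete():
        return "facial_recognition"
    return "ticket"
```

Note that, as the AEPD decision discussed below makes clear, satisfying such a consent gate would not by itself make the processing lawful: necessity and proportionality still have to be demonstrated.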

By November 2022, a formal complaint had been filed with the Spanish Data Protection Authority (AEPD), arguing that the system violated the GDPR. Following this complaint, the AEPD published its decision in December 2024, imposing a fine of 200,000 euros on Club Atlético Osasuna for violating GDPR rules. The club has announced its intention to appeal the decision before the National Court (Audiencia Nacional).

The GDPR Concerns

Consent: Not Enough to Justify Biometric Processing

Contrary to the club’s stance, the AEPD ruled that consent alone was insufficient to justify biometric processing. While users could opt out and use alternative entry methods, the authority concluded that the necessity and proportionality principles were not met. Since biometric data involves a high level of risk, organizations must not only obtain data subjects’ explicit consent (Article 9(2)(a) GDPR), but also demonstrate that their approach is the least intrusive method of achieving their goal.

Processing of Sensitive Data: A Question of Necessity

The AEPD questioned whether scanning faces and verifying identities was genuinely necessary for stadium access, especially when QR codes and digital tickets could achieve the same result without intrusive data collection. Since the necessity test was not met, the justification for processing biometric data collapsed.

Data Minimization: A Core GDPR Principle Violated

The principle of data minimization requires that organizations collect only the data essential for their purpose. The AEPD found that facial recognition was excessive and disproportionate. Digital tickets and QR codes already provided secure, efficient access without the need for biometric verification. This made facial recognition an unnecessary invasion of privacy.

CCTV and GDPR: What Companies Need to Know

Many businesses rely on CCTV for security and monitoring, using it to deter crime, ensure workplace safety, and protect assets. Under the GDPR, traditional CCTV surveillance is generally considered a lower-risk form of data processing, provided that companies follow appropriate safeguards. This means that as long as businesses conduct proper impact assessments and ensure transparency by informing individuals about the presence and purpose of CCTV, its use can often be justified under legitimate interest or public security.

However, the Osasuna case highlights the regulatory challenges that arise when organizations introduce biometric identification into security systems. Just as Osasuna’s facial recognition system for stadium access was deemed excessive under GDPR—despite being introduced for security and convenience—businesses upgrading their CCTV with facial recognition or biometric analysis tools must prove that such measures are necessary and proportionate. The introduction of biometric data shifts CCTV surveillance into a high-risk category under GDPR, requiring stricter compliance measures, explicit justification, and an assessment of less intrusive alternatives before deployment.

Why Does This Matter?

This case serves as a warning that merely improving efficiency or security does not automatically justify biometric surveillance under GDPR. The use of biometric-enhanced CCTV is fundamentally different from traditional security cameras. While standard CCTV merely records footage for security and monitoring purposes, biometric-enabled systems go a step further—they actively identify and verify individuals in real-time.

One of the most critical factors is that biometric identification involves processing special category data under GDPR. Unlike regular CCTV footage, biometric data is considered highly sensitive because it directly relates to an individual’s unique physiological or behavioural characteristics. As a result, its use is subject to far more stringent legal requirements. Organizations must obtain explicit consent from individuals before processing their biometric data unless they can demonstrate that a clear exemption applies.

Additionally, businesses implementing biometric surveillance must conduct necessity and proportionality tests to determine whether such measures are justified. Unlike standard CCTV, which is often deemed proportionate for general security purposes, biometric surveillance requires a higher threshold of justification. Organizations must prove that there is no less intrusive way to achieve the intended security objective, and they must carefully balance their interests against individuals’ rights to privacy.

The Key Takeaway

While traditional CCTV can be widely accepted under GDPR when accompanied by appropriate safeguards such as signage, data retention policies, and impact assessments, the integration of biometric technology within CCTV systems significantly escalates compliance obligations.

Lessons for Businesses and Organizations

The Osasuna case serves as a cautionary tale for companies considering biometric solutions:

  • Consent is not a silver bullet. Even when users opt in, necessity and proportionality remain crucial considerations.
  • Data minimization is essential. If a simpler, less intrusive method exists, organizations should adopt it instead.
  • Facial recognition demands strong justification. While security may be a valid rationale, mere convenience is unlikely to satisfy GDPR standards.
  • CCTV operators must reassess AI-powered surveillance. If facial recognition is used, compliance measures must be stringent.
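The lessons above can be condensed into a hedged pre-deployment checklist. The sketch below is purely illustrative (all function and parameter names are invented) and is neither legal advice nor a substitute for a full data protection impact assessment:

```python
def biometric_deployment_issues(*,
                                explicit_consent: bool,
                                less_intrusive_alternative: bool,
                                dpia_completed: bool,
                                purpose: str) -> list:
    """Collect GDPR red flags before deploying biometric surveillance.

    An empty list means this (simplified) check found no red flags;
    it does not mean the deployment is lawful.
    """
    issues = []
    if not explicit_consent:
        issues.append("no explicit consent (Art. 9(2)(a) GDPR)")
    if less_intrusive_alternative:
        # Data minimization: if QR codes or digital tickets achieve the same
        # goal, biometric processing is disproportionate (the Osasuna scenario).
        issues.append("less intrusive alternative exists (Art. 5(1)(c) GDPR)")
    if not dpia_completed:
        issues.append("no data protection impact assessment (Art. 35 GDPR)")
    if purpose == "convenience":
        issues.append("mere convenience is unlikely to justify biometric processing")
    return issues
```

In the Osasuna scenario, `less_intrusive_alternative=True` (QR codes and digital tickets were available) would flag the deployment even though explicit consent had been obtained.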

This AEPD ruling signals that biometric surveillance will face increasing regulatory scrutiny. Businesses must rethink their reliance on facial recognition and consider whether alternative solutions can balance security with privacy.