Online proctoring refers to the use of digital tools and technologies to remotely monitor students during online exams. Such tools typically involve video and audio recording (e.g. screen and web-traffic recording, room recording, periodic desk scans) and sometimes biometric recognition, all aimed at reducing the potential for academic dishonesty and maintaining the integrity of online examinations. The data collection and processing methods vary significantly depending on the specific software in use. A common characteristic, however, is the reliance on AI to monitor students, primarily to assist proctors in detecting instances of cheating: the software automatically calculates a probability of cheating based on the student's captured behaviour. While this allows educational institutions to offer remote examinations, it raises significant data protection concerns, particularly in countries with stringent privacy laws such as Germany and France.
Regulatory Framework in Germany, in Particular Bavaria
In Germany, there is no uniform practice regarding remote exams: the federal states impose different requirements and conditions, e.g. Baden-Württemberg (Art. 32a Landeshochschulgesetz). In this article, we analyse online proctoring in Bavaria, where it is allowed only under strict conditions.
In Bavaria, online proctoring is governed by several legal provisions, including the Bavarian Distance Learning Examination Ordinance (BayFEV) and Germany’s Basic Law (Grundgesetz, GG). According to the Ordinance, online exams may only be offered as an alternative to face-to-face examinations, ensuring that students are never forced to undergo online proctoring if they prefer an in-person exam (Section 6 para. 4 of the BayFEV). In other words, automated video proctoring is only permissible if remote exams are offered as an alternative solution where there is a lack of proctoring staff available to carry out the supervision, and the students have expressly given their consent. Students must be informed about the automated video supervision and the alternative option of a face-to-face examination before they provide their consent.
A cornerstone of data protection in online proctoring in Bavaria is the prohibition of room surveillance during exams. For instance, while students may be asked to briefly show their work area via camera to ensure no illicit aids are being used, continuous room monitoring is deemed a disproportionate invasion of privacy unless there is concrete suspicion of cheating (Section 6 para. 1 of the BayFEV).
Bavaria also places strict limits on the use of automated proctoring tools. AI-driven monitoring, such as automated facial recognition or behavioural analysis, is considered excessively intrusive and is generally prohibited. The focus instead is on human supervision to ensure proportionality and necessity, reflecting the view that AI-based systems are not justified when human supervision is available (Section 6 para. 2 of the BayFEV). The overriding principle is proportionality – ensuring that any data collection or processing is strictly necessary for the legitimate purpose of conducting the examination. Moreover, data collected during the examination, such as video and audio streams, must be deleted immediately after the exam is concluded, except in cases where cheating is suspected (Section 5 para. 2 BayFEV).
Regulatory Framework in France
In September 2023, the French Data Protection Authority (CNIL) issued recommendations (Deliberation No. 2023-058) for remote monitoring tools used in online exams, which emphasize the need for transparency, proportionality, and the protection of students’ privacy.
A fundamental aspect of French data protection law is that students must be offered an alternative to online proctoring – namely, a face-to-face exam – unless there are exceptional circumstances, such as during a health crisis or in cases where the entire institution is based on a remote learning model. Transparency is a key requirement, with institutions obliged to inform students about the monitoring systems being used, the types of data collected, and how to mitigate incidental data collection risks in accordance with Section 82 of the Data Protection Act. This reflects concerns that proctoring solutions might inadvertently collect data on students’ relatives or their surroundings, violating their privacy rights. Institutions organizing examinations, as well as their possible subcontractors, should guarantee candidates that their data will not be used for any purpose other than the taking and monitoring of a remote examination.
Processing personal data for online proctoring according to the CNIL guidelines is typically grounded either in the legal basis of public interest (Article 6 para. 1 lit. e of the GDPR) or in the contractual necessity of conducting the examination (Article 6 para. 1 lit. b of the GDPR). Consent is another potential basis, but only when a face-to-face alternative is provided. Importantly, consent must be capable of being withdrawn at any time, allowing students to opt out without negative consequences if they choose to switch to a physical exam format. Similarly, legitimate interest implies the possibility of objecting to processing, which appears difficult to manage in the context of organising an examination (Art. 2 of the Deliberation No. 2023-058).
Biometric data processing is also subject to strict regulation in France. While biometric verification, such as facial recognition, can be used to confirm a student’s identity, this must be carefully controlled. Any biometric processing must be based either on the student’s consent or on public interest (Article 9 GDPR). If students’ consent is requested, an alternative solution must be offered, e.g. arriving at the examination centre earlier for identity verification via documents. If public interest is relied upon, the processing must be strictly regulated by law, and human intervention must be possible if the student has difficulty with the authentication process. Devices involving biometric data processing should not be used for purposes other than identity verification, and the solutions should not result in the creation of databases of biometric templates.
Much like Germany, France limits the use of AI-driven tools to flag suspicious behaviour. The CNIL points out that while systems that detect anomalies in a student’s environment, such as an unexpected third party entering the room or an abnormal noise level, may be allowed, automatic behavioural analysis – such as tracking eye movements or analysing emotions – is generally prohibited. Any automated system that flags suspicious activity must involve human oversight to confirm potential violations before any action is taken.
Comparative Analysis
While both France and Germany (Bavaria) have robust data protection frameworks shaped by the GDPR, there are nuances in their approaches to online proctoring. Both countries stress the importance of providing students with alternatives to online proctoring, ensuring that participation is voluntary and informed. However, Bavaria’s guidelines place a stronger emphasis on prohibiting AI-driven surveillance systems, favouring human oversight, while France permits some limited use of automated tools, provided that human verification is always involved.
Moreover, France is slightly more lenient regarding the use of biometric data, as long as strict safeguards are in place, whereas Bavaria avoids biometric verification in online proctoring altogether. Both, however, share a commitment to minimizing data retention and ensuring that no recordings are kept beyond what is absolutely necessary for detecting fraud. In both jurisdictions, the transparency of institutions is paramount: universities must clearly communicate their methods and provide students with sufficient information to make informed decisions about their participation in remote exams.
While their legal frameworks differ slightly, both countries prioritize the protection of students’ personal data and privacy. The overarching principle in both cases is proportionality – ensuring that any intrusion into students’ private lives is necessary and limited to the scope of ensuring exam integrity. As online education and remote assessments continue to grow, these data protection frameworks will likely evolve further to address the challenges posed by emerging technologies.