It has recently been argued that personal assistants like Amazon Echo, Apple Siri, Microsoft Cortana, and Google Home could violate children’s data privacy, in that they record and store children’s voices and questions while processing their requests and retain such data in order to “improve the customer’s experience”.

An article[1] published by the British newspaper The Guardian on Thursday, May 26th, calls attention to the fact that these devices could contravene the USA Children’s Online Privacy Protection Act[2], since their marketing strategies seem to suggest that they are aimed at families with small children.

Personal Assistant Devices under the USA Children’s Online Privacy Protection Act

The USA Children’s Online Privacy Protection Act (COPPA), in force since 2000, aims to protect children’s privacy by requiring websites to obtain verifiable parental consent before collecting personal information from children under 13.  The COPPA covers operators of websites directed to children under 13 and also those directed to general audiences when they have “actual knowledge” that they are collecting personal information from users under 13 years of age.  It also applies to operators when they have “actual knowledge” that they are collecting personal information from users of another site or online service directed to kids under 13, which means that, under certain circumstances, the COPPA could also apply to third parties.

An explanatory note on the applicability of the COPPA Rule, published by the Federal Trade Commission[3], explains that “the Rule doesn’t require operators of sites or services directed to general audiences to investigate the ages of its users”; “however, asking for or otherwise collecting information that establishes that a visitor is under 13 triggers COPPA compliance”[4]. The explanatory note goes on to explain that a website operator can have “actual knowledge” under the Rule based on “age identifying questions” like “what grade are you in?” or “what kind of school do you attend?”  The question in this case would therefore be: can personal assistants also establish the age of a person based on the nature and complexity of their questions or their interactions with the device?

In this case, there is no need to answer such a question, as it can be established on the basis of their marketing strategies that these devices are aimed at an audience that includes children under 13.  As a matter of fact, by targeting the marketing of their personal assistant devices at families with small children (as their advertising spots seem to suggest), operators of these devices could be found liable for contravening the COPPA, given that the devices not only record but also store voice recordings of children while processing their instructions and queries.  It is important at this point to note that these devices, while plugged in and operational, would also be capable of recording and processing the voices of anyone in the room, not only the children and their parents, which raises the question of whether it should be necessary to warn everyone within their reach about the presence of a gadget “witnessing” their conversations.

As a matter of fact, most of these personal assistant devices work in the “cloud,” which means that they do not process queries by themselves but rely on an Internet connection to do so.  Privacy concerns therefore include the fact that these omnipresent devices are “always listening” and storing conversations, or fragments of conversations, in the “cloud,” and that they also store location information in order to provide the customer with more accurate searches.  Users can, of course, press the mute button to disable the “listening” mode of the device, and they can also delete voice recordings from their account.  Deleting voice recordings would, however, affect the quality of the service, as voice recordings are stored so that the device can “learn” from past commands or questions[5].  Another concern relates to the storage of the questions that users ask these devices, which, as long as they are stored in the “cloud,” could be leaked and exposed.

Personal Assistant Devices under the General Data Protection Regulation

Going back to children’s right to data protection, and referring now specifically to European data protection law, it is worth noting that Recital 38 of the new General Data Protection Regulation (GDPR) points out that, “Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child…”

Furthermore, and perhaps even more applicable to this case, where the personal data protection situation may appear somehow unclear for the data subject, Recital 58 of the GDPR states that, “The principle of transparency requires that any information addressed to the public or to the data subject be concise, easily accessible and easy to understand, and that clear and plain language and, additionally, where appropriate, visualization be used. Such information could be provided in electronic form, for example, when addressed to the public, through a website. This is of particular relevance in situations where the proliferation of actors and the technological complexity of practice make it difficult for the data subject to know and understand whether, by whom and for what purpose personal data relating to him or her are being collected, such as in the case of online advertising. Given that children merit specific protection, any information and communication, where processing is addressed to a child, should be in such a clear and plain language that the child can easily understand”. 

Likewise, while listing the categories of data and data processing activities that could particularly put at risk the rights and freedoms of natural persons, Recital 75 of the GDPR refers to the processing of “personal data of vulnerable natural persons, in particular of children”, which, along with other articles of the GDPR that specifically address children’s right to data protection (Art. 40, Art. 57, Art. 65, among others), should be enough to give an idea of how strict the rules for the processing of children’s personal data will be under the new European data protection paradigm.

Specifically regarding parental consent, Article 8 of the GDPR (which constitutes an important development in comparison with Directive 95/46/EC) deals specifically with the “Conditions applicable to child’s consent in relation to information society services” in the following terms: “Where the child is below the age of 16 years, such processing shall be lawful only if and to the extent that consent is given or authorized by the holder of parental responsibility over the child. Member States may provide by law for a lower age for those purposes provided that such lower age is not below 13 years. The controller shall make reasonable efforts to verify in such cases that consent is given or authorized by the holder of parental responsibility over the child, taking into consideration available technology”.

It is clear, in sum, that whether under USA law or under European law, the processing of children’s personal data requires parental consent, especially in cases, like the one at hand, where children’s voice recordings, questions and interactions would be stored in the “cloud” and further processed.



[3] Federal Trade Commission, Children’s Online Privacy Protection Rule: Not Just for Kids’ Sites.

[4] Federal Trade Commission, Children’s Online Privacy Protection Rule: Not Just for Kids’ Sites.