My kids are growing, and so are my worries about their use of technology and the internet. In my October 2023 article, Technology and Children – U.S. Courts Place Injunctions on State Laws for Unconstitutionality, I discussed recent U.S. court injunctions against state laws aimed at protecting children online, citing constitutional concerns. There, I emphasized the need for lawmakers to create more narrowly tailored regulations that can protect children effectively while respecting constitutional rights.

On July 30, 2024, the federal government made a new attempt to pass a law protecting children in their use of the internet, when the Senate passed the Kids Online Safety Act (KOSA).

KOSA Overview

KOSA aims to protect children online by enforcing transparency, empowering parents, and requiring platforms to take proactive measures against invasive tracking and predatory marketing practices. However, for a regulation to be truly effective, it must strike a careful balance — one that safeguards minors without stifling access to crucial information or infringing on fundamental rights. This requires a nuanced approach that not only addresses the immediate risks but also considers the broader implications for free speech and access to knowledge in a digital world.

KOSA not only addresses concerns about predatory marketing and content but also implicitly touches on broader data privacy issues. The tracking mechanisms that KOSA seeks to regulate are fundamentally tied to how companies collect and use personal data, particularly that of minors. This raises critical questions about who has access to our children’s data, how it is being used, and the potential risks associated with widespread data collection. In this digital age, protecting children online must also mean safeguarding their privacy.

Concerns About Children’s Technology Use

While KOSA aims to address these very issues, it mirrors the daily decisions I face as a parent, trying to balance protection with allowing my children the freedom to explore the digital world. As privacy counsel, I regularly advise clients on the delicate balance between protecting individual rights and embracing technological advances; as a parent, I find myself grappling with this same balance, only now the stakes feel even higher as I navigate the digital landscape alongside my own children. While I hope my clients carefully consider the rights and freedoms of their data subjects, as a parent, the struggle is more profound, with no clear sides to weigh. I want my children to be independent, self-confident, self-sufficient, intelligent, happy human beings who are safe from the harms of the world. When it comes to technology and the internet, this means balancing their desires — often to play or watch something on a screen — against the known effects of screen use, the need to gain essential knowledge and life skills, and the potential dangers of these tools.

Additionally, as privacy counsel, I constantly assess the risks associated with how companies collect and use personal data. This awareness directly shapes my parenting decisions. Currently, my children are still too young to go online alone or to access social media platforms, but understanding the intricacies of data tracking makes me more vigilant about the apps and other technologies my children are using.

I believe that parents who are trying to balance all of these things are doing so with the knowledge they have, but the general population cannot keep up with the changing technology used to track, market to, and communicate with our children and youth. Therefore, I believe it is important for the government to regulate those who are doing the tracking and marketing, in a way that protects children when a parent may not be able to.

These concerns are not just held by parents throughout the world; they are shared by lawmakers who are now attempting to address them through legislation like KOSA. Section 3 of KOSA sets forth a duty of care for online platforms, stating:

(a) A covered platform shall act in the best interests of a user that the platform knows or reasonably should know is a minor by taking reasonable measures in its design and operation of products and services to prevent and mitigate the following:

(1) Consistent with evidence-informed medical information, the following mental health disorders: anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors.

(2) Patterns of use that indicate or encourage addiction-like behaviors.

(3) Physical violence, online bullying, and harassment of the minor.

(4) Sexual exploitation and abuse.

(5) Promotion and marketing of narcotic drugs (as defined in section 102 of the Controlled Substances Act (21 U.S.C. 802)), tobacco products, gambling, or alcohol.

(6) Predatory, unfair, or deceptive marketing practices, or other financial harms.

The concerns stated in the above section are echoed by some of the very people who helped create the platforms now under scrutiny. Technology executives have often expressed a need to protect children from the dangers that arise with the use of technology. Nick Bilton, a reporter at The New York Times, once wrote, "I've met a number of technology chief executives and venture capitalists who say similar things: they strictly limit their children's screen time, often banning all gadgets on school nights, and allocating ascetic time limits on weekends." In the article he also states that these tech executives make such rules because they understand what the use of technology can do to their children, referring to dangers including exposure to harmful content like pornography, bullying, and, perhaps worst of all, becoming addicted to their devices.

It’s particularly telling that many of the very executives who design and promote these technologies impose strict limits on their own children’s screen time. Many others who have played significant roles in building the apps and social media platforms our society uses today have even expressed remorse for creating such technology. Sean Parker, former president of Facebook, expressed regret for the role he played in creating a product that he believes is “exploiting a vulnerability in human psychology.” Roger McNamee, an early Facebook investor, described in his book “Zucked: Waking Up to the Facebook Catastrophe” how platforms like Facebook are engineered to be addictive and how this negatively impacts mental health, particularly among children and teenagers. Tristan Harris, former Design Ethicist at Google, has gone so far as to co-found the Center for Humane Technology to promote the idea of “Time Well Spent,” encouraging tech companies to focus on creating products that contribute positively to users’ lives rather than simply maximizing screen time.

Beyond the immediate risks of addiction and harmful content, there’s an equally critical issue at play: data privacy. The same tracking mechanisms that lure children into spending more time online also harvest vast amounts of their personal information. While KOSA primarily focuses on preventing exposure to harmful content, it also indirectly addresses data privacy concerns by enhancing transparency and requiring platforms to take measures that could limit the misuse of minors’ data for tracking and targeting purposes, thereby offering protection against both visible and invisible online threats. Protecting children online is not just about controlling what they see but also about controlling who sees them and how their personal information is used.

Arguments Against KOSA

Despite the acknowledgment of these risks by industry leaders, the solutions proposed by KOSA have not been universally accepted. Critics argue that the bill’s broad language could lead to unintended consequences. The ACLU, for example, warns that the duty of care requirement might push platforms to censor content that, while controversial, is vital for young people’s well-being, such as sexual health resources or information on gender identity. This tension between safeguarding minors and preserving free access to information highlights the challenges of regulating online spaces without infringing on First Amendment rights.

Balancing Protection and Access

It seems as though the Senate anticipated such an outcry and created Section 3 (b) stating:

(b) Limitation.— Nothing in subsection (a) shall be construed to require a covered platform to prevent or preclude —

(1) any minor from deliberately and independently searching for, or specifically requesting, content; or

(2) the covered platform or individuals on the platform from providing resources for the prevention or mitigation of suicidal behaviors, substance use, and other harms, including evidence-informed information and clinical resources.

This tension between safeguarding children and ensuring their access to information is one I personally navigate daily as a parent. The desire to protect my children from online harms must be balanced against the many advantages of having technology at our fingertips. One of these advantages, as the ACLU points out, is the ability to obtain information about topics that may feel taboo. I hope I am creating a home environment where my children feel they can ask me anything. For questions I cannot answer, or where my response would be merely my opinion, we can go and find information together. But I also understand that not all families live in such an environment. For those young people, information is restricted at home and must be found elsewhere.

Conclusion

Ultimately, while KOSA aims to protect children, the debate highlights a deeper question: can we ensure safety online without compromising essential freedoms? As the legislation progresses, it is vital to consider not just the immediate impacts, but also the long-term implications for how we balance protection and access to information in an increasingly digital world.

The current situation in the U.S. is one of restricting access to information for children and young adults, from requiring parental consent for sex education classes to banning certain books from school libraries. The balance between protection and censorship continues to be a contentious issue, with significant implications for education, public health, and civil liberties.

As Tristan Harris stated regarding his 60 Minutes interview, “Social Media and Political Polarization in America”: “Instead of talking about censorship or free speech or content moderation, we were talking about the real root of the problem, which is the engagement based business model.” Roger McNamee, in his book “Zucked,” argues that the “growth at any cost” mentality in Silicon Valley has led to the development of products that prioritize engagement over users’ well-being. KOSA seems to focus on these issues, and it explicitly states in Section 3(b) that the Act is not to be a limitation on information access.

In this article, I’ve explored the complexities of the Kids Online Safety Act (KOSA), highlighting its intention to protect children from online dangers like invasive tracking and predatory marketing. However, the act also raises concerns about potential over-censorship and the broader implications for data privacy, free speech, and access to essential information. As a society, it is crucial to balance these protections with the freedoms that allow children to thrive in the digital age. Businesses should change the way they use technology to track, market to, and communicate with our children and youth. As we await KOSA’s fate, the question remains: Can we truly protect our children without stifling their access to the information they need to thrive in a digital age?